
Thursday, November 15, 2012

Mind Maps of Compressive Sensing / A compressive sensing framework for seismic source parameter estimation

While looking for the preprint pointed out to me by Laurent Duval on seismic imaging and compressive sensing (and that is featured below, thanks Laurent), I found the following interesting mind maps produced by the good folks at SLIM, who are also leading the way in compressive sensing applied to seismic work. I generally use FreeMind, but it looks like I could upgrade.




In particular, here is a mind map for compressive sensing. I don't agree with one or two arrows, but it looks fantastic.





and here is the paper: A compressive sensing framework for seismic source parameter estimation



Simultaneous estimation of origin time, location and moment tensor of seismic events is critical for automatic, continuous, real-time monitoring systems. Recent studies have shown that such systems can be implemented via waveform fitting methods based on pre-computed catalogues of Green’s functions. However, limitations exist in the number and length of the recorded traces, and the size of the monitored volume that these methods can handle without compromising real-time response. This study presents numerical tests using a novel waveform fitting method based on compressive sensing, a field of applied mathematics that provides conditions for sampling and recovery of signals that admit a sparse representation under a known base or dictionary. Compressive sensing techniques enable us to determine source parameters in a compressed space, where the dimensions of the variables involved in the inversion are significantly reduced. Results using a hypothetical monitoring network with a dense number of recording stations show that a compressed catalogue of Green’s functions with 0.004 per cent of its original size recovers the exact source parameters in more than 50 per cent of the tests. The gains in processing time in this case drop from an estimated 90 days to browse a solution in the uncompressed catalogue to 41.57 s to obtain an estimation using the compressed catalogue. For simultaneous events, the compressive sensing approach does not appear to influence the estimation results beyond the limitations presented by the uncompressed case. The main concern in the use of compressive sensing is detectability issues observed when the amount of compression is beyond a minimum value that is identifiable through numerical experiments. Tests using real data from the 2002 June 18 Caborn Indiana earthquake show that the presence of noise and inaccurate Green’s functions require a smaller amount of compression to reproduce the solution obtained with the uncompressed catalogue. In this case, numerical simulation enables the assessment of the amount of compression that provides a reasonable rate of detectability. Overall, the numerical experiments demonstrate the effectiveness of our compressed domain inversion method in the real-time monitoring of seismic sources with dense networks of receivers. As an added benefit of the compression process, the size of the monitored volume can also be increased under specific restrictions while maintaining the real-time response.
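To get a feel for the idea of a "compressed catalogue" described in the abstract, here is a minimal toy sketch (mine, not the authors' actual algorithm): the Green's function catalogue and the observed traces are projected with the same random measurement matrix, and the moment tensor is then fit by least squares for each candidate source entirely in the compressed space. All names, dimensions and the Gaussian measurement matrix below are assumptions made purely for illustration.

```python
import numpy as np

# Toy sketch of compressed-domain catalogue matching (illustrative only,
# not the paper's implementation). Sizes are made up.

rng = np.random.default_rng(0)

n_samples    = 4000   # length of the concatenated recorded traces
n_mt         = 6      # independent moment-tensor components
n_candidates = 500    # candidate (origin time, location) entries in the catalogue
m            = 64     # compressed dimension (m << n_samples)

# Catalogue: for each candidate source, an (n_samples x n_mt) matrix of
# Green's functions mapping moment-tensor components to recorded waveforms.
catalogue = [rng.standard_normal((n_samples, n_mt)) for _ in range(n_candidates)]

# Synthetic "observed" data generated from one hidden candidate.
true_idx = 123
true_mt  = rng.standard_normal(n_mt)
d = catalogue[true_idx] @ true_mt + 0.01 * rng.standard_normal(n_samples)

# Shared random measurement matrix: both the data and the catalogue are
# compressed once, and all subsequent work happens in m dimensions.
Phi = rng.standard_normal((m, n_samples)) / np.sqrt(m)
d_c = Phi @ d
catalogue_c = [Phi @ G for G in catalogue]   # the "compressed catalogue"

# Grid search over candidates: least-squares moment tensor in the compressed
# space, keep the candidate with the smallest residual.
best = min(
    range(n_candidates),
    key=lambda k: np.linalg.norm(
        d_c - catalogue_c[k] @ np.linalg.lstsq(catalogue_c[k], d_c, rcond=None)[0]
    ),
)
print("estimated candidate:", best, "  true candidate:", true_idx)
```

The point of the sketch is simply that the search loop never touches the full-length traces: the cost per candidate scales with the compressed dimension m, which is why the abstract's reported drop from days to seconds is plausible, with detectability degrading if m is pushed too low.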








Join our Reddit Experiment: join the CompressiveSensing subreddit and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
