
Wednesday, January 02, 2013

Deterministic matrices matching the compressed sensing phase transitions of Gaussian random matrices

Here is a good way to start the year. Back in 2008, Anna mentioned to me that even though we were using random matrices in compressive sensing, those were ultimately generated by a pseudo-random algorithm in the first place, so there was a real prospect of getting deterministic matrices to work in compressive sensing. At that time, deterministic matrices had a much higher bound on the minimum number of required measurements than random matrices (I believe m scaled as k^2) and so were not really sought after. The problem with the bound on random matrices is that it steered a lot of interest into RIP-based arguments that, for most practitioners, were really too strict. For practitioners, the issue was whether the measurement matrix got near the Donoho-Tanner phase transition, not whether it fulfilled the much stricter RIP condition. Some practitioners also noticed that their own designed measurement matrices followed the Donoho-Tanner phase transition even though they were not random. Well, thanks to today's paper, it looks like we now have a good reason to believe that a large set of deterministic matrices follows that transition. The next question becomes how to implement the seeded matrices [1] of Krzakala et al. into this deterministic framework.


In compressed sensing, one takes n < N samples of an N-dimensional vector x_0 using an n x N matrix A, obtaining undersampled measurements y = A x_0. For random matrices with independent standard Gaussian entries, it is known that, when x_0 is k-sparse, there is a precisely determined phase transition: for a certain region in the (k/n, n/N)-phase diagram, convex optimization typically finds the sparsest solution, whereas outside that region, it typically fails. It has been shown empirically that the same property, with the same phase transition location, holds for a wide range of non-Gaussian random matrix ensembles. We report extensive experiments showing that the Gaussian phase transition also describes numerous deterministic matrices, including Spikes and Sines, Spikes and Noiselets, Paley Frames, Delsarte-Goethals Frames, Chirp Sensing Matrices, and Grassmannian Frames. Namely, for each of these deterministic matrices in turn, for a typical k-sparse object, we observe that convex optimization is successful over a region of the phase diagram that coincides with the region known for Gaussian random matrices. Our experiments considered coefficients constrained to X^N for four different sets X, and the results establish our finding for each of the four associated phase transitions.
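
To make the phase-diagram experiment concrete, here is a minimal Python sketch (not the authors' code) of a single (n/N, k/n) point: it builds a deterministic Spikes-and-Sines matrix as the concatenation of the identity and an orthonormal DCT, measures a random k-sparse vector, and checks whether basis pursuit (l1 minimization, here via cvxpy) recovers it. The matrix construction, problem sizes, and solver choice are my own assumptions for illustration.

```python
# Minimal sketch of one point of the phase-diagram experiment (assumptions:
# Spikes-and-Sines = [I | DCT], sizes n=128, N=256, k=10, cvxpy as the solver).
import numpy as np
from scipy.fft import dct
import cvxpy as cp

n, k = 128, 10            # number of measurements and sparsity
N = 2 * n                 # ambient dimension: spikes (identity) + sines (DCT)

# Deterministic Spikes-and-Sines dictionary: identity concatenated with an
# orthonormal DCT basis; all columns are unit-norm.
A = np.hstack([np.eye(n), dct(np.eye(n), norm='ortho')])

# A "typical" k-sparse object: random support, Gaussian signed coefficients.
rng = np.random.default_rng(0)
x0 = np.zeros(N)
support = rng.choice(N, size=k, replace=False)
x0[support] = rng.standard_normal(k)

y = A @ x0                # undersampled measurements

# Basis pursuit: min ||x||_1 subject to y = A x
x = cp.Variable(N)
prob = cp.Problem(cp.Minimize(cp.norm1(x)), [A @ x == y])
prob.solve()

success = np.linalg.norm(x.value - x0) / np.linalg.norm(x0) < 1e-4
print(f"delta = n/N = {n/N:.2f}, rho = k/n = {k/n:.2f}, recovered: {success}")
```

Repeating this over a grid of (delta, rho) values and many random sparse objects is what traces out the empirical phase transition that the paper compares against the Gaussian one.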





