So it continues: Florent Krzakala, Jean Barbier and Christophe Schülke have a new version of their seeded 'magic matrices', expanded to use Hadamard and Fourier blocks instead of random matrices.

We study compressed sensing of real and complex sparse signals using an optimal and computationally efficient procedure based on the combined use of approximate message-passing and spatially-coupled measurement matrices with fast Fourier and Hadamard operators. We compare the performance of our algorithm, which uses deterministic matrices, with that of approximate message-passing using purely random Gaussian matrices, for which the asymptotic behavior is exactly given by the density evolution. We show empirically that after proper randomization, the underlying structure of the operators does not significantly affect the performance of the reconstruction, thus allowing a fast and memory-efficient reconstruction up to the information-theoretic limit.

The results are as impressive as two years ago, when they broke the Donoho-Tanner phase transition.
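To see why such operators are fast and memory-efficient, here is a minimal sketch (mine, not the authors' code) of a structured measurement operator of the kind the abstract describes: a fast Walsh-Hadamard transform combined with random sign flips and row subsampling, so measurements cost O(n log n) and the dense n-by-n matrix is never stored. The function names, parameters, and the specific randomization scheme are illustrative assumptions.

```python
import numpy as np

def fwht(x):
    """Fast Walsh-Hadamard transform in O(n log n); len(x) must be a power of 2."""
    x = np.asarray(x, dtype=float).copy()
    h = 1
    while h < len(x):
        for i in range(0, len(x), 2 * h):
            a = x[i:i + h].copy()
            b = x[i + h:i + 2 * h].copy()
            x[i:i + h] = a + b          # butterfly: sums
            x[i + h:i + 2 * h] = a - b  # butterfly: differences
        h *= 2
    return x

def hadamard_measure(x, rows, signs):
    """Subsampled, sign-randomized Hadamard measurements of x.

    Equivalent to selecting `rows` of (H @ diag(signs)) @ x, but computed
    with the fast transform instead of materializing H.
    """
    return fwht(signs * x)[rows] / np.sqrt(len(x))

rng = np.random.default_rng(0)
n, m, k = 1024, 256, 10              # signal length, measurements, sparsity
signs = rng.choice([-1.0, 1.0], n)   # random diagonal "randomizes" the structure
rows = rng.choice(n, m, replace=False)

x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)  # k-sparse signal
y = hadamard_measure(x, rows, signs)  # m compressed measurements
```

An AMP reconstruction loop would then repeatedly apply this operator and its adjoint; the fast transform is what makes large n practical.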

**Join the CompressiveSensing subreddit or the Google+ Community and post there!**

Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.

## 2 comments:

This is fascinating. In Monte Carlo sampling for rendering and computational finance we have been switching from random sampling to low-discrepancy sequences. The LDS gives better convergence than random sampling on a variety of functions which are locally smooth or have bounded variation. I wonder if the work on (t,m,s)-nets can benefit the design of the measurement matrices.
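As a toy illustration of the convergence gap the commenter mentions (my own sketch, not from the post or comment), here is a comparison of pseudo-random sampling against the van der Corput sequence, the simplest one-dimensional low-discrepancy sequence, for integrating x^2 on [0, 1]:

```python
import random

def van_der_corput(n, base=2):
    """First n points of the van der Corput low-discrepancy sequence."""
    points = []
    for i in range(1, n + 1):
        q, denom = 0.0, 1.0
        while i > 0:
            denom *= base
            i, r = divmod(i, base)
            q += r / denom  # reflect the base-b digits of i about the radix point
        points.append(q)
    return points

def integrate(points):
    """Estimate the integral of x^2 over [0, 1] by averaging over sample points."""
    return sum(p * p for p in points) / len(points)

n = 1024
random.seed(0)
err_mc = abs(integrate([random.random() for _ in range(n)]) - 1 / 3)
err_lds = abs(integrate(van_der_corput(n)) - 1 / 3)
# For functions of bounded variation, the low-discrepancy error decays
# roughly like O(log n / n), versus O(1 / sqrt(n)) for plain Monte Carlo.
```

The same evenly-spread-but-deterministic flavor is what (t,m,s)-nets generalize to higher dimensions.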

Or, inversely, how those matrices could be applied to rendering.

Igor.
