Today we have a new solver, EM-BG-AMP, by Jeremy Vila and Phil Schniter that produces recovery rates well beyond the Donoho-Tanner phase transition. In some cases, like the one above, its results are equivalent to those of the recent solver of Florent Krzakala, Marc Mézard, François Sausset, Yifan Sun, and Lenka Zdeborová (Statistical physics-based reconstruction in compressed sensing). The main difference is that the KMSSZ solver is tailored to a particular measurement matrix, whereas the EM-BG-AMP solver is not. The accompanying paper to the EM-BG-AMP solver is: Expectation-Maximization Bernoulli-Gaussian Approximate Message Passing by Jeremy Vila and Philip Schniter. The abstract reads:
The approximate message passing (AMP) algorithm originally proposed by Donoho, Maleki, and Montanari yields a computationally attractive solution to the usual ℓ1-regularized least-squares problem faced in compressed sensing, whose solution is known to be robust to the signal distribution. When the signal is drawn i.i.d. from a marginal distribution that is not least-favorable, better performance can be attained using a Bayesian variation of AMP. The latter, however, assumes that the distribution is perfectly known. In this paper, we navigate the space between these two extremes by modeling the signal as i.i.d. Bernoulli-Gaussian (BG) with unknown prior sparsity, mean, and variance, and the noise as zero-mean Gaussian with unknown variance, and we simultaneously reconstruct the signal while learning the prior signal and noise parameters. To accomplish this task, we embed the BG-AMP algorithm within an expectation maximization (EM) framework. Numerical experiments confirm the excellent performance of our proposed EM-BG-AMP on a range of signal types.
From the website:
EM-BG-AMP: An Algorithm for Sparse Reconstruction
The Expectation-Maximization Bernoulli-Gaussian Approximate Message Passing (EM-BG-AMP) is an algorithm designed to recover a signal x from (possibly) noisy linear measurements y = Ax + w. One particularly interesting case is when the measurements are undersampled (i.e., when A has fewer rows than columns, M &lt; N). With sufficient signal sparsity and a well-conditioned mixing matrix A, signal recovery is possible.
EM-BG-AMP assumes a sparsity-promoting i.i.d. Bernoulli-Gaussian prior, p(x_n) = (1 − λ)δ(x_n) + λ N(x_n; θ, φ), and i.i.d. additive Gaussian noise, w_m ~ N(0, ψ). It then uses the Generalized Approximate Message Passing (GAMP) algorithm to evaluate the means (MMSE estimates) and variances of the posterior p(x | y). After running BG-AMP, we then find the Expectation-Maximization (EM) updates of the parameters λ, θ, φ, and ψ, and repeat BG-AMP until convergence. (See publications below for details.)
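The official implementation is in MATLAB (see below), but the alternation described above — AMP with a scalar Bernoulli-Gaussian MMSE denoiser on the inside, EM updates of the prior and noise parameters on the outside — can be sketched in a few dozen lines. The following Python sketch is an unofficial, simplified reimplementation under standard assumptions (i.i.d. Gaussian A with unit-norm columns on average, a single scalar effective-noise variance); the initial parameter guesses and iteration counts are illustrative choices, not the authors' tuning.

```python
import numpy as np

def gaussian(x, mu, v):
    """Gaussian pdf N(x; mu, v) with variance v."""
    return np.exp(-(x - mu)**2 / (2.0*v)) / np.sqrt(2.0*np.pi*v)

def bg_denoise(r, tau, lam, theta, phi):
    """Scalar MMSE denoiser for the Bernoulli-Gaussian prior
    p(x) = (1-lam)*delta(x) + lam*N(x; theta, phi),
    given a pseudo-observation r = x + N(0, tau)."""
    # posterior probability that each coefficient is 'on' (slab)
    num = lam * gaussian(r, theta, phi + tau)
    pi = num / (num + (1.0 - lam) * gaussian(r, 0.0, tau))
    # moments of the slab posterior (product of two Gaussians)
    nu = 1.0 / (1.0/tau + 1.0/phi)
    gamma = nu * (r/tau + theta/phi)
    xhat = pi * gamma                       # posterior mean
    xvar = pi * (nu + gamma**2) - xhat**2   # posterior variance
    return xhat, xvar, pi, gamma, nu

def em_bg_amp(y, A, n_em=20, n_amp=25):
    """Simplified EM-BG-AMP: inner AMP loop + outer EM parameter updates."""
    M, N = A.shape
    # crude, uninformed initial guesses for (lam, theta, phi, psi)
    lam, theta, phi, psi = 0.1, 0.0, np.var(y) * N / M, 1e-3
    xhat, z = np.zeros(N), y.copy()
    for _ in range(n_em):
        xvar = np.full(N, lam * phi)
        for _ in range(n_amp):
            tau = psi + xvar.mean() * N / M       # effective noise at the denoiser input
            r = xhat + A.T @ z                    # AMP pseudo-data
            xhat, xvar, pi, gamma, nu = bg_denoise(r, tau, lam, theta, phi)
            onsager = z * (xvar.sum() / (M * tau))  # Onsager correction
            z = y - A @ xhat + onsager            # corrected residual
        # EM updates of the prior/noise parameters from the posterior
        lam = pi.mean()
        theta = (pi * gamma).sum() / pi.sum()
        phi = (pi * (nu + (gamma - theta)**2)).sum() / pi.sum()
        psi = np.mean((y - A @ xhat)**2)
    return xhat

# demo: recover a ~10%-sparse signal from 2x-undersampled noisy measurements
rng = np.random.default_rng(0)
N, M = 256, 128
x = (rng.random(N) < 0.1) * rng.normal(0.0, 1.0, N)
A = rng.normal(0.0, 1.0/np.sqrt(M), (M, N))
y = A @ x + 1e-3 * rng.normal(size=M)
xhat = em_bg_amp(y, A)
nmse = np.sum((xhat - x)**2) / np.sum(x**2)
```

Note that nothing in the loop is told the true sparsity level, signal variance, or noise variance: the EM layer infers all four parameters from the data, which is exactly what distinguishes EM-BG-AMP from a Bayesian AMP run with an oracle prior.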
The EM-BG-AMP MATLAB function attempts to recover a signal from measurements observed through the above linear model. Unlike some other compressive sensing algorithms (e.g., LASSO), EM-BG-AMP does not require any tuning parameters. EM-BG-AMP does, however, have some optional inputs to improve computation time/accuracy. Currently, EM-BG-AMP supports both real and complex signals. Examples of the function's usage can be seen in the Examples section.
Advantages of EM-BG-AMP
There are plenty of compressive-sensing algorithms available. Why use EM-BG-AMP? Here are a few advantages of our approach.
- Excellent recovery performance over a wide range of signal/matrix types.
- Very good complexity scaling with problem dimensions.
- No required tuning parameters.
Please contact the authors (Jeremy Vila and Philip Schniter) regarding any questions.
The theory behind EM-BG-AMP is detailed in:
- J. Vila and P. Schniter, "Expectation-Maximization Bernoulli-Gaussian Approximate Message Passing," to appear in Proc. Asilomar Conf. on Signals, Systems, and Computers (Pacific Grove, CA), Nov. 2011.
- J. Vila and P. Schniter, "An Empirical-Bayes Approach to Compressive Sensing via Approximate Message Passing," presented at the Sensing and Analysis of High Dimensional Data (SAHD) Workshop (Durham, NC), Jul. 2011.
This work has been supported in part by NSF-I/UCRC grant IIP-0968910, by NSF grant CCF-1018368, and by DARPA/ONR grant N66001-10-1-4090.
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.