Phase transitions, AMP, and hyperspectral imaging: I already like it. Phil Schniter just sent me the following:
Hi Igor,

[Our] paper ... is now public: http://arxiv.org/abs/1310.2806

The main objective is to solve a special case of the compressed sensing problem: that where the signal is non-negative and also satisfies given linear equality constraints, such as simplex-constrained signals. Since you enjoy phase transition curves, you may find interest in Figures 1-2.

A by-product of our work on this problem is a method to tune AMP-based LASSO (or variants on LASSO) using the EM algorithm. Figure 4 shows (among other things) a comparison of our EM-tuned non-negative LASSO with that of a convex solver (TFOCS) tuned using an oracle: we are basically matching the mean-squared-error performance at all points (while running 1-2 orders of magnitude faster).

Cheers,
Phil
Thank you Phil ! Here is the paper: An Empirical-Bayes Approach to Recovering Linearly Constrained Non-Negative Sparse Signals by Jeremy Vila, Phil Schniter
We propose two novel approaches to the recovery of an (approximately) sparse signal from noisy linear measurements in the case that the signal is a priori known to be non-negative and obey given linear equality constraints, such as simplex signals. This problem arises in, e.g., hyperspectral imaging, portfolio optimization, density estimation, and certain cases of compressive imaging. Our first approach solves a linearly constrained non-negative version of LASSO using the max-sum version of the generalized approximate message passing (GAMP) algorithm, where we consider both quadratic and absolute loss, and where we propose a novel approach to tuning the LASSO regularization parameter via the expectation maximization (EM) algorithm. Our second approach is based on the sum-product version of the GAMP algorithm, where we propose the use of a Bernoulli non-negative Gaussian-mixture signal prior and a Laplacian likelihood, and propose an EM-based approach to learning the underlying statistical parameters. In both approaches, the linear equality constraints are enforced by augmenting GAMP's generalized-linear observation model with noiseless pseudo-measurements. Extensive numerical experiments demonstrate the state-of-the-art performance of our proposed approaches.
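To get a feel for the pseudo-measurement idea mentioned in the abstract, here is a minimal sketch (not the authors' GAMP code): the simplex constraint 1'x = 1 is appended to the sensing matrix as a heavily weighted, effectively noiseless row, and the resulting non-negative LASSO is solved by plain projected gradient descent. The sizes, weights, and regularization value below are illustrative assumptions, not values from the paper, and the regularization parameter is fixed by hand rather than EM-tuned.

```python
# Sketch: linear equality constraints as noiseless pseudo-measurements,
# then non-negative LASSO via projected gradient descent (illustrative only).
import numpy as np

rng = np.random.default_rng(0)

# Illustrative problem sizes: signal length, measurements, sparsity.
n, m, k = 200, 80, 10

# Ground-truth k-sparse signal on the simplex (non-negative, sums to 1).
x_true = np.zeros(n)
support = rng.choice(n, k, replace=False)
x_true[support] = rng.random(k)
x_true /= x_true.sum()

A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true + 0.001 * rng.standard_normal(m)

# Pseudo-measurement: append the row 1' with target value 1, weighted heavily
# so it behaves like a (nearly) noiseless observation of the constraint 1'x = 1.
w = 100.0
A_aug = np.vstack([A, w * np.ones((1, n))])
y_aug = np.concatenate([y, [w * 1.0]])

# Non-negative LASSO on the augmented system via projected gradient descent.
lam = 1e-3                               # fixed regularization weight (not EM-tuned)
L = np.linalg.norm(A_aug, 2) ** 2        # Lipschitz constant of the quadratic term
x = np.full(n, 1.0 / n)
for _ in range(2000):
    grad = A_aug.T @ (A_aug @ x - y_aug) + lam   # note: ||x||_1 = 1'x when x >= 0
    x = np.maximum(x - grad / L, 0.0)            # project onto the non-negative orthant

print("relative error:", np.linalg.norm(x - x_true) / np.linalg.norm(x_true))
```

The point of the sketch is only the constraint-as-measurement trick; the paper's contribution is doing this inside max-sum/sum-product GAMP with EM-learned parameters, which is far faster at these problem sizes than generic solvers.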
- Adaptive Compressive Noncoherent Change Detection,
- AMP Tools for Large-Scale Inference,
- A Primer on Compressive Sensing,
Join the CompressiveSensing subreddit or the Google+ Community and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.