Here is a potentially interesting AMP solver for analysis compressive sensing:
Generalized Approximate Message Passing for the Cosparse Analysis Model by Mark Borgerding and Philip Schniter
In "synthesis" compressive sensing (CS), one seeks a sparse coefficient vector that approximately matches a set of linear measurements, whereas in "analysis" CS, one instead seeks a (nonsparse) signal vector that matches a set of linear measurements while being sparse under a given linear transformation (e.g., the finite difference operator in the case of total variation regularization). The Approximate Message Passing (AMP) algorithm, first proposed by Donoho, Maleki, and Montanari, has established itself as an effective means of solving the synthesis CS problem but not the analysis CS problem. In this paper, we propose a novel interpretation of the generalized AMP algorithm, first proposed by Rangan, that provides a direct vehicle for solving the analysis CS problem. In addition, we propose a novel form of soft thresholding, based on the limit of the MMSE denoiser for a Bernoulli-Uniform prior with increasing support, that successfully mimics ℓ0 regularization. Extensive empirical experiments demonstrate the advantages of our proposed "Generalized AMP for Analysis" (GrAMPA) algorithm, in both accuracy and runtime, over several existing approaches based on greedy analysis pursuit, Douglas-Rachford splitting, and iteratively-reweighted-ℓ1.

The GrAMPA page is here.
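For readers unfamiliar with the baseline that the abstract contrasts against, here is a minimal sketch of the synthesis-CS AMP iteration of Donoho, Maleki, and Montanari with scalar soft thresholding. This is not the paper's GrAMPA algorithm (which targets the analysis model); the problem sizes and the threshold rule (a multiple of the residual RMS) are illustrative choices, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n, k = 250, 500, 25                         # measurements, signal length, sparsity
A = rng.standard_normal((m, n)) / np.sqrt(m)   # i.i.d. Gaussian, roughly unit-norm columns
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
y = A @ x0                                     # noiseless measurements

def soft(v, tau):
    """Soft-thresholding denoiser eta(v; tau)."""
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

x = np.zeros(n)
z = y.copy()
alpha = 2.0                                    # threshold scaling (tuning parameter)
for _ in range(50):
    r = x + A.T @ z                            # pseudo-data
    tau = alpha * np.sqrt(np.mean(z ** 2))     # threshold proportional to residual RMS
    x = soft(r, tau)
    # Onsager correction term: (||x||_0 / m) times the previous residual
    z = y - A @ x + (np.count_nonzero(x) / m) * z

nmse = np.linalg.norm(x - x0) / np.linalg.norm(x0)
```

The Onsager term is what distinguishes AMP from plain iterative soft thresholding; dropping it markedly slows convergence. In the analysis model treated by the paper, sparsity is instead imposed on a transform of the signal, which this synthesis-style iteration does not handle directly.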