Monday, November 02, 2009

CS: Real vs. complex null space properties, Non-Parametric Bayesian Dictionary Learning, Coordinate Descent Optimization code

Today, we have two papers and two codes:

Real vs. complex null space properties for sparse vector recovery by Simon Foucart and Rémi Gribonval. The abstract reads:
We identify and solve an overlooked problem about the characterization of underdetermined systems of linear equations for which sparse solutions have minimal ℓ1-norm. This characterization is known as the null space property. When the system has real coefficients, sparse solutions can be considered either as real or complex vectors, leading to two seemingly distinct null space properties. We prove that the two properties actually coincide by establishing a link with a problem about convex polygons in the real plane. Incidentally, we also show the equivalence between stable null space properties which account for the stable reconstruction by ℓ1-minimization of vectors that are not exactly sparse.
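As a reminder, the null space property of order s says that every nonzero vector v in the null space of A satisfies ||v_S||_1 < ||v_{S^c}||_1 for every index set S of size s. One can probe it numerically with random null-space vectors; here is a minimal sketch (function name, tolerance, and sampling scheme are my own, and random sampling can only refute the property, never certify it):

```python
import numpy as np

def satisfies_nsp(A, s, n_trials=200, seed=0):
    """Empirically probe the null space property of order s:
    every nonzero v in ker(A) should satisfy ||v_S||_1 < ||v_{S^c}||_1
    for every |S| = s. We only test random null-space vectors, so a
    True answer is merely 'no counterexample found'."""
    rng = np.random.default_rng(seed)
    # Orthonormal basis of ker(A) from the trailing right singular vectors.
    _, sv, Vt = np.linalg.svd(A)
    rank = int(np.sum(sv > 1e-10))
    N = Vt[rank:]                     # rows span the null space
    if N.shape[0] == 0:
        return True                   # trivial null space: NSP holds vacuously
    for _ in range(n_trials):
        v = rng.standard_normal(N.shape[0]) @ N
        mags = np.sort(np.abs(v))[::-1]
        # Worst case is S = indices of the s largest entries of |v|.
        if mags[:s].sum() >= mags[s:].sum():
            return False
    return True

A = np.random.default_rng(1).standard_normal((8, 12))
print(satisfies_nsp(A, s=2))
```

Note that a genuine certificate requires checking all of ker(A), not a random sample; the sketch is only a cheap sanity check.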

Non-Parametric Bayesian Dictionary Learning for Sparse Image Representations by Mingyuan Zhou, Haojun Chen, John Paisley, Lu Ren, Guillermo Sapiro and Lawrence Carin. The abstract reads:
Non-parametric Bayesian techniques are considered for learning dictionaries for sparse image representations, with applications in denoising, inpainting and compressive sensing (CS). The beta process is employed as a prior for learning the dictionary, and this non-parametric method naturally infers an appropriate dictionary size. The Dirichlet process and a probit stick-breaking process are also considered to exploit structure within an image. The proposed method can learn a sparse dictionary in situ; training images may be exploited if available, but they are not required. Further, the noise variance need not be known, and can be nonstationary. Another virtue of the proposed method is that sequential inference can be readily employed, thereby allowing scaling to large images. Several example results are presented, using both Gibbs and variational Bayesian inference, with comparisons to other state-of-the-art approaches.
From the BCS webpage:

BPFA image denoising and inpainting: The package includes the inference update equations and Matlab codes for image denoising and inpainting via the non-parametric Bayesian dictionary learning approach.

Download: BPFA.zip (Last Updated: Oct. 30, 2009)
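The claim that the beta process "naturally infers an appropriate dictionary size" can be seen already in the prior, before any data arrives: under the usual truncated beta-Bernoulli approximation, only a handful of the K candidate atoms are ever switched on. A tiny illustration (hyperparameter values and variable names are my own choices, not the paper's):

```python
import numpy as np

rng = np.random.default_rng(0)
K, N = 256, 1000          # truncation level, number of image patches
a, b = 1.0, 1.0           # beta-process hyperparameters (illustrative values)

# Truncated beta-Bernoulli approximation to the beta process:
#   pi_k ~ Beta(a/K, b*(K-1)/K),   z_nk ~ Bernoulli(pi_k)
pi = rng.beta(a / K, b * (K - 1) / K, size=K)
Z = rng.random((N, K)) < pi       # binary atom-usage indicators per patch

used = int((Z.sum(axis=0) > 0).sum())
print(f"{used} of {K} candidate atoms ever used")
```

Most of the pi_k are driven toward zero by the Beta(a/K, b(K-1)/K) prior, so the number of active atoms stays small even as K grows; in the full model, the posterior then adjusts that active set to the data.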


Back in March, I mentioned the following paper: Coordinate Descent Optimization for $\ell^1$ Minimization with Application to Compressed Sensing; a Greedy Algorithm by Yingying Li and Stanley Osher. The code is now available on MATLAB Central.
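For readers who want to see what coordinate descent on an $\ell^1$ problem looks like, here is a sketch of the standard cyclic soft-thresholding update for the lasso objective. This is the textbook variant, not the greedy coordinate selection of Li and Osher's paper, and all names are my own:

```python
import numpy as np

def soft_threshold(x, t):
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(A, b, lam, n_sweeps=100):
    """Cyclic coordinate descent for min_x 0.5*||Ax - b||_2^2 + lam*||x||_1.
    Each coordinate update is the exact scalar minimizer (a soft threshold)."""
    m, n = A.shape
    x = np.zeros(n)
    col_sq = (A ** 2).sum(axis=0)     # squared column norms
    r = b.copy()                      # running residual b - A @ x
    for _ in range(n_sweeps):
        for j in range(n):
            if col_sq[j] == 0:
                continue
            # Correlation of column j with the partial residual.
            rho = A[:, j] @ r + col_sq[j] * x[j]
            new_xj = soft_threshold(rho, lam) / col_sq[j]
            r += A[:, j] * (x[j] - new_xj)
            x[j] = new_xj
    return x

# Tiny compressed-sensing style demo: recover a 2-sparse vector.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60)) / np.sqrt(30)
x_true = np.zeros(60)
x_true[[5, 40]] = [1.5, -2.0]
b = A @ x_true
x_hat = lasso_cd(A, b, lam=0.01, n_sweeps=200)
```

The greedy variant in the paper picks, at each step, the coordinate whose update most decreases the objective rather than sweeping cyclically; the per-coordinate update itself is the same soft-thresholding step.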


Credit: OnOrbit, X-prize, Unreasonable rocket.
