
Tuesday, April 08, 2008

Compressed Sensing: Update on MMA14 and Bayesian Multi-Task Compressive Sensing with Dirichlet Process Priors

In yesterday's Monday Morning Algorithm (Monday Morning Algorithm 14: A Comparison of the Reconstruction Capability of CoSaMP, OMP, Subspace Pursuit and Reweighted Lp in Compressive Sensing) the script used a function (dftmtx) that is part of MATLAB's Signal Processing Toolbox. In the spirit of keeping these examples as software-agnostic as possible, I have changed the "offending" line from

phi=(kron(dftmtx(c).',dftmtx(l)));

to

phi = (kron((fft(eye(c))).',fft(eye(l))));
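For readers who want to check that the substitution is exact, the same equivalence can be sketched in Python/NumPy (a translation for illustration only; `dft_matrix` is a hypothetical helper built from the DFT definition, not part of the original script):

```python
import numpy as np

# The n x n DFT matrix from its definition W[j, k] = exp(-2j*pi*j*k/n).
# This is what MATLAB's dftmtx(n) returns.
def dft_matrix(n):
    j, k = np.meshgrid(np.arange(n), np.arange(n), indexing="ij")
    return np.exp(-2j * np.pi * j * k / n)

c, l = 4, 3

# MATLAB's fft acts on the columns of a matrix, as does NumPy's fft
# along axis=0, so fft(eye(n)) reproduces the DFT matrix without
# needing the toolbox function.
assert np.allclose(np.fft.fft(np.eye(c), axis=0), dft_matrix(c))

# Hence the two ways of forming the Kronecker measurement matrix agree
# (.' in MATLAB is the non-conjugate transpose, i.e. NumPy's .T):
phi_dftmtx = np.kron(dft_matrix(c).T, dft_matrix(l))
phi_fft = np.kron(np.fft.fft(np.eye(c), axis=0).T,
                  np.fft.fft(np.eye(l), axis=0))
assert np.allclose(phi_dftmtx, phi_fft)
```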

I have also added a feature to the solvers so that the tolerance used to separate a good reconstruction from a bad one is now a parameter (tol1). A tolerance of 10^-3 makes the solution from the reweighted scheme look competitive with the greedy solvers.

A tolerance of 10^-8 takes away that advantage and relegates the reweighted algorithms to the less appealing category (which is also the reason one wants to get rid of the wrong part of the wavelet tree, i.e. too many small oscillations).
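To make the role of that parameter concrete, here is a sketch of such a parametric success test (in Python; `is_recovered` is a hypothetical helper standing in for the actual check in the zip file):

```python
import numpy as np

def is_recovered(x_true, x_hat, tol1=1e-3):
    # Declare a reconstruction "good" when the relative l2 error
    # falls below the parametric tolerance tol1.
    return np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true) < tol1

# A 3-sparse test signal.
x = np.zeros(100)
x[[3, 17, 42]] = [1.0, -2.0, 0.5]

# A reconstruction contaminated by tiny residual oscillations: it
# passes the loose 1e-3 test but fails the strict 1e-8 one.
x_hat = x + 1e-5 * np.random.default_rng(0).standard_normal(100)
print(is_recovered(x, x_hat, tol1=1e-3))
print(is_recovered(x, x_hat, tol1=1e-8))
```

The same reconstruction thus moves between the "good" and "bad" columns purely as a function of tol1, which is why the reweighted schemes look better or worse depending on how strict one is.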


The new version is included in the new zip file.

Here is a new paper on Bayesian Compressive Sensing by the group at Duke. Yuting Qi, Dehong Liu, David Dunson and Lawrence Carin just released Bayesian Multi-Task Compressive Sensing with Dirichlet Process Priors. The Abstract reads:


Compressive sensing (CS) is an emerging field that, under appropriate conditions, can significantly reduce the number of measurements required for a given signal. Specifically, if the m-dimensional signal u is sparse in an orthonormal basis represented by the m x m matrix Ψ, then one may infer u based on n &lt; m projection measurements. If u = Ψθ, where θ are the sparse coefficients in basis Ψ, then the CS measurements are represented by v = ΦΨθ, where v is an n-dimensional vector and Φ is an n x m projection matrix. There are several ways in which the matrix Φ may be constituted, and one typically inverts for the signal u by solving v = ΦΨθ under the constraint that θ is sparse (with this often performed with l1 regularization). In many applications, one is interested in multiple signals that may be measured in multiple CS-type measurements, where here each corresponds to a sensing “task”. It is possible to improve the CS performance (e.g., requiring fewer total CS measurements) by exploiting the statistical inter-relationships of the associated CS measurements, jointly inverting for the M underlying signals. In this paper we propose a novel multi-task compressive sensing framework based on a Bayesian formalism, where a sparseness prior is adopted. The key challenge is that not all of the measured signals are necessarily appropriate for sharing when performing inversion, and one must therefore infer what sharing of data across the M “tasks” is appropriate. Toward this end, a Dirichlet process (DP) prior is employed, which provides a principled means of inferring the appropriate sharing mechanisms (i.e., it infers how best to cluster the M CS measurements, with CS inversion effectively performed separately within each cluster). The posteriors of the sparse signals as well as the sharing mechanism are inferred among all CS tasks. A variational Bayesian (VB) inference algorithm is employed to estimate the full posterior on the model parameters, and an even more efficient simplified VB DP algorithm is also considered.
I can't wait to use the algorithm that implements the idea.
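In the meantime, the measurement model v = ΦΨθ from the abstract is easy to sketch (a minimal Python illustration, with Ψ taken to be the identity so the signal is sparse in the canonical basis; the dimensions are arbitrary choices, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 128, 40, 5      # signal dimension, measurements, sparsity

# Sparse coefficients theta in an orthonormal basis Psi
# (identity here, purely for illustration).
theta = np.zeros(m)
theta[rng.choice(m, size=k, replace=False)] = rng.standard_normal(k)
Psi = np.eye(m)
u = Psi @ theta           # the m-dimensional signal

# Random n x m projection matrix Phi; the CS measurements are
# v = Phi Psi theta, an n-dimensional vector with n << m.
Phi = rng.standard_normal((n, m)) / np.sqrt(n)
v = Phi @ Psi @ theta
print(v.shape)            # far fewer numbers than the original m
```

The multi-task setting of the paper then has M such measurement vectors v_i, and the DP prior decides which of them should share information during inversion.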
