## Friday, February 25, 2011

### CS: Sparse Signal Recovery with Temporally Correlated Source Vectors Using Sparse Bayesian Learning, Dagstuhl presentations, JOINC 2011

Zhilin Zhang just sent me the following:
Hi, Igor,
I am a reader of your Nuit Blanche and have benefited a lot from it. Recently I have done some work on joint sparse recovery, also called the multiple measurement vector (MMV) model. As you know, the MMV model is an extension of the basic CS model (the single measurement vector model). Most algorithms for the basic CS model can be extended to the MMV model by imposing an $\ell_p$ norm on the rows of the solution matrix, e.g. $\ell_2$ or $\ell_\infty$. However, these norm operations are blind to the structure of each row of the solution matrix. In other words, they ignore (rather than exploit) the correlations among the elements in each row. In my paper, I derive two algorithms that exploit these correlations and achieve much better performance. There are also some interesting phenomena in the paper (see Fig. 7 and Fig. 8), and I would like to know whether there are any theories that can explain them.
Best regards,

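As a rough illustration of the row-wise penalty Zhilin describes (a sketch of the generic $\ell_{p,1}$ mixed norm, not any specific published algorithm; the function name is mine), one can see why such penalties are blind to within-row structure:

```python
import numpy as np

# Sketch of the row-wise mixed norm used to extend single-vector CS
# penalties to the MMV model: a solution matrix X (N rows, L measurement
# vectors) is encouraged to be row-sparse by summing the l_p norms of
# its rows (the l_{p,1} mixed norm).

def mixed_norm(X, p=2):
    """Sum of the l_p norms of the rows of X."""
    if np.isinf(p):
        row_norms = np.abs(X).max(axis=1)
    else:
        row_norms = (np.abs(X) ** p).sum(axis=1) ** (1.0 / p)
    return float(row_norms.sum())

# The penalty only sees row energies, which is Zhilin's point: these two
# matrices get the same l_{2,1} value even though one active row is
# constant over time (highly correlated) and the other concentrates all
# its energy in a single snapshot.
X1 = np.zeros((5, 4)); X1[1] = [1.0, 1.0, 1.0, 1.0]
X2 = np.zeros((5, 4)); X2[1] = [2.0, 0.0, 0.0, 0.0]
print(mixed_norm(X1), mixed_norm(X2))  # both 2.0
```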
I then asked Zhilin if his codes would be released at some point in time. Zhilin kindly answered with:

... I will release the codes on my homepage once the paper is accepted. But anybody can email me to ask for the codes before then...

We address the sparse signal recovery problem in the context of multiple measurement vectors (MMV) when elements in each nonzero row of the solution matrix are temporally correlated. Existing algorithms do not consider such temporal correlations, and thus their performance degrades significantly in their presence. In this work, we propose a block sparse Bayesian learning framework which models the temporal correlations. In this framework we derive two sparse Bayesian learning (SBL) algorithms, which have superior recovery performance compared to existing algorithms, especially in the presence of high temporal correlations. Furthermore, our algorithms are better at handling highly underdetermined problems and require less row-sparsity of the solution matrix. We also provide analysis of the global and local minima of their cost function, and show that the SBL cost function has the very desirable property that the global minimum is at the sparsest solution to the MMV problem. Extensive experiments also provide some interesting results that motivate future theoretical research on the MMV model.
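To make the abstract's setting concrete, here is a toy data-generation sketch of an MMV problem whose active rows are temporally correlated through an AR(1) process. This is only my own illustration of the problem setup, not the paper's SBL algorithms; all names and parameter values here are assumptions.

```python
import numpy as np

# Toy MMV problem Y = Phi @ X + noise whose nonzero rows are temporally
# correlated via a stationary AR(1) process with coefficient beta.
# n measurements, m dictionary columns, k active rows, L snapshots.

def make_mmv_problem(n=25, m=100, k=5, L=4, beta=0.9, noise_std=0.01, seed=0):
    rng = np.random.default_rng(seed)
    Phi = rng.standard_normal((n, m)) / np.sqrt(n)   # sensing matrix
    X = np.zeros((m, L))                             # row-sparse solution
    support = rng.choice(m, size=k, replace=False)   # k active rows
    for i in support:
        x = rng.standard_normal()
        for t in range(L):
            X[i, t] = x
            # stationary AR(1) step: corr(X[i, t], X[i, t+1]) = beta
            x = beta * x + np.sqrt(1 - beta**2) * rng.standard_normal()
    Y = Phi @ X + noise_std * rng.standard_normal((n, L))
    return Phi, X, Y, support

Phi, X, Y, support = make_mmv_problem()
# empirical lag-1 correlation across the active rows
active = X[np.sort(support)]
r = np.corrcoef(active[:, :-1].ravel(), active[:, 1:].ravel())[0, 1]
```

With beta near 1, consecutive columns of each active row are nearly identical, which is the regime where the abstract says row-norm-based methods degrade.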

Mário Figueiredo also sent me the following:
Hi Igor,
This workshop, which took place a couple of weeks ago in Dagstuhl, Germany, may be of interest to Nuit Blanche: http://www.dagstuhl.de/en/program/calendar/semhp/?semnr=11051
The slides of many of the talks are available here: http://www.dagstuhl.de/mat/index.en.phtml?11051
Best regards,
Mario.

The presentations related to compressed sensing are listed below:

Thanks Mário

Last year, there was a Journées Imagerie Optique Non Conventionnelle at ESPCI that featured some folks from Saint Etienne (or was it Clermont?) who were doing the same type of holographic work as above (except they did not use the sparsity of the scene). I was struck by how CS could be used for that purpose; well, it looks like this is already being undertaken. Now the question is: how do you change the sensor as a result? This year, the meeting will take place at ESPCI again on March 28-29. The featured presentations will be:

« Mesure de la matrice de transmission d'un milieu complexe: application à l'imagerie » ("Measuring the transmission matrix of a complex medium: application to imaging"), Sylvain Gigan, Institut Langevin, Paris.
« Détection et localisation de défauts dans des environnements bruités » ("Detection and localization of defects in noisy environments"), Josselin Garnier, Université Paris VII

We mentioned these two research areas before.