Hi Igor,

Here is a quick summary of our 2 papers with Bob Sturm and Mark Plumbley (the ICASSP proceedings are not on IEEE Xplore yet).

In these two articles we analyze whether, for a given dictionary, sparser signals are always easier to recover by greedy algorithms. As background, Joel Tropp's approximations of the Exact Recovery Condition (ERC) using the coherence or cumulative coherence scale monotonically with the size of the support. However, Mark Plumbley proved in an earlier article that the ERC itself does not. In this work we proved three main results:

1) (IEEE Trans. Info. Theory) Plumbley's proof used non-normalized dictionaries, but the formulation of the ERC that gives recovery guarantees differs in that case. So while Plumbley's result is mathematically correct, it cannot be extended into a result on sparse recovery.

2) (ICASSP) There can be a support I that is uniformly recoverable by OMP and a sub-support I' included in I that is not;

3) (ICASSP) but for any dictionary Phi and any s less than spark(Phi)/2 (the only range that matters, since sparse representations are not all unique for s greater than or equal to spark(Phi)/2), if all supports of size s are uniformly recoverable by OMP, then all supports of smaller size are too.

Best wishes,
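The ERC the email refers to can be made concrete with a short numerical check. Below is a minimal sketch (mine, not from the papers) assuming Tropp's formulation for a column-normalized dictionary Phi: OMP and BP recover every signal supported on I whenever max over j not in I of ||Phi_I^+ phi_j||_1 < 1. The function name `erc` is my own.

```python
# Sketch of Tropp's Exact Recovery Condition (ERC) check (illustrative,
# not the papers' code): for a column-normalized dictionary Phi and
# support I, uniform recovery on I holds if
#     max_{j not in I} || pinv(Phi_I) @ phi_j ||_1 < 1.
import numpy as np

def erc(Phi, I):
    """Return max_{j not in I} ||Phi_I^+ phi_j||_1; recovery holds if < 1."""
    pinv_I = np.linalg.pinv(Phi[:, I])
    outside = [j for j in range(Phi.shape[1]) if j not in I]
    return max(np.abs(pinv_I @ Phi[:, j]).sum() for j in outside)

# Tiny example: two orthonormal atoms plus a third, correlated, unit-norm atom.
Phi = np.array([[1.0, 0.0, 0.6],
                [0.0, 1.0, 0.8]])
print(erc(Phi, [0, 1]))  # 1.4 >= 1: uniform recovery on I = {0, 1} not guaranteed
```

Note that the condition is about a support, not a sparsity level, which is exactly why the nestedness question in these papers is non-trivial.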

Here are the papers:

On Theorem 10 in "On Polar Polytopes and the Recovery of Sparse Representations", Bob L. Sturm, Boris Mailhé, Mark D. Plumbley

Abstract: It is shown that Theorem 10 (Non-Nestedness of ERC) in [Plumbley, IEEE Trans. Info. Theory, vol. 53, pp. 3188, Sep. 2007] neglects the derivations of the exact recovery conditions (ERCs) of constrained ℓ1-minimization (BP) and orthogonal matching pursuit (OMP). This means that it does not reflect the recovery properties of these algorithms. Furthermore, an ERC of BP more general than that in [Tropp, IEEE Trans. Info. Theory, vol. 50, pp. 2231, Oct. 2004] is shown.

Boris Mailhé, Bob L. Sturm, Mark D. Plumbley

Abstract: In this work, we study the links between the recovery properties of sparse signals for Orthogonal Matching Pursuit (OMP) and the whole General MP class over nested supports. We show that the optimality of those algorithms is not locally nested: there is a dictionary and supports I and J with J included in I such that OMP will recover all signals of support I, but not all signals of support J. We also show that the optimality of OMP is globally nested: if OMP can recover all s-sparse signals, then it can recover all s′-sparse signals with s′ smaller than s. We also provide a tighter version of Donoho and Elad's spark theorem, which allows us to complete Tropp's proof that sparse representation algorithms can only be optimal for all s-sparse signals if s is strictly lower than half the spark of the dictionary.
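The spark bound in the abstract can be illustrated with a brute-force computation. A toy sketch (mine, exponential in the number of atoms, so only for tiny dictionaries): spark(Phi) is the size of the smallest linearly dependent set of columns, and all s-sparse representations are guaranteed unique only when s is strictly below spark(Phi)/2.

```python
# Brute-force spark computation (illustrative only): the spark is the
# size of the smallest linearly dependent subset of columns of Phi.
import itertools
import numpy as np

def spark(Phi):
    """Smallest k such that some k columns of Phi are linearly dependent
    (np.inf if the columns are in general position with full rank)."""
    n = Phi.shape[1]
    for k in range(1, n + 1):
        for cols in itertools.combinations(range(n), k):
            if np.linalg.matrix_rank(Phi[:, list(cols)]) < k:
                return k
    return np.inf

# e1, e2, and e1 + e2: every pair is independent, all three are dependent.
Phi = np.array([[1.0, 0.0, 1.0],
                [0.0, 1.0, 1.0]])
print(spark(Phi))  # 3, so s-sparse representations are unique only for s < 3/2
```

With spark(Phi) = 3 here, only 1-sparse representations are guaranteed unique, matching the s < spark(Phi)/2 range in which the global nestedness result is stated.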



