Hey Igor, just giving you a heads-up on a conference paper to appear next month: J. Tan, D. Carmon, and D. Baron, "Optimal Estimation with Arbitrary Error Metrics in Compressed Sensing," to appear in the IEEE Statistical Signal Processing Workshop, Ann Arbor, MI, August 2012.
Here is the paper: Optimal Estimation with Arbitrary Error Metrics in Compressed Sensing by Jin Tan, Danielle Carmon, and Dror Baron. The abstract reads:
Noisy compressed sensing deals with the estimation of a system input from its noise-corrupted linear measurements. The performance of the estimation is usually quantified by some standard error metric such as squared error or support error. In this paper, we consider a noisy compressed sensing problem with any arbitrary error metric. We propose a simple, fast, and general algorithm that estimates the original signal by minimizing an arbitrary error metric defined by the user. We verify that, owing to the decoupling principle, our algorithm is optimal, and we describe a general method to compute the fundamental information-theoretic performance limit for any well-defined error metric. We provide an example where the metric is absolute error and give the theoretical performance limit for it. The experimental results show that our algorithm outperforms methods such as relaxed belief propagation, and reaches the suggested theoretical limit for our example error metric.
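The key idea behind the decoupling principle is that, after message passing, each signal entry effectively looks like a scalar observed through an additive Gaussian noise channel, so a metric-optimal estimate can be computed per entry by minimizing the posterior expected error. Here is a minimal, hypothetical sketch of that scalar step (not the authors' code; the spike-plus-Gaussian prior, grid, and noise level are illustrative assumptions):

```python
import numpy as np

# Decoupling principle (sketch): after relaxed BP, each entry behaves like a
# scalar x observed through an AWGN channel, q = x + N(0, sigma2). The
# metric-optimal estimate minimizes the posterior expected error metric.

def posterior(grid, prior, q, sigma2):
    """Posterior p(x | q) on a discrete grid for an AWGN observation q."""
    lik = np.exp(-(q - grid) ** 2 / (2 * sigma2))
    w = prior * lik
    return w / w.sum()

def metric_optimal_estimate(grid, post, metric):
    """argmin over candidates xhat of E[metric(xhat, X) | q]."""
    risks = [np.sum(post * metric(xh, grid)) for xh in grid]
    return grid[int(np.argmin(risks))]

# Illustrative sparse prior: a point mass at 0 plus a Gaussian slab.
grid = np.linspace(-4, 4, 801)
spike = (np.abs(grid) < 1e-9).astype(float)          # atom at 0
slab = np.exp(-grid ** 2 / 2)
slab /= slab.sum()
prior = 0.9 * spike + 0.1 * slab                     # assumed sparsity 0.9

q, sigma2 = 1.2, 0.5                                 # assumed channel output
post = posterior(grid, prior, q, sigma2)

# Squared error recovers the posterior mean; absolute error (the paper's
# worked example) recovers the posterior median, which here sits on the spike.
se = metric_optimal_estimate(grid, post, lambda xh, x: (xh - x) ** 2)
ae = metric_optimal_estimate(grid, post, lambda xh, x: np.abs(xh - x))
print(se, ae)
```

The point of the toy example is that different metrics yield genuinely different estimates from the same posterior: the squared-error estimate is pulled toward the observation, while the absolute-error estimate snaps to the spike at zero.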
and the attendant Matlab software can be found here:
Thanks Dror !
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.