We mentioned how interesting this line of work was a year ago (see Tree_approx: A Fast Approximation Algorithm for Tree-Sparse Recovery - implementation -) and, for what it's worth, a follow-up to that work received the best paper award at ICML. This is interesting on several levels, one of them being that the paper was not presented at SPARS, which happened to take place at the same time in Cambridge. Another interesting aspect is that it is very much a sparsity-based paper that cares about full reconstruction, as opposed to the simpler and coarser classification tasks that are the usual fare of the machine learning literature. Finally, the connection to the Prize-Collecting Steiner Tree problem, through the development of an algorithm for a variant called the Prize-Collecting Steiner Forest problem, is likely to have important applications beyond the reconstruction problem of this paper. Without further ado:
A Nearly-Linear Time Framework for Graph-Structured Sparsity by Chinmay Hegde, Piotr Indyk, Ludwig Schmidt
We introduce a framework for sparsity structures defined via graphs. Our approach is flexible and generalizes several previously studied sparsity models. Moreover, we provide efficient projection algorithms for our sparsity model that run in nearly-linear time. In the context of sparse recovery, we show that our framework achieves an information-theoretically optimal sample complexity for a wide range of parameters. We complement our theoretical analysis with experiments demonstrating that our algorithms also improve on prior work in practice.
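As a small aside, here is a toy illustration (my own sketch, not taken from the paper) of what a sparsity structure "defined via a graph" can look like: the support of the signal, seen as a set of nodes in a known graph, is asked to form only a few connected clusters. The weighted graph model in the paper is more general, and the hard part, of course, is the nearly-linear-time projection onto it.

```python
# Toy sketch: count how many connected clusters a candidate support forms
# in a given graph. This is NOT the paper's model or code, just an
# illustration of a graph-defined support constraint.
from collections import deque

def num_components(support, adjacency):
    """Connected components of the subgraph induced by `support`."""
    support = set(support)
    seen, components = set(), 0
    for start in support:
        if start in seen:
            continue
        components += 1
        queue = deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            for v in adjacency[u]:
                if v in support and v not in seen:
                    seen.add(v)
                    queue.append(v)
    return components

# Path graph on 6 nodes: 0-1-2-3-4-5.
adjacency = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4]}
print(num_components([0, 1, 2], adjacency))  # 1 cluster: a "graph-sparse" support
print(num_components([0, 2, 4], adjacency))  # 3 clusters: a scattered support
```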
related:
Slides: Nearly Linear-Time Algorithms for Structured Sparsity, Piotr Indyk
but also
Approximation Algorithms for Model-Based Compressive Sensing by Chinmay Hegde, Piotr Indyk, Ludwig Schmidt
Compressive Sensing (CS) stipulates that a sparse signal can be recovered from a small number of linear measurements, and that this recovery can be performed efficiently in polynomial time. The framework of model-based compressive sensing (model-CS) leverages additional structure in the signal and prescribes new recovery schemes that can reduce the number of measurements even further. However, model-CS requires an algorithm that solves the model-projection problem: given a query signal, produce the signal in the model that is also closest to the query signal. Often, this optimization can be computationally very expensive. Moreover, an approximation algorithm is not sufficient for this optimization task. As a result, the model-projection problem poses a fundamental obstacle for extending model-CS to many interesting models.
In this paper, we introduce a new framework that we call approximation-tolerant model-based compressive sensing. This framework includes a range of algorithms for sparse recovery that require only approximate solutions for the model-projection problem. In essence, our work removes the aforementioned obstacle to model-based compressive sensing, thereby extending the applicability of model-CS to a much wider class of models. We instantiate this new framework for the Constrained Earth Mover Distance (CEMD) model, which is particularly useful for signal ensembles where the positions of the nonzero coefficients do not change significantly as a function of spatial (or temporal) location. We develop novel approximation algorithms for both the maximization and the minimization versions of the model-projection problem via graph optimization techniques. Leveraging these algorithms into our framework results in a nearly sample-optimal sparse recovery scheme for the CEMD model.
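To make the "approximation-tolerant" idea a little more concrete, here is a minimal sketch (mine, not the authors' code) of an IHT-style recovery loop that only asks for head and tail model projections. For plain k-sparsity both oracles reduce to top-k thresholding, which is what the stand-ins below do; the papers above replace them with approximate projections onto tree- or graph-structured models.

```python
# Minimal sketch of an approximation-tolerant, IHT-style recovery loop.
# The head/tail oracles here are trivial top-k stand-ins; in the model-CS
# framework they become (approximate) projections onto structured models.
import numpy as np

def topk_projection(x, k):
    """Exact projection onto plain k-sparse vectors (the simplest model)."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def approx_model_iht(A, y, k, iters=50, head=topk_projection, tail=topk_projection):
    """Iterative hard thresholding that only needs head/tail model projections."""
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (y - A @ x)             # gradient step direction
        x = tail(x + head(grad, 2 * k), k)   # approximate projections suffice here
    return x

# Toy usage: random Gaussian measurements of a k-sparse signal.
rng = np.random.default_rng(0)
n, m, k = 200, 80, 5
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_hat = approx_model_iht(A, A @ x_true, k)
print(np.linalg.norm(x_hat - x_true))  # should be small
```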