The dynamics of message passing on dense graphs, with applications to compressed sensing, by Mohsen Bayati and Andrea Montanari. The abstract reads:
'Approximate message passing' algorithms proved to be extremely effective in reconstructing sparse signals from a small number of incoherent linear measurements. Extensive numerical experiments further showed that their dynamics is accurately tracked by a simple one-dimensional iteration termed 'state evolution'. In this paper we provide the first rigorous foundation to state evolution. We prove that indeed it holds asymptotically in the large system limit for sensing matrices with iid Gaussian entries. While our focus is on message passing algorithms for compressed sensing, the analysis extends beyond this setting, to a general class of algorithms on dense graphs. In this context, state evolution plays the role that density evolution has for sparse graphs.
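To give a concrete feel for what state evolution tracks, here is a minimal Python sketch (not the authors' code) of an AMP iteration with a soft-thresholding denoiser, together with the one-dimensional state-evolution recursion evaluated by Monte Carlo. The threshold rule theta_t = alpha * tau_t and the `sample_x` prior callback are illustrative assumptions on my part, not choices made in the paper:

```python
import numpy as np

def soft(u, theta):
    """Soft-thresholding denoiser eta(u; theta)."""
    return np.sign(u) * np.maximum(np.abs(u) - theta, 0.0)

def amp(y, A, n_iter=30, alpha=1.1):
    """AMP with soft thresholding for y = A x + noise, with A an m-by-n
    matrix with iid N(0, 1/m) entries.  The threshold theta_t = alpha * tau_t,
    with tau_t estimated from the residual, is an illustrative choice."""
    m, n = A.shape
    delta = m / n
    x, z = np.zeros(n), y.astype(float).copy()
    for _ in range(n_iter):
        tau = np.linalg.norm(z) / np.sqrt(m)      # effective noise level
        x_new = soft(x + A.T @ z, alpha * tau)
        # Onsager correction: (1/delta) * z * <eta'(...)>, where <eta'>
        # is the fraction of coordinates above threshold
        z = y - A @ x_new + (z / delta) * np.mean(x_new != 0)
        x = x_new
    return x

def state_evolution(delta, sigma2, sample_x, n_iter=30, alpha=1.1,
                    n_mc=200_000, seed=0):
    """One-dimensional recursion tracked by AMP:
        tau_{t+1}^2 = sigma^2 + (1/delta) E[(eta(X + tau_t Z; alpha tau_t) - X)^2]
    with X drawn from the signal prior (via the hypothetical `sample_x`
    callback) and Z ~ N(0,1); the expectation is evaluated by Monte Carlo."""
    rng = np.random.default_rng(seed)
    X = sample_x(n_mc, rng)
    tau2 = sigma2 + np.mean(X**2) / delta         # matches the x^0 = 0 start
    taus = []
    for _ in range(n_iter):
        Z = rng.standard_normal(n_mc)
        mse = np.mean((soft(X + np.sqrt(tau2) * Z, alpha * np.sqrt(tau2)) - X)**2)
        tau2 = sigma2 + mse / delta
        taus.append(np.sqrt(tau2))
    return taus
```

Running `amp` on a random instance and comparing the per-iteration residual level ||z||/sqrt(m) against the `state_evolution` trajectory is exactly the kind of agreement that the paper now establishes rigorously in the large system limit.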
Also on the same subject, the 15th Annual LIDS Student Conference at MIT will feature Message-passing For Compressed Sensing by Venkat Chandar. The abstract of the talk is:
We propose a message-passing algorithm to recover a non-negative vector x from linear measurements y = Ax, where A is an m-by-n matrix. The algorithm is very similar to the belief propagation algorithms utilized in the context of decoding low-density parity-check (LDPC) codes. We establish that when A corresponds to the adjacency matrix of a bipartite graph with sufficient expansion, the algorithm produces a reconstruction r(x) of x satisfying , where x(k) is the best k-sparse approximation of x. The algorithm performs  computation in total, and the number of measurements required is . In the special case when x is k-sparse, the algorithm recovers x exactly in time . Conceptually, this work provides a rigorous connection between the theory of message-passing algorithms and compressed sensing that has been alluded to in many of the recent prior works.
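The abstract does not spell out the message-update rules, so the following is only an illustrative sketch in the same spirit, not Chandar's algorithm: a simple peeling decoder that exploits non-negativity when A is the 0/1 adjacency matrix of a sparse bipartite graph. Two local rules drive it: a zero measurement forces all of its unresolved neighbors to zero (since x >= 0), and a check with a single unresolved neighbor determines that neighbor.

```python
import numpy as np

def peel(A, y, max_rounds=100):
    """Peeling decoder for y = A @ x with x >= 0 elementwise and A a 0/1
    adjacency matrix of a bipartite graph.  Entries that cannot be
    resolved are left as np.nan."""
    m, n = A.shape
    x = np.full(n, np.nan)                 # nan marks unresolved entries
    resid = y.astype(float).copy()         # measurements minus resolved mass
    for _ in range(max_rounds):
        progress = False
        for j in range(m):
            unres = np.flatnonzero((A[j] == 1) & np.isnan(x))
            if unres.size == 0:
                continue
            if np.isclose(resid[j], 0.0):
                # x >= 0: a zero residual forces every unresolved
                # neighbor of check j to zero
                x[unres] = 0.0
                progress = True
            elif unres.size == 1:
                # a single unknown neighbor must carry the whole residual
                i = unres[0]
                v = resid[j]
                x[i] = v
                resid -= v * A[:, i]       # peel i off every check it touches
                progress = True
        if not progress:
            break
    return x
```

When the underlying graph expands well, these two rules keep firing until every coordinate is resolved; on a poor graph the loop can stall, which is why expansion enters the guarantees.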
Finally, the NIPS'09 videos are out. The ones that I mentioned before can be found below:
- Multi-Label Prediction via Compressed Sensing by Daniel Hsu, UC
- Making Very Large-Scale Linear Algebraic Computations Possible Via Randomization by Gunnar Martinsson, University of Colorado
- Sparse Methods for Machine Learning: Theory and Algorithms by Francis R. Bach, INRIA
- Machine Learning for Brain-Computer Interfaces by Jeremy Hill, Max Planck Institute for Biological Cybernetics
- An Efficient P300-based Brain-Computer Interface with Minimal Calibration Time by Fabien Lotte, Institute for Infocomm Research
- Towards Brain Computer Interfacing: Algorithms for on-line Differentiation of Neuroelectric Activities by Klaus-Robert Müller, Fraunhofer Institute for Computer Architecture and Software Technology
Image Credit: NASA/JPL/Space Science Institute, Titan as seen by Cassini on January 20th, 2010.