
Wednesday, May 11, 2011

CS: Two talks.

Here are two talks. I am mentioning them for several reasons, including getting a picture of what is coming down the preprint pipeline.

At Stanford, May 11 (today)
Precise asymptotics in high-dimensional sparse regression and compressed sensing.
Andrei Broder
Yahoo! Research

I will describe recent work giving precise asymptotic results on mean squared error and other characteristics of a range of estimators in a range of high-dimensional problems from sparse regression and compressed sensing; these include results for LASSO, group LASSO, and nonconvex sparsity penalty methods. A key application of such precise formulas is deriving precise optimality results that were not previously known and, to our knowledge, are not available by other methods.

and
Precise asymptotics in high-dimensional sparse regression and compressed sensing.
David Donoho

I will describe recent work giving precise asymptotic results on mean squared error and other characteristics of a range of estimators in a range of high-dimensional problems from sparse regression and compressed sensing; these include results for LASSO, group LASSO, and nonconvex sparsity penalty methods. A key application of such precise formulas is deriving precise optimality results that were not previously known and, to our knowledge, are not available by other methods.

You may notice that the two speakers are listed for the same talk, with the same abstract, at the same time and at the same location!
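Since the abstract centers on the LASSO, here is a minimal numpy sketch of the kind of estimator being analyzed: LASSO for noisy sparse regression, solved by iterative soft-thresholding (ISTA), printing the per-coordinate mean squared error at the end. This is my own illustration, not code from the talk; the dimensions, noise level, and regularization choice are all assumptions.

import numpy as np

# Hedged sketch (not from the talk): LASSO for sparse regression,
# solved by iterative soft-thresholding (ISTA). The parameter
# choices (n, p, k, sigma, lam) are illustrative assumptions.
rng = np.random.default_rng(0)
n, p, k, sigma = 100, 400, 10, 0.1            # measurements, dimension, sparsity, noise

A = rng.standard_normal((n, p)) / np.sqrt(n)  # Gaussian sensing matrix
x0 = np.zeros(p)
x0[rng.choice(p, k, replace=False)] = rng.standard_normal(k)
y = A @ x0 + sigma * rng.standard_normal(n)

lam = 2 * sigma * np.sqrt(np.log(p) / n)      # a standard-order regularization level
step = 1.0 / np.linalg.norm(A, 2) ** 2        # 1 / Lipschitz constant of the gradient

x = np.zeros(p)
for _ in range(2000):                         # ISTA: gradient step + soft threshold
    z = x - step * A.T @ (A @ x - y)
    x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)

print("per-coordinate MSE:", np.mean((x - x0) ** 2))

The precise formulas the abstract refers to predict exactly this kind of mean squared error in the high-dimensional limit; the sketch above only measures it empirically on one random instance.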

At Berkeley, May 9th
A New Characterization of Compressed Sensing Limits
Galen Reeves

Abstract:

The fact that sparse vectors can be recovered from a small number of linear measurements has important and exciting implications for engineering and statistics. However, despite the vast amount of recent work in the field of compressed sensing, a sharp characterization of what can and cannot be recovered in the presence of noise remains an open problem in general. In this talk, we provide such a characterization for the task of sparsity pattern estimation (also known as support recovery). Using tools from information theory, we find a sharp separation into two problem regimes: one in which the problem is fundamentally noise-limited, and a more interesting one in which the problem is limited by the behavior of the sparse components themselves. This analysis allows us to identify settings where existing computationally efficient algorithms, such as the LASSO, are optimal, as well as other settings where these algorithms are highly suboptimal. Furthermore, we show how additional structure can make a key difference, analogous to the role of diversity in wireless communications.

On the engineering side, our analysis answers key engineering questions related to compressed sensing: Is it better to increase SNR or take more measurements? Is a given algorithm good enough? What accuracy can be attained? On the mathematical side, our results validate certain phase transitions predicted by the powerful, but heuristic, replica method from statistical physics.
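To make the "more SNR versus more measurements" question concrete, here is a small experiment of my own (not the speaker's analysis): it estimates the empirical rate of exact sparsity pattern recovery by the LASSO followed by a simple threshold, under different combinations of measurement count and noise level. The estimator, threshold, and parameter grid are illustrative assumptions.

import numpy as np

# Hedged sketch: empirical sparsity-pattern (support) recovery with the
# LASSO, comparing "more measurements" against "more SNR". All settings
# below are assumptions for illustration only.
rng = np.random.default_rng(1)
p, k = 200, 8                                  # dimension and sparsity level

def lasso_ista(A, y, lam, iters=1500):
    # LASSO via iterative soft-thresholding (ISTA)
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * A.T @ (A @ x - y)
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)
    return x

def support_recovery_rate(n, sigma, trials=20):
    hits = 0
    for _ in range(trials):
        A = rng.standard_normal((n, p)) / np.sqrt(n)
        x0 = np.zeros(p)
        supp = rng.choice(p, k, replace=False)
        x0[supp] = 1.0 + rng.random(k)          # entries bounded away from zero
        y = A @ x0 + sigma * rng.standard_normal(n)
        xh = lasso_ista(A, y, 2 * sigma * np.sqrt(np.log(p) / n))
        est = set(np.flatnonzero(np.abs(xh) > 0.5))  # drop small coefficients
        hits += est == set(supp)
    return hits / trials

# Doubling measurements at fixed noise vs. reducing noise at fixed measurements:
print("n=60,  sigma=0.5:", support_recovery_rate(60, 0.5))
print("n=120, sigma=0.5:", support_recovery_rate(120, 0.5))
print("n=60,  sigma=0.1:", support_recovery_rate(60, 0.1))

Which of the two knobs helps more depends on the regime, which is exactly the kind of distinction the sharp characterization in the talk is meant to settle.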
