[Please see the updates at the end of this entry.]
If one counts using an algorithm to find the sparsest solution of an underdetermined linear system of equations, then Yoram Bresler is right [1] when he points out that Gaspard de Prony might have been the first to put a stake in what we now call compressive sensing. What is so exceptional here is that the algorithm was found in 1795. If you consider that the theoretical justifications for some cases of compressive sensing were published in 2004-2006 (Donoho, Tao, Candès, Romberg), then this is probably one of the largest time gaps between the use of an algorithm within an island of knowledge [6-8] and a theoretical justification for it. This claim is to be taken in a wide sense: is there an algorithm, with theoretical justification, that connects the sparsity of the solution of an underdetermined system of linear equations to the size of that system? In effect, Larry Wasserman thinks it might win the biggest-gap award. What's interesting is that, historically, it might even precede the Least Squares solution technique [7], depending on whether you are a Gauss (1795) or a Legendre (1805) fan.
Ben Recht provides a nice exposition of Prony's method with a compressive sensing mind [4].
Let us note what Ben mentions at the very end of that lecture:
The main drawback of Prony's method is that it is not numerically stable for large D. Moreover, it is not stable if x is only nearly sparse. Also, the algorithm is very specific to the structure of the Fourier transform and does not generalize to other linear encodings.
Indeed, we are now a little bit beyond m = 2s with the Donoho-Tanner phase transition [6] and we can deal with noise thanks to the 2004 results [2,3].
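In the noise-free case, the classical recipe behind all this is short: from the first 2s Fourier coefficients of an s-sparse signal, solve a small Toeplitz system for an annihilating filter, read the support off the filter's roots, then recover the amplitudes by least squares. Here is a minimal sketch in Python/NumPy of that idea (variable names and the test setup are mine, not from any of the cited papers):

```python
import numpy as np

rng = np.random.default_rng(0)

# An s-sparse signal x of length N; we observe only its first m = 2s DFT coefficients.
N, s = 64, 3
support = rng.choice(N, size=s, replace=False)
x = np.zeros(N)
x[support] = rng.normal(size=s)

# Measurements: y[k] = sum_t x[t] * exp(-2j*pi*k*t/N) for k = 0, ..., 2s-1.
k = np.arange(2 * s)
F = np.exp(-2j * np.pi * np.outer(k, np.arange(N)) / N)
y = F @ x

# Step 1: annihilating filter. The polynomial p(z) = z^s + h[0] z^{s-1} + ... + h[s-1]
# whose roots are the active modes z_t = exp(-2j*pi*t/N) satisfies
# y[k] + h[0] y[k-1] + ... + h[s-1] y[k-s] = 0 for k >= s.
# That gives an s x s Toeplitz system in the unknown coefficients h.
A = np.array([[y[k - 1 - j] for j in range(s)] for k in range(s, 2 * s)])
h = np.linalg.solve(A, -y[s:2 * s])

# Step 2: the roots of p recover the support from their angles.
roots = np.roots(np.concatenate(([1.0], h)))
t_hat = np.sort((np.round(-np.angle(roots) * N / (2 * np.pi)) % N).astype(int))

# Step 3: least squares on the recovered support gives the amplitudes.
c = np.linalg.lstsq(F[:, t_hat], y, rcond=None)[0]
x_hat = np.zeros(N)
x_hat[t_hat] = c.real
```

With exact data this recovers the s-sparse signal from only 2s measurements; the instabilities Ben mentions show up as soon as the Toeplitz system is built from noisy or nearly-sparse data, since nothing here regularizes the root-finding step.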
But just like a little tune that won't stop playing in the back of one's mind, one wonders: what if an algorithm had been found in 1795 that was numerically stable? Would it have changed the world earlier, as Least Squares did in the 1800s and the FFT did in the 1960s? [7]
Update 1:
Peyman Milanfar commented:
While we are minding the gap, I will point out my dissertation work in which I proved (using Prony!) that one could reconstruct a plane polygon from a finite number of its Radon transform projections. At the time I referred to such objects as being "finitely parameterized", which in today's lingo would be "sparsely represented". Not long after, the notion of signals with finite rate of innovation came out of EPFL, followed later by compressed sensing etc., and the rest is history. So I am claiming my rightful place in history as having slightly shortened said gap. :-)

P. Milanfar, G. C. Verghese, W. C. Karl, A. S. Willsky, "Reconstructing Polygons from Moments with Connections to Array Processing", IEEE Transactions on Signal Processing, vol. 43, no. 2, pp. 432-443, February 1995. http://users.soe.ucsc.edu/~milanfar/publications/journal/moment_SP.pdf
Update 2:
"I would like to add that the first appearances of Prony's method in compressed sensing I am aware of are talks given by Daniel Potts (TU Chemnitz) and Stefan Kunis (U Osnabrück), probably in 2009. There they explicitly compared the thresholds and recovery guarantees of Prony and Basis Pursuit. However, I could not find slides or preprints from those days (although they have more recent papers on that topic). ... They have a recent preprint, "A sparse Prony FFT", here: http://www.tu-chemnitz.de/mathematik/preprint/2013/PREPRINT.php?year=2013&num=03"
We describe the application of Prony-like reconstruction methods to the problem of the sparse Fast Fourier transform (sFFT) [5]. In particular, we adapt both important parts of the sFFT, quasi-random sampling and filtering techniques, to Prony-like methods.
I note from the paper that the sparse Prony FFT might fare better than some other sFFT implementations, as it scales only as K^(3/2).
Of related interest, there is also:
- Nonlinear approximation by sums of exponentials and translates, by Thomas Peter, Daniel Potts, and Manfred Tasche
- Parameter estimation for nonincreasing exponential sums by Prony-like methods, by Daniel Potts and Manfred Tasche
[1] Yoram Bresler, The Invention of Compressive Sensing and Recent Results: From Spectrum-Blind Sampling and Image Compression on the Fly to New Solutions with Realistic Performance Guarantees
[2] E. J. Candès, J. Romberg, and T. Tao, Robust uncertainty principles: exact signal reconstruction from highly incomplete frequency information, IEEE Trans. Inform. Theory, vol. 52, no. 2, pp. 489-509, February 2006
[3] D. L. Donoho, Compressed sensing, IEEE Trans. Inform. Theory, vol. 52, no. 4, pp. 1289-1306, April 2006
[4] CS838 Topics In Optimization: Convex Geometry in High-Dimensional Data Analysis, Lecturer: Benjamin Recht, February 4, 2010 Scribes: Matt Malloy, Zhiting Xu
[5] A sparse Prony FFT, TU Chemnitz preprint, 2013, http://www.tu-chemnitz.de/mathematik/preprint/2013/PREPRINT.php?year=2013&num=03
[7] Who invented the great numerical algorithms?, Nick Trefethen