Saturday, October 06, 2012

Sunday Morning Insight: Ditching L_1

Thomas Strohmer wrote a very nice overview of the field in Measure what should be measured: Progress and Challenges in Compressive Sensing. Go read it; I'll wait. I agree with everything Thomas says and more, especially when he encourages readers to come and read Nuit Blanche. More seriously, in light of what we have seen, I am increasingly wondering whether it is time to ditch the l_1 / convex programming argument as the way to recover sparse vectors. There will come a time when this sentence looks quaint:
At the mathematical heart of compressive sensing lies the discovery that it is possible to reconstruct a sparse signal exactly from an underdetermined linear system of equations and that this can be done in a computationally efficient manner via convex programming
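For readers who want that baseline in concrete form: the convex-programming route the quote describes, basis pursuit, can be posed as a small linear program. The sketch below is purely illustrative, and the problem sizes, the random Gaussian matrix, and the use of SciPy's `linprog` are my own choices, not anything taken from Thomas' paper. It recasts min ||x||_1 subject to Ax = y with auxiliary variables u ≥ |x|:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 50, 200, 5                      # 50 measurements of a 5-sparse, length-200 signal
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.choice([-1.0, 1.0], size=k)
y = A @ x_true

# Basis pursuit:  min ||x||_1  s.t.  A x = y,  rewritten as an LP over (x, u):
#   min sum(u)    s.t.  A x = y,  x - u <= 0,  -x - u <= 0,  u >= 0
c = np.concatenate([np.zeros(n), np.ones(n)])
A_eq = np.hstack([A, np.zeros((m, n))])
I = np.eye(n)
A_ub = np.block([[I, -I], [-I, -I]])
b_ub = np.zeros(2 * n)
bounds = [(None, None)] * n + [(0, None)] * n
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=bounds, method="highs")
x_hat = res.x[:n]                         # the recovered sparse vector
```

In this easy regime (k/m = 0.1, m/n = 0.25, comfortably below the Donoho-Tanner transition), the LP typically returns x_true exactly, up to solver tolerance.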
Consider the solvers and approaches listed below, which go beyond the Donoho-Tanner phase transition (the spinodal line): none of them makes its case on the basis of l_1 or convex programming, yet they all cross the spinodal line. I also agree with Thomas that "There are also intriguing connections to statistical physics"; in particular, I would love to see someone make a connection between the measurement matrices of Krzakala et al. and the structured-sparsity arguments of Montanari et al.
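To make the contrast concrete, here is a minimal sketch of one non-convex scheme, iterative hard thresholding, whose recovery step never invokes an l_1 penalty or convexity. This is my own toy illustration on an arbitrary Gaussian setup with the sparsity level k assumed known, not one of the specific solvers alluded to above:

```python
import numpy as np

def iht(A, y, k, iters=1000):
    """Iterative hard thresholding: gradient step on 0.5*||Ax - y||^2,
    then keep only the k largest-magnitude entries. No l_1 term anywhere."""
    step = 0.9 / np.linalg.norm(A, 2) ** 2    # conservative fixed step size
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        x = x + step * A.T @ (y - A @ x)      # move toward the data
        keep = np.argpartition(np.abs(x), -k)[-k:]
        pruned = np.zeros_like(x)
        pruned[keep] = x[keep]                # hard-threshold to sparsity k
        x = pruned
    return x

rng = np.random.default_rng(0)
m, n, k = 50, 200, 5                          # same easy regime as before
A = rng.standard_normal((m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.choice([-1.0, 1.0], size=k)
y = A @ x_true
x_hat = iht(A, y, k)
```

Once the correct support is locked in, the iteration reduces to plain gradient descent on a small well-conditioned least-squares problem, so convergence is geometric; the point is simply that sparsity is enforced by a projection, not by a convex surrogate.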

I also note that Thomas makes arguments similar to those developed in Predicting the Future: Randomness and Parsimony. But while I like the following idea very much, I am somewhat skeptical about this tidbit:

...To revolutionize technology we will need to develop hardware and algorithms via an integrated, transdisciplinary approach. Hence, in the future when we design sensors, processors, and other devices, we may no longer speak only about hardware and software, where each of these two components is developed essentially separately. Instead, we may have to add a third category, which we could call hybridware or mathematical sensing, where the physical device and the mathematical algorithm are completely intertwined and codesigned right from the beginning.

Why? Because in Predicting the Future: The Steamrollers, we noticed that any time a technology became too specialized, or too dependent on something that did not follow economies of scale, it was eventually wiped out by Moore's law. So hybridware might be a product on the edge, but it is likely to remain just that, even in the future.

In order to beat the Steamrollers, you really need to be far out there. Following Mario Andretti's motto, "If everything seems under control, you're just not going fast enough!", we ought to be serious about sensors that are really far out :)

To finish, here is a video of a lecture by Yoshiyuki Kabashima on A Statistical Mechanical Approach to Compressed Sensing. The end of the video has two examples, one of which uses a coarse structured-sparsity argument. Enjoy!
