As I was re-reading an old paper, Maximum Entropy and the Nearly Black Object by Donoho, Johnstone, Hoch and Stern [1], I could not escape contemplating the Game of Thrones-esque situation seen in signal reconstruction over the past twenty years or so.
For those who have never heard of it, Game of Thrones is a TV series on HBO that features the power play going on in several kingdoms. Each episode offers epic battles among and between kingdoms; the saga also features throat-cutting murderers, cruel dirty lying bastards and sometimes it even features bad people. All in all, greed and power grabs shape the landscape of the series, which does a good job of providing a genuine point of view for each actor in this fresco. In turn, I am always surprised at how fast they jump on the bandwagon of the powerful with particular ruthlessness.
1. Islands of knowledge.
In season 1, there is this particular situation where one tribe doesn't overtake the rest of the larger kingdom because there is a sea between them and none of its warriors are seafaring. Each kingdom can live in some autarkic manner, as the territories seem like large enough islands. In inverse problem reconstruction, there is a similar situation where communities thrive using their own techniques. From the outside looking in, the techniques seem extremely ad hoc and not well understood. However, they work very well within each particular universe. The 1990 paper by Donoho et al. [1] reveals a collision between the folks on the l_1 island/house, those on the Maximum Entropy (ME) island/house and those in the Least Squares kingdom.
The "animosity" between the tribes shows right through these cryptic words in [1]:
"...This paper is being written against a background of some controversy [25,26,23]. "
which features as a reference none other than a column in the Times.
Even though I have heard stories about compressive sensing being born before 2004 [2,3], let's see what exactly happened:
In 1990, the authors of [1] are simply hinting that something is working in the sampling process because the original signal is sparse. In 1996, sparse coding revealed a wavelet-like structure in the visual cortex [7], introducing the world to the reality that most signals are sparse in some basis. Only in 2004 did [4,5] provide the justification for the sampling strategies on how to successfully grab these sparse signals. While some sensors were already sampling in these "incoherent" bases, [4,5] juxtaposed and justified the sparsity-seeking reconstruction and the attendant ideal sampling techniques. It was not really a big bang as much as a dredging operation: the waters between the Least Squares kingdom and the l_1 island were suddenly removed by these two papers [4,5], which provided a solid theoretical understanding as to why the folks on the l_1 island were sometimes justified in doing what they did. At that point, once the waters were gone, the barbarians could launch a full-scale attack. For the sake of the saga, it won't matter which tribe really wins, since the two territories will eventually become part of a larger kingdom.
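To make the contrast between the two kingdoms concrete, here is a minimal numerical sketch in Python/NumPy. The problem sizes and the simple ISTA solver are my own illustrative choices, not the algorithms of [4,5]: a sparse signal is recovered from far fewer random "incoherent" measurements than unknowns by l_1-style soft-thresholding, while plain minimum-norm least squares is not.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k = 200, 80, 8          # signal length, measurements (m << n), sparsity

# k-sparse signal and a random Gaussian ("incoherent") sensing matrix
x = np.zeros(n)
x[rng.choice(n, k, replace=False)] = rng.choice([-1.0, 1.0], k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x

# The Least Squares kingdom: minimum-norm solution of the underdetermined system
xls = A.T @ np.linalg.solve(A @ A.T, y)   # dense, far from the true x

# The l_1 island: ISTA (iterative soft-thresholding) for
# min_x ||A x - y||^2 / 2 + lam * ||x||_1
lam = 0.01
L = np.linalg.norm(A, 2) ** 2             # Lipschitz constant of the gradient
xhat = np.zeros(n)
for _ in range(2000):
    g = xhat + (A.T @ (y - A @ xhat)) / L
    xhat = np.sign(g) * np.maximum(np.abs(g) - lam / L, 0.0)

err_l1 = np.linalg.norm(xhat - x) / np.linalg.norm(x)   # small
err_ls = np.linalg.norm(xls - x) / np.linalg.norm(x)    # large
print(err_l1, err_ls)
```

The least-squares answer smears energy over all coordinates; the l_1 answer concentrates it back on the sparse support, which is the phenomenon the 1990 paper was hinting at.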
After the dredging operations had occurred and after the drama of particular human affairs had withered came the use of the land that used to be underwater.
As in any human endeavor, after the discovery of new territories there is a new period of introspection and refinement: in our case, the tools got better. In particular, we witnessed the following:
- The emergence of ever faster sparsity-seeking reconstruction techniques
- The emergence of different favorable sampling processes. In particular, we saw the emergence of sparse sampling operators and a vivid connection to group testing techniques.
- The discovery of optimal measurement bounds
- The learning of dictionaries from data, as opposed to having yet another family in the large families of wavelets
- The learning of different penalties from data. We went from shedding ME for l_1 to considering l_0 directly, devising TV for images or new (mixed) norms, and finally learning new ones directly from data.
Much like Game of Thrones, the territory of what is possible is mapped by very sharp phase transitions that divide the knowns and the unknowns:
Indeed, what was found during these past nine years was that the combination of certain sampling and reconstruction processes led to the very important discovery of sharp phase transitions which, much like the TV saga, have moved over time.
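One can sense that sharpness numerically with a toy Monte Carlo experiment. This sketch is my own illustrative setup (using greedy orthogonal matching pursuit rather than an l_1 solver, purely for speed): for a fixed sparsity, exact recovery flips from nearly always failing to nearly always succeeding as the number of measurements m crosses a threshold.

```python
import numpy as np

def omp(A, y, k):
    """Greedy orthogonal matching pursuit: select k columns of A to explain y."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    xhat = np.zeros(A.shape[1])
    xhat[support] = coef
    return xhat

rng = np.random.default_rng(1)
n, k, trials = 50, 5, 25
success = {}
for m in (8, 45):                        # too few vs plenty of measurements
    ok = 0
    for _ in range(trials):
        x = np.zeros(n)
        x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
        A = rng.standard_normal((m, n))
        ok += np.linalg.norm(omp(A, A @ x, k) - x) < 1e-6
    success[m] = ok / trials
print(success)
```

Sweeping m finely (instead of two values) and plotting the success rate traces out the empirical phase-transition curve; the theoretical account of where that curve sits is exactly what the papers above supplied.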
Soon enough, we are beginning to see the emergence of similar tools for higher-dimensional signals, such as advanced matrix decompositions with sparsity-like features for recommender systems and machine learning, or tools like those developed in Randomized Numerical Linear Algebra (RandNLA).
Thanks to these developments, we are beginning to see some good theoretical reasons as to why some techniques seem to work. Since 2000, people have been using NMF (a different island altogether), but only recently have we seen why it might work. The issue eventually is trying to figure out how to design sensors or data-gathering techniques that embody some of these theoretical constraints. But in the back of everybody's mind there is this lingering question: "What island is next?"
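For readers who have never visited the NMF island: the classic heuristic there is the Lee-Seung multiplicative update, sketched below in NumPy on synthetic data. This is only an illustration of what NMF practitioners have been running since around 2000, not the recent theory alluded to above.

```python
import numpy as np

rng = np.random.default_rng(2)
# A nonnegative matrix with an exact rank-3 nonnegative factorization
W0, H0 = rng.random((30, 3)), rng.random((3, 20))
V = W0 @ H0

# Lee-Seung multiplicative updates for V ~ W H under the Frobenius loss;
# the updates preserve nonnegativity because every factor is nonnegative
r, eps = 3, 1e-12
W, H = rng.random((30, r)), rng.random((r, 20))
for _ in range(500):
    H *= (W.T @ V) / (W.T @ W @ H + eps)
    W *= (V @ H.T) / (W @ H @ H.T + eps)

err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(err)   # small on exactly factorizable data
```

The updates reliably drive the residual down in practice; what was missing until recently was an explanation of when such a nonnegative factorization is well posed and recoverable at all.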
[1] D. L. Donoho, I. M. Johnstone, J. C. Hoch and A. S. Stern, Maximum Entropy and the Nearly Black Object.
[2] Yoram Bresler, The Invention of Compressive Sensing and Recent Results: From Spectrum-Blind Sampling and Image Compression on the Fly to New Solutions with Realistic Performance Guarantees; see also The Invention of Compressive Sensing and Recent Results.
[3] Vivek Goyal, Alyson Fletcher, Sundeep Rangan, The Optimistic Bayesian: Replica Method Analysis of Compressed Sensing.
[4] E. J. Candès, J. Romberg and T. Tao, Robust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information. IEEE Trans. Inform. Theory, 52:489-509.
[5] D. L. Donoho, Compressed Sensing. IEEE Transactions on Information Theory, April 2006.
[6] A short summary.
[7] B. A. Olshausen and D. J. Field, Emergence of Simple-Cell Receptive Field Properties by Learning a Sparse Code for Natural Images. Nature, 381:607-609, 1996.