Sunday, April 14, 2013

Sunday Morning Insight: Game of Thrones and the History of Compressive Sensing

As I was re-reading an old paper, Maximum Entropy and the Nearly Black Object by Donoho, Johnstone, Hoch and Stern [1], I could not escape contemplating the Game of Thrones-like situation seen in signal reconstruction over the past twenty years or so.

For those who have never heard of it, Game of Thrones is a TV series on HBO that features the power plays that go on in several kingdoms. Each episode offers epic battles among and between kingdoms; the saga also features throat-cutting murderers, cruel dirty lying bastards, and sometimes it even features bad people. All in all, greed and power grabs shape the landscape of the series, which does a good job of providing a genuine point of view for each actor in this fresco. In turn, I am always surprised at how fast characters jump on the bandwagon of the powerful with particular ruthlessness.

1. Islands of knowledge.

In season 1, there is this particular situation where one tribe doesn't overtake the rest of the larger kingdom because there is a sea between them and none of its warriors are seafaring. Each kingdom can live in some autarkic manner, as the territories seem like large enough islands. In inverse problem reconstruction, there is a similar situation where communities thrive using their own techniques. From the outside looking in, the techniques seem extremely ad hoc and not well understood. However, they work very well for each particular universe. The 1990 paper by Donoho et al. [1] reveals a collision between the folks on the l_1 island/house, those on the Maximum Entropy (ME) island/house and those in the Least Squares kingdom.

The "animosity" between the tribes show right through these cryptic words in [1]

"...This paper is being written against a background of some controversy [25,26,23]. "

which features as a reference none other than a column in the Times.

Even though I have heard stories about compressive sensing being born before 2004 [2,3], let's see what exactly happened:

In 1990, the authors of [1] were simply hinting that something in the sampling process was working because the original signal was sparse. In 1996, sparse coding revealed a wavelet-like structure in the visual cortex [7], introducing the world to the reality that most signals are sparse in some basis. Only in 2004 did [4,5] provide the justification for the sampling strategies needed to successfully grab these sparse signals. While some sensors were already sampling in these "incoherent" bases, [4,5] juxtaposed and justified the sparsity-seeking reconstruction and the attendant ideal sampling techniques. It was not really a big bang as much as a dredging operation: the waters between the Least Squares kingdom and the l_1 island were suddenly removed by these two papers [4,5], which provided a solid theoretical understanding as to why the folks on the l_1 island were sometimes justified in doing what they did. At that point, once the waters were gone, the barbarians could launch a full-scale attack. For the sake of the saga, it won't matter which tribe really wins, since the two territories will eventually become part of a larger kingdom.
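To make the collision concrete: the sparsity-seeking reconstruction at the heart of this story is basis pursuit, i.e. minimizing the l_1 norm subject to the linear measurement constraints. Here is a minimal sketch (my own illustration, not code from [4,5], with problem sizes chosen arbitrarily) that recovers a sparse signal from far fewer random Gaussian measurements than unknowns, by casting the l_1 problem as a linear program solved with SciPy's linprog:

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """Solve min ||x||_1 s.t. A x = b as a linear program.

    Split x = u - v with u, v >= 0, then minimize sum(u) + sum(v).
    """
    n = A.shape[1]
    res = linprog(np.ones(2 * n), A_eq=np.hstack([A, -A]), b_eq=b,
                  bounds=(0, None), method="highs")
    u, v = res.x[:n], res.x[n:]
    return u - v

rng = np.random.default_rng(0)
n, m, k = 60, 30, 3                            # ambient dim, measurements, sparsity
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)   # "incoherent" Gaussian sensing matrix
b = A @ x_true                                 # only m < n linear measurements
x_hat = basis_pursuit(A, b)
print(np.max(np.abs(x_hat - x_true)))          # recovery error
```

Least squares (the other kingdom) would return the minimum-l_2-norm solution here, which is dense and nothing like x_true; the l_1 objective is what makes the sparse answer pop out.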

After the dredging operations had occurred, and after the drama of particular human affairs had withered, came the use of the land that used to be underwater.

2. And then what happened ? 

As in any human endeavor, after the discovery of new territories there is a new period of introspection and refinement: in our case, the tools got better. In particular, we witnessed the following:
3. The Walls

Much like in Game of Thrones, the territory of what is possible is mapped by very sharp phase transitions that divide the knowns from the unknowns:

Indeed, what was found during these past nine years was that a combination of certain sampling and reconstruction processes led to the very important discovery of sharp phase transitions which, much like the TV saga, have moved over time [5].
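These phase transitions are easy to glimpse empirically: at a fixed sparsity, l_1 recovery fails almost always below a certain number of measurements and succeeds almost always above it. A small sketch (illustrative problem sizes of my own choosing, using the same linear-programming formulation of basis pursuit):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, b):
    """min ||x||_1 s.t. A x = b, as an LP over x = u - v with u, v >= 0."""
    n = A.shape[1]
    res = linprog(np.ones(2 * n), A_eq=np.hstack([A, -A]), b_eq=b,
                  bounds=(0, None), method="highs")
    return res.x[:n] - res.x[n:]

def success_rate(m, n=40, k=4, trials=20, seed=0):
    """Fraction of random k-sparse instances exactly recovered from m measurements."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(trials):
        x = np.zeros(n)
        x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
        A = rng.standard_normal((m, n))
        if np.max(np.abs(basis_pursuit(A, A @ x) - x)) < 1e-4:
            hits += 1
    return hits / trials

for m in (8, 16, 24, 32):
    # success jumps sharply as m crosses a threshold depending on k and n
    print(m, success_rate(m))
```

Sweeping m and k on a grid and plotting the success rate reproduces the by-now-famous Donoho-Tanner phase diagram, with its surprisingly sharp wall between the two regions.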

Soon enough, we are beginning to see the emergence of similar tools for higher-dimensional signals, such as advanced matrix decompositions with sparsity-like features for recommender systems and machine learning, or tools like those developed in Randomized Numerical Linear Algebra (RandNLA).

Thanks to these developments, we are beginning to see some good theoretical reasons why some techniques seem to work. Since 2000, people have been using NMF (a different island altogether), but only recently have we seen why it might work. The eventual issue is figuring out how to design sensors or data-gathering techniques that embody some of these theoretical constraints. But in the back of everybody's mind there is this lingering question: "What island is next?"
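For the curious, the NMF referred to here seeks a factorization X ≈ WH with both factors nonnegative. A minimal sketch using the classic Lee-Seung multiplicative updates (my own toy implementation, not the algorithms whose guarantees the recent theory addresses):

```python
import numpy as np

def nmf(X, r, iters=500, seed=0):
    """Factor X ≈ W @ H with W, H >= 0 via Lee-Seung multiplicative updates,
    minimizing the squared Frobenius error. Toy sketch, no stopping criterion."""
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r)) + 1e-3
    H = rng.random((r, n)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-12)   # update keeps H nonnegative
        W *= (X @ H.T) / (W @ H @ H.T + 1e-12)   # update keeps W nonnegative
    return W, H

# A nonnegative rank-2 matrix should be reconstructed almost exactly.
rng = np.random.default_rng(1)
X = rng.random((6, 2)) @ rng.random((2, 8))
W, H = nmf(X, 2)
print(np.linalg.norm(X - W @ H))
```

These updates only guarantee a local minimum; the recent theoretical results mentioned above are precisely about when and why such factorizations can be found reliably.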

[Update, also related The Red Wedding in Science and Technology ]

[1] D. L. Donoho, I. M. Johnstone, J. C. Hoch and A. S. Stern, Maximum Entropy and the Nearly Black Object.
[3] V. Goyal, A. K. Fletcher and S. Rangan, The Optimistic Bayesian: Replica Method Analysis of Compressed Sensing.
[4] E. J. Candès, J. Romberg and T. Tao, Robust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information, IEEE Trans. Inform. Theory, 52(2):489-509, 2006.
[5] D. L. Donoho, Compressed Sensing, IEEE Trans. Inform. Theory, 52(4):1289-1306, April 2006.


Sunil Menon said...

Enjoyed this post - very informative and great analogy!

Igor said...

Thanks Sunil !


manny said...

A masterpiece. Food for thought.

Igor said...

Thanks Manny