I realize that I am commenting on the matter with some lateness. However, irrespective of the expected outcome (see Dick Lipton's latest entry on the subject), one could already have made the following statement from day 1: it is very likely that we should not care too much about this result in Compressive Sensing, as shown in the argument developed here four months ago in Compressive Sensing: P = NP + Additional Constraints. Let's not even add to that argument that being in P also allows an algorithm that scales as O(n^501), whereas a non-polynomial algorithm could scale as O(exp(0.000000001 n)). At our level, the scales and their attendant constants are enormously important; asymptotics are not enough, at least when it comes to the reconstruction solvers.
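To make that last point concrete, here is a minimal Python sketch comparing the two hypothetical cost models mentioned above (n^501 versus exp(0.000000001 n); neither corresponds to any real solver). At any problem size a reconstruction solver actually sees, the "exponential" bound is astronomically cheaper than the "polynomial" one:

import math

# Toy comparison of two hypothetical operation counts from the paragraph
# above: a "polynomial" cost of n**501 versus an "exponential" cost of
# exp(1e-9 * n). We compare their base-10 logarithms to avoid overflow.
for n in (10**3, 10**6, 10**9):
    poly_log10 = 501 * math.log10(n)              # log10 of n**501
    expo_log10 = (1e-9 * n) * math.log10(math.e)  # log10 of exp(1e-9 * n)
    print(f"n = {n:>10}: n^501 ~ 10^{poly_log10:.0f}, "
          f"exp(1e-9 n) ~ 10^{expo_log10:.3g}")

Even at n = 10^9, the exponential cost is exp(1) ≈ 2.7 operations, while the polynomial one is about 10^4509. Membership in P alone tells us nothing about practical running times.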
In unrelated news:
- I set up a static page on the Donoho-Tanner phase transition that helps document its uses and new results on the matter. It is here. Additional insights are welcome.
- The IPAM Modern Trends in Optimization and Its Application workshop started on September 13th and will continue until December 17th. Some of you are there, and you are reading Nuit Blanche while somebody is giving a presentation. Stop that; focus on the presentation, please.
- For $300 you can get your hands on an Emotiv EEG headset. This piece of equipment has just been hacked, so you can now get the raw EEG traces directly. Very nice. I'd like to play with that for sure.
- The LinkedIn Group on Compressive Sensing has reached 591 members; when it will reach its 1,000th member is really what I'd like to know.
- Hurricane Igor is set to be monstrous, but if history is any indication, all the czars named Igor were abject failures. Let us hope this one follows in their tracks into irrelevance.
If you think this blog provides a service, please support it by ordering through the Amazon - Nuit Blanche Reference Store.