
Wednesday, July 13, 2011

Nuit Blanche Reader's Mailbag

Bob Sturm provided a new correction to his implementation of the CoSaMP and Subspace Pursuit algorithms. If this is not a huge justification for having written "Nobody cares about you and your algorithm," I don't know what is. Currently I am playing with some of the sparse PCA and low-rank codes available and keep getting MEX errors for a few of them. So even when code is written in Matlab, you are not communicating with your target audience if your algorithm cannot be run on the users' computers.

An anonymous commenter asked the following about Facing Mona Lisa:

The discussion on this blog is very "scientific" and I haven't started to read up on it yet. I was imagining that it would be possible to use these methods to create a new compression algorithm for images, like a new JPEG, ECW or MrSID format. What is your idea on this?
Thanks for a nice blog and I will try to read up on compressed sensing!

To which I responded:
Compressive sensing provides a simple way of compressing data; however, it is also known that a naive approach yields schemes that are not competitive with JPEG or wavelet-based compression encodings. Less naive schemes include adaptive compressive sampling and/or structured-sparsity-based decoders, and these are likely to yield better results than the naive approach. However, if one looks at the semi-success of JPEG2000, a wavelet-based encoding system, it looks like the era of better compression systems does not have a bright future.

In the case of this sensor, however, a JPEG or JPEG2000 encoding system would not make sense, as the compressed data is already gathered through the voltage readings of the chip. What is needed is a better reconstruction solver that takes these voltage readings to an image. The point I made in this entry is that there are better reconstruction solvers than the least-squares solution (which seems to have been used by the authors of the paper), especially for a system that looks underdetermined, as happens in compressive sensing....
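To make that last point concrete, here is a minimal sketch, not the authors' pipeline: just a toy problem with made-up sizes, a generic Gaussian matrix standing in for the chip's transfer function, and plain ISTA as the l1 solver, comparing the minimum-norm least-squares solution with an l1 reconstruction of an underdetermined sparse signal.

import numpy as np

# Toy underdetermined problem: a k-sparse signal observed through a generic
# random Gaussian projector (an assumption, not the chip's actual transfer function).
rng = np.random.default_rng(0)
n, m, k = 200, 60, 8
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true

# Minimum-norm least-squares solution: x = A^T (A A^T)^{-1} y.
x_ls = A.T @ np.linalg.solve(A @ A.T, y)

# Plain ISTA for the l1-regularized problem min_x 0.5*||Ax - y||^2 + lam*||x||_1.
lam = 1e-3
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part
x_l1 = np.zeros(n)
for _ in range(3000):
    z = x_l1 - A.T @ (A @ x_l1 - y) / L
    x_l1 = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

print("relative error, least squares:", np.linalg.norm(x_ls - x_true) / np.linalg.norm(x_true))
print("relative error, l1 (ISTA):    ", np.linalg.norm(x_l1 - x_true) / np.linalg.norm(x_true))

On a toy problem like this, the least-squares solution smears energy over all the coefficients while the l1 solution recovers the sparse signal almost exactly; that gap is why a dedicated solver is worth trying on the chip's voltage readings.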

Laurent Duval added:

You said "Compressive sensing provides a simple way of compressing data however, it is also known that a naive approach". I would not be in complete accordance with that. Compressive is (still IMHO) somehow an adjective to sensing, as (seemingly) pointed out by Bob Sturm here: "I agree that compressed sensing is much more a method of acquisition than it is a method for compression.", more than a way to compress stuff. Actual compression generally needs some amount of engineering wizardy than goes far beyond any "sparsity" assumptions. In a word: Fourier+Huffman < local Fourier (DCT)+ Huffman but local (8x8) wavelet + Huffamn < local Fourier (DCT)+ Huffman. Yet Wavelet + Zerotree > local Fourier (DCT)+ Huffman. Again, Wavelet + Zerotree ~ DCT + zerotree, both < to Windowed Fourier (GenLOT like) + treillis. And the story may not be over yet. There is some intertwining between sparsity yielded by transform and coding, AND expectation. Poor sparsity + black magic coding may perform better than highest sparsity and 2nd order entropy. As for: "However, if one looks at the semi-success of jpeg2000, a wavelet based encoding system, it looks like the era of better compression systems does not have bright future." I believe this assertion deserves some further historical, time-to-market and corporate weighting and a reference to Sentience is indescribable by Daniel Lemire. When we can describe the forest... but  progress on NP or PCP are needed first.


Eric Tramel replied with:

WRT the "next era" of coding methods, I think we're still on the look out for the next wave. Currently, advanced coders, like Laurent mentioned, are mostly completely hyper-engineered, 100-mode contraptions that can squeeze compression out of rocks. H.264 for video is a hairy beast of corner cases, but it is my understanding that H.265 makes it look simple by comparison, all for ~30% rate-distortion performance gain.

CS is attractive for its simplicity, and that despite its simplicity it has passable performance in practice (in some applications). In order for CS to become the next thing in coding, measurement quantization will need to be fully understood and we'll need more accurate solvers (i.e. ones that make the most out of signal models & priors). I don't see this happening at a rate that would put CS on par with anything but last-gen traditional coding techniques. But of course, CS isn't intended for these purposes :P

TL;DR: Still looking for something elegant for the next gen of coders..

and then added:

Another comment that is more on the topic of what you talked about in the post, Igor:

You'd be surprised what you can recover from. Well, maybe you wouldn't be, but others might. We talk a lot about making sure that the projector, \Phi, A, etc, matches bounds to ensure performance. But, legibility can be achieved with even really, really, really crappy & poorly conditioned projectors and a handful of measurements. This is potentially useful for some very simple vision applications.

So, it is very possible that, as you are suggesting, an L1 solver approach could provide some kind of performance gain. They just have to make sure that they characterize the effective projection accurately :P Also remember that they are dealing with a single, non-reconfigurable shot. So, they are also limited by the number of sensors on the chip. It is possible that for a low sensor resolution and a single capture, a maximum likelihood approach might make more sense than an L1 solution.

But, I'm having the same problem you stated at the beginning of the article. I can't find the paper! I went to look for it right after reading the press release but came up empty. It could be that they are waiting for some kind of patent application to go through...

No paper makes it hard to tell what they're doing exactly for recovery or what kind of trade-offs they're making in design :'(

As I have said before, I have been here before. The main difficulty for a follow-up paper to turn this detector design into a compressive sensing paper would be obtaining a good characterization of its transfer function and checking whether it fits the Donoho-Tanner phase transition. Also, I think I made my views clear that indeed, in some circumstances, we really are performing sparsity sensing as opposed to compressive sensing. For the rest, I am a bit more optimistic than Laurent because, for examples such as image processing, we have roughly six billion entities on Earth with a capable processor, even though none of them grew in the same fashion as the others.
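For readers wondering what checking against the Donoho-Tanner phase transition might look like in practice, here is a rough sketch of the kind of empirical sweep involved, with hypothetical sizes, a generic Gaussian operator standing in for the detector's transfer function, and the same plain ISTA solver as above: fix the undersampling ratio delta = m/n, vary the sparsity ratio rho = k/m, and record how often l1 recovery succeeds. The rho at which the success rate collapses is what one would compare against the theoretical phase transition curve.

import numpy as np

def ista(A, y, lam=1e-3, iters=3000):
    # Basic iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1.
    L = np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - A.T @ (A @ x - y) / L
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return x

rng = np.random.default_rng(1)
n, m, trials = 100, 50, 20             # undersampling ratio delta = m/n = 0.5
for rho in (0.1, 0.2, 0.3, 0.4, 0.5):
    k = max(1, int(rho * m))
    successes = 0
    for _ in range(trials):
        x = np.zeros(n)
        x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
        A = rng.standard_normal((m, n)) / np.sqrt(m)
        x_hat = ista(A, A @ x)
        successes += np.linalg.norm(x_hat - x) <= 5e-2 * np.linalg.norm(x)
    print(f"rho = k/m = {rho:.1f}: empirical recovery rate {successes / trials:.2f}")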

Finally, Stephen Becker sent me the following:


Hi Igor, I found a bad link for Regularized OMP (ROMP) on your list of solvers (section 4.1 of http://sites.google.com/site/igorcarron2/cs )
The bad link is: http://www.math.ucdavis.edu/%7Edneedell/romp.m (also, http://www.math.ucdavis.edu/~dneedell/ is bad)
an updated link is:
http://www-stat.stanford.edu/~dneedell/romp.m
While I was thinking about this, I realized there are Caltech links that need to be updated. We merged departments, so the "acm" website is permanently down, and hence our solver NESTA
http://www.acm.caltech.edu/~nesta/
has moved to a new home: http://www-stat.stanford.edu/~candes/nesta/

Also, my website (formerly http://www.acm.caltech.edu/~srbecker) is now http://ugcs.caltech.edu/~srbecker. Our TFOCS solver URL remains the same. On a related note, I leave Caltech soon, and have written a thesis that you or your readers may find interesting: http://resolver.caltech.edu/CaltechTHESIS:06022011-152525054, "Practical compressed sensing: modern data acquisition and signal processing".
And some old websites of (former) caltech folks are being removed, so I've listed them below, along with the new links:
http://www.eecs.umich.edu/~wakin/ (now http://www.mines.edu/~mwakin)
http://www.acm.caltech.edu/~jrom/ (now http://users.ece.gatech.edu/justin)
http://www.acm.caltech.edu/~emmanuel/ (2 locations) (now http://www-stat.stanford.edu/~candes/)
While I was at it, I checked other links on this site (http://sites.google.com/site/igorcarron2/cs) using a validator (http://validator.w3.org/checklink ) since I'd noticed that other links were old as well.  Turns out a lot of them are old! Here's a list:
-- Missing targets --
http://homepages.cae.wisc.edu/~jhaupt/
http://sites.google.com/site/igorcarron2/tr_1231851202623
http://sites.google.com/site/igorcarron2/tr_1267470689827
http://sites.google.com/site/igorcarron2/goog_2029081109
http://sites.google.com/site/igorcarron2/tr_1267456797996
http://www.mit.edu/~texel/
http://www.ece.rice.edu/~drorb/CSBP/
http://www-personal.umich.edu/~markiwen/
http://www.math.lsa.umich.edu/~cvspenc/
http://www.bu.edu/systems/people/students.html
http://www.academie-sciences.fr/membres/M/Meyer_Yves.htm
http://www.stanford.edu/~idrori
http://www.csl.uiuc.edu/directory/detail_dir1.asp?NetID=weidai
http://www.math.univ-paris13.fr/~malgouy/software/bpdn_web.zip
ftp://ftp.math.ucla.edu/~tao/preprints/sparse.html (meant to be http not ftp???)
http://www.ceremade.dauphine.fr/~peyre/upload/cs-tv/html/content.html
http://ee.sharif.ir/~ghmohimani/
http://www.stanford.edu/~singer
http://www.stanford.edu/~tsaig
http://www.math.uchicago.edu/faculty.html
http://www.math.ucla.edu/~tagoldst/FLASH_c.zip
http://web.mit.edu/texel/www/
http://www.tele.ucl.ac.be/~jacques/ (twice; "forbidden". link to index.html instead?)
http://www.stanford.edu/~sjkim/
http://www.stanford.edu/~deneb1/
http://www.caam.rice.edu/~eth3881/
http://math.berkeley.edu/~shamgar/
http://www.public.asu.edu/~jliu86/
http://www.its.caltech.edu/~amin/weighted_l1_codes/
http://www.ece.osu.edu/~schniter
-- other links (permanently gone, or servers temporarily down?) --
http://www.columbia.edu/~zw2109/
http://www.math.utah.edu/~tanner/ ("forbidden". link to index.html instead?)
http://www.ece.osu.edu/~zinielj/fbmp/index.html ("forbidden")
http://www.ece.osu.edu/~zinielj ("forbidden")
http://www.ece.osu.edu/~zinielj/fbmp/download.html ("forbidden")
http://www.ece.osu.edu/~potter ("forbidden")
http://www.stanford.edu/~hllee/softwares/nips06-sparsecoding.htm ("forbidden")
http://www.cs.washington.edu/homes/venkat/ ("forbidden")
http://sina2jp.googlepages.com/sinajafarpour'shomepage (the apostrophe is probably not right)
http://www.math.sfu.ca/~zhaosong/Codes/Compressed_Sensing (redirected to http://people.math.sfu.ca/~zhaosong/Codes/Compressed_Sensing, which isn't found either)
http://www.math.yale.edu/~sl349/demos_data/demos.htm (redirected to http://users.math.yale.edu/~sl349/demos_data/demos.htm, which isn't found)
http://www.convexoptimization.com/wikimization/index.php/Matlab_for_Convex_Optimization_%26_Euclidean_Distance_Geometry (broken due to some punctuation in the link)

http://www.emmanuel-ravelli.com/downloads (can't find host)
http://jbobin.shje.org/ (can't find host -- this site is gone)
http://www-math.univ-fcomte.fr/pp_Annu/SCHRETIEN/ (can't connect)

Thanks for all your work indexing these pages and the almost daily updates! We appreciate it.
-Stephen

Thanks Stephen, I needed the work :-) I'll try to focus on the big ones; some sites may become permanently silent as a result of this audit, though.

Image Credit: NASA/JPL/Space Science Institute
W00067992.jpg was taken on July 10, 2011 and received on Earth July 11, 2011. The camera was pointing toward SATURN at approximately 291,728 kilometers away, and the image was taken using the CB2 and CL2 filters. 

