
Wednesday, May 05, 2010

CS: [update] One fallout of the MATLAB Programming Contest on Compressive Sensing

I was not expecting this type of feedback, but it fits nicely with an issue I mentioned earlier. From the comment section of yesterday's entry (CS: MATLAB Programming Contest on Compressive Sensing (part three)), anonymous tells us:

I think adaptive vs. fixed sampling makes a huge difference for practical problems ...

Although asymptotically optimal for signals that are highly sparse, in practice CS does not compete that well even with traditional (linear) sampling schemes.

I especially like the comment "simply solves Laplace's equation [...]", which shows quite nicely that a linear method competes well with L1 and randomized measurements.

You really need to work hard and use stronger priors (TV, sample only high frequencies, use complicated Bayesian priors) to make CS better than a simple low-pass acquisition on natural images.

The cost of non-adaptive CS is typically a factor of 5 to 7 (with respect to the best M-term approximation) for a 1% RMS error on the Lena image, which is more than the cost of a linear coding.
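
As an aside, the "simply solves Laplace's equation" entries mentioned above just interpolate the unmeasured pixels from the measured ones. Here is a rough, illustrative sketch of that linear baseline (written in Python rather than the contest's MATLAB, with made-up variable names and a toy sampling mask, not code from any actual entry):

```python
import numpy as np

def laplace_inpaint(img, known, n_iter=2000):
    """Fill the pixels where known == False by iterating the discrete
    Laplace equation: each unknown pixel is repeatedly replaced by the
    average of its four neighbours (periodic boundaries, for brevity)."""
    u = np.where(known, img, img[known].mean())   # initialize unknowns at the mean
    for _ in range(n_iter):
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                      np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u = np.where(known, img, avg)             # measured pixels stay fixed
    return u

# Toy usage: keep 20% of the pixels of a smooth synthetic image.
x = np.linspace(0, 3, 64)
img = np.sin(x)[:, None] * np.cos(x)[None, :]
known = np.random.rand(64, 64) < 0.2
recovered = laplace_inpaint(img, known)
```

On a smooth image like this one, such a purely linear scheme already gets you most of the way, which is precisely the commenter's point.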
I have mentioned a similar issue with coded aperture, where a MURA-like configuration gets you an OK image compared to the more complex compressive sensing solution (see Is Compressive Sensing a zero-th or a first order capability?). R articulated the counter-argument in a clear fashion, responding with:

Anonymous, I think you have it upside down :).

Compressed sensing does not claim to be an optimal compression scheme. It provides a relatively cheap, possibly universal acquisition protocol. For instance, if sensors are expensive, you can't build an array of them, and a CS architecture like Rice's CS camera may be an option... or when acquisition time is an issue (like in MRI), or when transmission (communication) and on-board processing costs are an issue, CS approaches are useful...

To reiterate: so far, CS is not competitive from a rate-distortion point of view, but it provides an alternative acquisition scheme when some type of cost (time, money, power, radiation, ...) prohibits traditional sensing.

Thus, what is noteworthy here is that CS (ignoring issues like quantization) comes within a constant of what is possible with "traditional" sensing.

Again, with the current state-of-the-art, CS does not claim to be an optimal coding scheme.
To which anonymous responded:

Yes, you are definitely right.

I think you misunderstood my point. I just wanted to make clear that using CS techniques to beat traditional approaches to problems such as coding simply does not make sense.

In theory it makes sense, because of the asymptotic optimality result (for signals in l^p for p < 1, CS approximation is asymptotically as good as non-linear approximation). In practice, because of the constant, it performs worse even for very large images (which furthermore are not in l^p for p < 1... but this is yet another issue).

And in my opinion, this Matlab challenge is just about this: it is a classical image processing problem, and it should definitely not be solved with fixed measurements and a blind L1 prior.

This is symptomatic of the CS hype that makes many people focus on the wrong problems (see also the Wired article for a related issue).

I am happy to see that readers of this blog are aware of these difficulties!
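
For readers who have not seen it, the asymptotic optimality result being invoked can be stated roughly as follows. This is only a sketch with generic constants; the exact conditions on the sensing matrix vary from paper to paper:

```latex
% Instance optimality of l1 recovery (rough statement, generic constants).
\[
  \|x - \hat{x}\|_2 \;\le\; C \,\frac{\sigma_k(x)_1}{\sqrt{k}},
  \qquad
  \sigma_k(x)_1 \;=\; \min_{\#\mathrm{supp}(z) \le k} \|x - z\|_1 ,
\]
% and for signals in an l^p ball with p < 1 the sorted coefficients decay
% like j^{-1/p}, so that
\[
  \frac{\sigma_k(x)_1}{\sqrt{k}} \;\le\; C_p \, k^{1/2 - 1/p},
\]
% which is the same decay rate as the best k-term (non-linear) approximation.
% The catch, as anonymous points out, is the constant in front.
```

In other words, from on the order of k log(n/k) non-adaptive measurements one matches the non-linear approximation rate up to a constant, and the whole disagreement above is about how large that constant is on real images.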

I think readers of this blog know about the Wired article and the potential misunderstandings that come with it. However, I would be very curious to see how the best optimized solutions of the MATLAB contest fare when given a new set of images at the end of the contest. I am also curious as to how the best solution with an l1 prior would fare in this new setting.
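
For those who want to run the comparison themselves once a new image set is out, here is a minimal, purely illustrative sketch of the kind of "blind L1 prior" reconstruction being discussed: iterative soft-thresholding on a toy sparse recovery problem with random measurements. Everything below (names, sizes, the regularization parameter) is a placeholder, not a contest entry:

```python
import numpy as np

def ista(A, y, lam=0.02, n_iter=500):
    """Iterative soft-thresholding for min_x 0.5*||A x - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2                 # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        z = x - A.T @ (A @ x - y) / L             # gradient step on the quadratic term
        x = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)   # soft threshold
    return x

# Toy problem: recover a 20-sparse vector of length 512 from 128 random measurements.
rng = np.random.default_rng(0)
n, m, k = 512, 128, 20
x_true = np.zeros(n)
x_true[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true
x_hat = ista(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```

On natural images one would of course replace the synthetic sparse vector by coefficients in a wavelet or DCT basis, which is where the constant factors discussed above come into play.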
