Thursday, July 17, 2008

Compressed Sensing Community: Expanding and Accelerating our Current Understanding

A while back, I had a discussion with one of you. Let me just cut and paste the important part of the exchange:

I was thinking about a more formal software and empirical performance database or repository. You already have quite a collection of software and matrices. Perhaps it is time to begin to formalize, as a community, a suite of experiments/tests upon which to evaluate these algorithms, matrices, etc. I don't know what kind of computing and server resources you have but if you're interested, I'd like to explore doing this.

After I suggested that SPARCO was beginning to do some of the things the reader mentioned (this was a little earlier than June), the reader responded:

As for a repository, framework, whatever, I think that Sparco is a good start but it's skewed a bit towards matlab, specific problem instances, and specific recovery tasks. Also, it doesn't serve as a repository for community knowledge. What if I want to know how a certain measurement matrix construction works for all k-sparse signals with a certain noise model, with a certain recovery algorithm? What if I want a more global view than just recovery of a standard image? Furthermore, I think that we need an independent third-party to "host" such a venture .... Everyone reads your blog (whether they admit it or not) but you don't have a research agenda or yourself to promote :) (at least not in the way that others do...) I do think that Sparco is a good start if we add the ability to

1. generate your own signal types
2. generate your own measurement matrices
3. execute your own reconstruction algorithms
4. specify, then record and disseminate, your own experiments
5. gather, analyze, and mine the data from the above different "axes"

Perhaps even a further fork into different application areas---imaging being quite different from 1d problems.
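To make the reader's five "axes" concrete, here is a minimal sketch of what one run of such a benchmark might look like. It is written in Python/NumPy rather than MATLAB, and it uses a toy Orthogonal Matching Pursuit routine as a stand-in for a real solver; all names, sizes, and parameters below are illustrative assumptions, not part of SPARCO or any existing framework.

```python
import numpy as np

def omp(A, y, k):
    """Toy Orthogonal Matching Pursuit: greedily recover a k-sparse x with y ~ A x."""
    residual = y.copy()
    support = []
    x = np.zeros(A.shape[1])
    coef = np.zeros(0)
    for _ in range(k):
        # Pick the column most correlated with the current residual.
        idx = int(np.argmax(np.abs(A.T @ residual)))
        if idx not in support:
            support.append(idx)
        # Least-squares fit on the columns selected so far.
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(0)
n, m, k = 256, 80, 5                                 # signal length, measurements, sparsity
x_true = np.zeros(n)                                 # axis 1: signal type (k-sparse, Gaussian amplitudes)
idx = rng.choice(n, size=k, replace=False)
x_true[idx] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)         # axis 2: measurement matrix (Gaussian)
y = A @ x_true                                       # noiseless measurements (a noise model would go here)
x_hat = omp(A, y, k)                                 # axis 3: reconstruction algorithm
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative error: {err:.2e}")                  # axes 4-5: the recorded, minable result
```

A real repository would sweep each axis (matrix ensembles, sparsity levels, noise models, solvers) and store the resulting error curves, rather than running a single instance as above.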
It may seem that I am a neutral third party, but I am not really. The emphasis of this blog is mostly on understanding CS better and enabling others to learn about it quickly. For researchers who are publishing, this blog should also be the location of the most up-to-date information. I agree with the reader that in order to deal with what I would call the combinatorial number of possibilities in CS (as mentioned here), we probably need SPARCO, or a SPARCO-like capability, to take a stab at the most promising ones. Michael Friedlander has done an excellent job of differentiating SPARCO and SPGL1, for instance, and has made SPARCO as agnostic as possible about the choice of reconstruction algorithm. In the past two months, I actually think SPARCO has moved in the direction the reader asked for. What I have not yet seen is an implementation of the combinatorial algorithms; I am told the reason is that they are written in C. I am sure that can be taken care of with some MATLAB wrapping.

Finally, let us also remind ourselves that some of the readers are students who have plenty of time on their hands and want to contribute to the subject without necessarily being in the locations where CS papers are being written, or having advisors who know much about CS. Even if their contribution is writing a SPARCO script that performs a comparison between different reconstruction capabilities, I would be absolutely glad to advertise their work as much as possible.

Any thoughts?
