Sometimes I lose track of this, but Compressed Sensing is really about acquiring a sparse signal with the help of an incoherent basis. While much of the emphasis is rightfully on reconstruction work, we are beginning to see the emergence of other tools that complete this framework. Two other stages are emerging:
- the search for bases or dictionaries in which a set of signals is sparse, and
- specific incoherent measurement tools.
Dictionary learning codes:
- Multiscale sparse image representation with learned dictionaries
- Efficient sparse coding algorithms
- Non-negative Sparse Modeling of Textures (NMF)
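The dictionary learning codes above all revolve around the same alternation: sparsely code the signals in the current dictionary, then update the dictionary to fit those codes. A minimal sketch of that loop (a MOD-style least-squares update paired with a simple thresholding coder; all sizes and names are my own toy choices, not taken from any of the packages listed):

```python
import numpy as np

rng = np.random.default_rng(0)

def sparse_code(D, X, k):
    """Thresholding-style coder: per signal, keep the k atoms most
    correlated with it, then least-squares fit on that support."""
    A = np.zeros((D.shape[1], X.shape[1]))
    for j in range(X.shape[1]):
        support = np.argsort(np.abs(D.T @ X[:, j]))[-k:]
        coef, *_ = np.linalg.lstsq(D[:, support], X[:, j], rcond=None)
        A[support, j] = coef
    return A

def mod_update(X, A):
    """MOD dictionary update: least-squares optimal D for fixed codes A,
    with columns renormalized (guarding against unused atoms)."""
    D = X @ A.T @ np.linalg.pinv(A @ A.T)
    norms = np.linalg.norm(D, axis=0, keepdims=True)
    norms[norms == 0] = 1.0
    return D / norms

# Toy data: signals that truly are k-sparse in a hidden dictionary.
n, m, p, k = 20, 40, 200, 3
D_true = rng.standard_normal((n, m))
D_true /= np.linalg.norm(D_true, axis=0)
A_true = np.zeros((m, p))
for j in range(p):
    A_true[rng.choice(m, k, replace=False), j] = rng.standard_normal(k)
X = D_true @ A_true

# Alternate coding and dictionary updates from a random start.
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)
res_history = []
for _ in range(30):
    A = sparse_code(D, X, k)
    D = mod_update(X, A)
    res_history.append(np.linalg.norm(X - D @ A) / np.linalg.norm(X))
```

The relative residual in `res_history` shrinks as the learned dictionary starts to resemble the hidden one; the real packages replace both steps with far better coders (OMP, L1 solvers) and updates (K-SVD, online methods).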
Non-adaptive measurement codes/matrices (the first three are pasted from Terence Tao's site):
- Random Fourier ensemble: the signal is a discrete function f on Z/NZ, and the measurements are the Fourier coefficients at a randomly selected set Omega of M frequencies (A is an M x N matrix).
- Gaussian ensemble: A is an M x N matrix of i.i.d. Gaussian variables.
- Bernoulli ensemble: A is an M x N matrix of i.i.d. Bernoulli variables.

Other recent implementations achieve specific goals (sparsity, ...):
- Sparse Measurement Matrix (the implementation is mine, but real code is available from the authors).
- Noiselets
- Random Filters (from here)
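For concreteness, here is how the three classic ensembles above can be instantiated in a few lines of Python (the sizes are my own toy choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 256, 64  # signal length, number of measurements

# Gaussian ensemble: i.i.d. Gaussian entries, scaled so columns have
# roughly unit norm.
A_gauss = rng.standard_normal((M, N)) / np.sqrt(M)

# Bernoulli ensemble: random +-1 entries with the same scaling.
A_bern = rng.choice([-1.0, 1.0], size=(M, N)) / np.sqrt(M)

# Random Fourier ensemble: M rows of the unitary N x N DFT matrix,
# picked at a random set Omega of frequencies.
F = np.fft.fft(np.eye(N), norm="ortho")
Omega = rng.choice(N, size=M, replace=False)
A_fourier = F[Omega, :]

# Acquiring a signal is then just y = A x.
x = rng.standard_normal(N)
y = A_gauss @ x
```

Each of these satisfies the incoherence/restricted isometry conditions with high probability, which is what the reconstruction codes below rely on.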
Reconstruction codes (Matching Pursuit/Greedy, Basis Pursuit/Linear Programming, Bayesian):
- l1-Magic
- SparseLab
- GPSR
- l1_ls: a simple Matlab solver for l1-regularized least squares problems
- sparsify
- MPTK: Matching Pursuit Toolkit
- Bayesian Compressive Sensing
- SPGL1: A solver for large scale sparse reconstruction
- FPC
- Chaining Pursuit
- Regularized OMP
- TwIST
- Lp_re, Reweighted Lp (this is my implementation, but a better one might be available from the authors).
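As a toy illustration of the greedy family in this list, here is a bare-bones Orthogonal Matching Pursuit (my own sketch, not code from any of the toolboxes above):

```python
import numpy as np

def omp(A, y, k, tol=1e-10):
    """Orthogonal Matching Pursuit: greedily pick the column most
    correlated with the residual, then least-squares refit on the
    accumulated support."""
    x = np.zeros(A.shape[1])
    residual = y.copy()
    support = []
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
        if np.linalg.norm(residual) < tol:
            break
    x[support] = coef
    return x

# Demo: recover a 4-sparse vector from 100 Gaussian measurements.
rng = np.random.default_rng(1)
M, N, k = 100, 256, 4
A = rng.standard_normal((M, N)) / np.sqrt(M)
x_true = np.zeros(N)
x_true[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
y = A @ x_true
x_hat = omp(A, y, k)  # exact recovery is typical in this easy regime
```

The packages listed above (MPTK, sparsify, Regularized OMP, ...) implement far more careful and scalable versions of this idea.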
[ Update: some of the homemade codes are here]
Credit: I have cut and pasted sections of this entry directly from the Rice repository and Terence Tao's site. Photo: NASA Mars rover Opportunity on Sol 1432.
I have put a comment on Tao's blog concerning the use of more advanced models for CS recovery.
I know, I put in a comment on adaptive methods and it seems it did not get approved :-)
Thanks for the pointer though.
Igor.
Hi,
Interesting post.
Actually, I am looking for a piece of code to play with in order to sparsely represent the difference between two classes of images. Perhaps NMF methods are a good start. Although most of them, including "Efficient sparse coding", can potentially be applied to supervised classification, they are usually used to sparsely represent one image ensemble regardless of label. In other words, the sparse representation does not exhibit the difference between the two groups optimally and sparsely; it represents the whole ensemble sparsely, which does not necessarily yield better classification between the two groups. On the other hand, methods like Sparse LDA produce sparse features but are computationally expensive for structured/high-dimensional data like images.
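To make the kind of factorization I mean concrete, here is a bare-bones NMF with Lee-Seung multiplicative updates (the sizes and names are my own toy illustration, not code from either package below):

```python
import numpy as np

rng = np.random.default_rng(0)

def nmf(V, r, iters=200, eps=1e-9):
    """Lee-Seung multiplicative updates for V ~ W @ H with V, W, H >= 0.
    Each update is guaranteed not to increase the Frobenius error."""
    n, m = V.shape
    W = rng.random((n, r)) + eps
    H = rng.random((r, m)) + eps
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + eps)
        W *= (V @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy "image ensemble": columns are nonnegative 64-pixel signals.
V = rng.random((64, 100))
W, H = nmf(V, r=10)  # W holds 10 nonnegative parts, H the encodings
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
```

As I said, though, this factorizes the whole ensemble; nothing in the objective pushes the parts in W to discriminate between the two classes.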
Now, I want to play with this:
http://www.cs.helsinki.fi/u/phoyer/software.html
or this:
http://www.csie.ntu.edu.tw/~cjlin/nmf/index.html
do you have any suggestion?
Thanks,
Kayhan
Kayhan,
I am not forgetting your previous request, but I think I am answering it bit by bit in the entries I am putting forth.
Anyway, I did not know about the second reference; both seem fine. I think Hoyer has also looked at tensor factorizations, which open the door to sparse decompositions in more than two dimensions and might be of interest when looking at higher-dimensional data. Please also note that Gabriel Peyre has found that a sparsity constraint does not help much in getting a good dictionary. I'd say it's a good thing to try these codes and eventually build some wisdom about what to use and what not to use.
I realize this is not a more pointed answer, but it also means there is a lot of low-hanging fruit.
Igor.