
Friday, January 25, 2008

Compressed Sensing: Building an Overcomplete Basis to Make Your Signal Sparse: Texture Decomposition using NMF

The problem of finding the sparsest decomposition of a signal is intrinsically linked to finding one or several bases for the signal at hand. Either one is lucky enough to choose from a certain set of known functions, like the Gabor functions for EEG signals, or these functions are so involved that a more empirical approach is required, i.e., find as many examples as possible and try to decompose these training examples into 'atoms' that form some type of basis. I previously noted the very fast approach of Honglak Lee, Alexis Battle, Rajat Raina, and Andrew Ng in Efficient sparse coding algorithms, but was also surprised that no use of Non-Negative Matrix Factorization had been undertaken. I was wrong: Gabriel Peyre did exactly that for textures in Non-negative Sparse Modeling of Textures
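For readers who want to play with this dictionary-learning idea, here is a minimal sketch of learning texture atoms with NMF. The patch size, number of atoms, and the use of scikit-learn's NMF are my own illustrative choices, not the setup of the paper:

```python
# A minimal sketch: learn texture 'atoms' by factoring a matrix of patches
# with NMF. Patch size, atom count, and the random stand-in image are
# illustrative assumptions, not the paper's exact setup.
import numpy as np
from sklearn.decomposition import NMF

def extract_patches(image, patch_size=12, n_patches=2000, seed=0):
    """Sample random square patches from a non-negative grayscale exemplar."""
    rng = np.random.default_rng(seed)
    h, w = image.shape
    ys = rng.integers(0, h - patch_size, n_patches)
    xs = rng.integers(0, w - patch_size, n_patches)
    return np.array([image[y:y + patch_size, x:x + patch_size].ravel()
                     for y, x in zip(ys, xs)])  # (n_patches, patch_size**2)

image = np.random.rand(128, 128)  # stand-in for a real texture exemplar
Y = extract_patches(image)

# Factor Y ~ X D: rows of D are the learned atoms, rows of X the
# non-negative coefficients of each patch in that dictionary.
model = NMF(n_components=64, init='nndsvda', max_iter=500)
X = model.fit_transform(Y)
D = model.components_
```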

The abstract reads:
This paper presents a statistical model for textures that uses a non-negative decomposition on a set of local atoms learned from an exemplar. This model is described by the variances and kurtosis of the marginals of the decomposition of patches in the learned dictionary. A fast sampling algorithm allows to draw a typical image from this model. The resulting texture synthesis captures the geometric features of the original exemplar. To speed up synthesis and generate structures of various sizes, a multi-scale process is used. Applications to texture synthesis, image inpainting and texture segmentation are presented.


Please note his comment that the sparsity requirement does not help much:

An explicit sparsity prior can be added into this process [23] to enforce the constraints $\|x_j\|_0 < \tau$ required by equation (1). In practice, this did not result in a noticeable enhancement for texture modeling.
This is surprising (see Sparsity or positivity ?), but one might not be so lucky for other types of signals. Let us not forget that while curvelets or contourlets seem to do a good job at describing contours, and therefore enable better Compressive Sampling, they do not address textures (which is the motivation for this paper), nor time-related or hyperspectral compressible bases.
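One way to get a feel for this observation is to check how sparse the NMF coefficients already are without any explicit $\|x_j\|_0 < \tau$ constraint. A quick check, continuing the sketch above; the threshold for calling a coefficient zero is an arbitrary choice of mine:

```python
# How many atoms are effectively active per patch, with no sparsity prior?
threshold = 1e-3 * X.max()  # arbitrary cutoff for calling a coefficient zero
nonzeros_per_patch = (X > threshold).sum(axis=1)
print("mean active atoms per patch: %.1f out of %d"
      % (nonzeros_per_patch.mean(), X.shape[1]))
```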

2 comments:

  1. Hi,

    nice post as usual! How would you describe the relation between new methods like this one and older ones like those of Victor Wickerhauser, or papers like this:
    Image approximation and modeling via least statistically dependent bases, Naoki Saito, Pattern Recognition, 2001, 1765-1784

    I mean, do you think that CS urged people to revisit the idea? It had been there for such a long time and nobody applied it in practice.

    Thanks,

  2. Hello Kayhan,

    I looked at it
    http://www.math.ucdavis.edu/~saito/publications/saito_iam_lsdb.pdf
    I would not say it is an older method in the sense that it is not useful anymore; rather, NMF can be used on spaces of dimension larger than two, where we already have a whole slew of bases, and it finds these higher-dimensional bases. But irrespective of that dimensionality aspect, I think one of the reasons NMF is interesting is that it is a simple iterative scheme which also seems to enforce other kinds of constraints by default (the sparsity note mentioned in the entry; see the sketch below). I think people in CS are agnostic with regard to the method of finding dictionaries, as long as it provides them in a fast and efficient manner (take a look at the Stanford paper and see the improvement in speed over the past seven years). The main thrust of current CS activity is finding algorithms or hardware implementations for acquiring signals, together with the means of reconstructing them.
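
    For concreteness, here is a minimal sketch of the classic Lee-Seung multiplicative updates I have in mind when calling NMF a simple iterative scheme; the variable names, iteration count, and eps safeguard are illustrative choices:

    ```python
    # Lee-Seung multiplicative updates for Y ~ X @ D under the Frobenius norm.
    import numpy as np

    def nmf_multiplicative(Y, n_atoms=64, n_iter=200, eps=1e-9, seed=0):
        """Factor a non-negative matrix Y into non-negative X (coefficients)
        and D (atoms) by alternating multiplicative updates."""
        rng = np.random.default_rng(seed)
        n, m = Y.shape
        X = rng.random((n, n_atoms))
        D = rng.random((n_atoms, m))
        for _ in range(n_iter):
            D *= (X.T @ Y) / (X.T @ X @ D + eps)   # update atoms
            X *= (Y @ D.T) / (X @ D @ D.T + eps)   # update coefficients
        return X, D
    ```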

    Igor.
