Friday, April 15, 2011

CS: "Compressible" Realization of Gaussian Noise

I just improved the script I wrote earlier so that it would:

  • produce graphs with labels
  • provide a second map that features the variables used by Ori Shental in his paper

Instead of SL0, I have tried the simple basis pursuit of SPGL1 and it doesn't work, i.e. I cannot find a "compressible decomposition" of the realization of a Gaussian vector. I have not tried any other L_1 solvers (LASSO, the Dantzig selector, ...), so there are ample experiments to perform here. I was also told that m might be too small and that what was observed was a fluke. So instead of m = 50, I went for m = 250; the effect is larger. Here are the figures for SL0:

Here is the attendant script. I used the new variables kappa = k/N and alpha = m/N. Let me summarize what the first figure says:

Given a Gaussian vector of size 250, a Gaussian dictionary made up of 2500 Gaussian realizations will, on average, provide a decomposition of said vector with as few as 148 elements of the dictionary at a 1% error.
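This claim can be sanity-checked with a small Python sketch. Note the hedge: the greedy orthogonal matching pursuit below is my own stand-in for SL0 (not the solver used in the script), so the atom count it reports will differ somewhat from 148, but it should land well below 250:

```python
import numpy as np

rng = np.random.default_rng(0)
m, N = 250, 2500

# Gaussian dictionary with unit-norm columns, and a Gaussian vector to decompose
A = rng.standard_normal((m, N))
A /= np.linalg.norm(A, axis=0)
x = rng.standard_normal(m)

# Greedy OMP: keep adding the best-correlated atom until the
# relative residual drops below 1%
support, residual, nx = [], x.copy(), np.linalg.norm(x)
coef = np.zeros(0)
while np.linalg.norm(residual) > 0.01 * nx and len(support) < m:
    support.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, support], x, rcond=None)
    residual = x - A[:, support] @ coef

print("atoms needed for 1% error:", len(support))
```

The point is qualitative: far fewer than m = 250 atoms from the overcomplete dictionary suffice to reproduce the Gaussian vector to within 1%.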

One can note that the second figure shows a relatively small region of interest compared to the one found by Ori in his paper. Is it a question of using a different solver? And if so, which one? Otherwise, I have also noticed that it may depend on m, but my small computer cannot handle a larger run. To check whether this is the case, one could modify the code from

m = 250;
itr = 10;
for N = m:100:2500

to

m = 1000;
itr = 10;
for N = m:500:10000
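For readers without MATLAB, a scaled-down Python version of that sweep is sketched below. Again, the greedy OMP helper is a stand-in for SL0, and the sizes (m = 50, N up to 500, itr = 2) are shrunk from the post's actual runs so it finishes in seconds:

```python
import numpy as np

rng = np.random.default_rng(1)

def min_atoms(A, x, tol=0.01):
    """Size of the greedy support reaching relative error below tol (OMP stand-in for SL0)."""
    support, residual, nx = [], x.copy(), np.linalg.norm(x)
    while np.linalg.norm(residual) > tol * nx and len(support) < A.shape[0]:
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], x, rcond=None)
        residual = x - A[:, support] @ coef
    return len(support)

def gauss_dict(m, N, rng):
    """Gaussian dictionary with unit-norm columns."""
    A = rng.standard_normal((m, N))
    return A / np.linalg.norm(A, axis=0)

# Scaled-down sweep; the post's runs use m = 250 (or 1000) and much larger N
m, itr = 50, 2
results = []
for N in range(m, 501, 150):
    ks = [min_atoms(gauss_dict(m, N, rng), rng.standard_normal(m)) for _ in range(itr)]
    results.append((m / N, np.mean(ks) / N))
    print(f"alpha = m/N = {m/N:.3f}, mean kappa = k/N = {np.mean(ks)/N:.3f}")
```

Plotting mean kappa against alpha over such a sweep gives a quick read on whether the region of interest moves with m.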
