Monday, March 16, 2009

CS: Emerging Trends in Visual Computing, High-Rate LDPC Codes in Compressed Sensing, Sparsity in Everything, Stats on the LinkedIn CS Group.

Frank Nielsen just let me know that the proceedings of the Emerging Trends in Visual Computing conference (ETVC'08) have just come out. All the videos of the workshop are here.

I have mentioned the workshop before here, here, here, here and here. Among the videos I have not yet mentioned is one by Martin Vetterli entitled Sparse Sampling: Variations on a Theme by Shannon.



Fan Zhang and Henry D. Pfister just released on arXiv On the Iterative Decoding of High-Rate LDPC Codes With Applications in Compressed Sensing. The abstract reads:

This paper considers the performance of (j,k)-regular low-density parity-check (LDPC) codes with message-passing (MP) decoding algorithms in the high-rate regime. From a coding perspective, this analysis is interesting for a variety of channels including the binary erasure channel (BEC) and the q-ary symmetric channel (q-SC). The first result is that, for the BEC, the density evolution (DE) threshold of the best decoder scales as \Theta(k^{-1}) and the critical stopping ratio scales as \Theta(k^{-j/(j-2)}). The analysis of verification decoding on the q-SC is applicable to the compressed sensing (CS) of strictly sparse signals. Of particular note is the performance of CS systems based on LDPC codes and MP decoding. The DE/stopping-set analysis is used to study CS systems with randomized/uniform reconstruction. The results show that strictly sparse signals can be reconstructed with a constant oversampling ratio when the number of measurements scales linearly with the sparsity of the signal.
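To get a feel for the kind of MP reconstruction scheme the abstract refers to, here is a minimal Python sketch of verification-style decoding of a strictly sparse signal measured with a sparse binary, LDPC-like matrix. This is not the authors' code; the dimensions, sparsity level and column weight are arbitrary choices for illustration.

```python
# A minimal sketch (not the authors' code) of verification-style message passing
# for compressed sensing of a strictly sparse signal with a sparse {0,1} matrix.
# Dimensions, sparsity and column weight are arbitrary illustrative choices.
import numpy as np

rng = np.random.default_rng(0)
n, m, k, col_weight = 1000, 300, 30, 3        # signal length, measurements, sparsity, ones per column

# Sparse {0,1} measurement matrix: each column touches `col_weight` random checks.
A = np.zeros((m, n))
for j in range(n):
    A[rng.choice(m, size=col_weight, replace=False), j] = 1.0

# Strictly sparse signal with generic continuous-valued nonzeros.
x = np.zeros(n)
x[rng.choice(n, size=k, replace=False)] = rng.normal(size=k)
y = A @ x

x_hat = np.full(n, np.nan)                    # NaN = coefficient not yet verified
residual = y.copy()                           # measurements minus verified contributions

def verify(j, value):
    """Declare coefficient j known and remove it from every measurement it touches."""
    x_hat[j] = value
    if value != 0.0:
        residual[:] -= value * A[:, j]

for _ in range(50):                           # a few message-passing sweeps
    progress = False
    for i in range(m):
        unresolved = np.flatnonzero(A[i] * np.isnan(x_hat))
        if unresolved.size == 0:
            continue
        if np.isclose(residual[i], 0.0):      # zero check: its unknowns are (generically) all zero
            for j in unresolved:
                verify(j, 0.0)
            progress = True
        elif unresolved.size == 1:            # degree-one check: it reveals its single unknown
            verify(unresolved[0], residual[i])
            progress = True
    if not progress:
        break

x_hat = np.nan_to_num(x_hat)                  # coefficients never verified are left at zero
print("exact recovery:", np.allclose(x_hat, x))
```

The two local rules (zero checks verifying their neighbors, degree-one checks revealing their last unknown) are the peeling-style steps whose success thresholds a density evolution analysis tracks.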
In a different direction, here is the abstract of a recent paper on the Lasso:

The Lasso is an attractive technique for regularization and variable selection for high-dimensional data, where the number of predictor variables $p_n$ is potentially much larger than the number of samples $n$. However, it was recently discovered that the sparsity pattern of the Lasso estimator can only be asymptotically identical to the true sparsity pattern if the design matrix satisfies the so-called irrepresentable condition. The latter condition can easily be violated in the presence of highly correlated variables. Here we examine the behavior of the Lasso estimators if the irrepresentable condition is relaxed. Even though the Lasso cannot recover the correct sparsity pattern, we show that the estimator is still consistent in the $\ell_2$-norm sense for fixed designs under conditions on (a) the number $s_n$ of nonzero components of the vector $\beta_n$ and (b) the minimal singular values of design matrices that are induced by selecting small subsets of variables. Furthermore, a rate of convergence result is obtained on the $\ell_2$ error with an appropriate choice of the smoothing parameter. The rate is shown to be optimal under the condition of bounded maximal and minimal sparse eigenvalues. Our results imply that, with high probability, all important variables are selected. The set of selected variables is a meaningful reduction on the original set of variables. Finally, our results are illustrated with the detection of closely adjacent frequencies, a problem encountered in astrophysics.
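The regime the abstract describes is easy to poke at numerically. Below is a small illustration (not taken from the paper) of the Lasso with many more predictors than samples and a strongly correlated design, using scikit-learn; the dimensions, correlation level and regularization weight are arbitrary choices.

```python
# A small numerical illustration (not from the paper) of the Lasso in the
# p >> n regime with highly correlated predictors: exact sparsity-pattern
# recovery may fail, but the estimate can remain close in the l2 sense and
# the important variables still tend to be selected.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 100, 500, 5                         # samples, predictors, true nonzero coefficients

# Correlated design: every column shares a common factor, which makes the
# irrepresentable condition easy to violate.
common = rng.normal(size=(n, 1))
X = 0.7 * common + 0.3 * rng.normal(size=(n, p))

beta = np.zeros(p)
beta[:s] = [3.0, -2.5, 2.0, -1.5, 1.0]
y = X @ beta + 0.1 * rng.normal(size=n)

beta_hat = Lasso(alpha=0.05).fit(X, y).coef_
selected = np.flatnonzero(beta_hat)

print("l2 estimation error:", np.linalg.norm(beta_hat - beta))
print("variables selected:", selected.size)
print("all important variables selected:", set(range(s)) <= set(selected))
```

With this much correlation the estimator will typically pick up a few spurious variables alongside the true ones, which is exactly the sparsity-pattern failure the irrepresentable condition governs, while the $\ell_2$ error remains controlled.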


I am looking for input for the Sparsity in Everything series that I am running. It is located at:


I have several ideas but I would not mind additional examples.

Finally, the LinkedIn group on compressive sensing has now reached 120 people. There is not much traffic in the discussion section yet; however, I went through the listing, and here is the approximate breakdown of the professions represented there:
  • 39% are students
  • 5% are post-docs
  • 13% are professors
  • 9% are researchers (mostly in industry)
  • and, most importantly, 34% are professionals outside of academia.

Therefore, about 42% of the people in the group are outside the traditional academic tracks. This is not a bad thing for an area of investigation that started only four years ago. With regard to geographical location, about 49% are located in the Americas (mostly the U.S.), 32% in Europe and 19% in Asia. Again, I am not sure that interest in wavelets in the late 1980s spread this widely this fast after the first few papers on the subject.

