Thursday, April 23, 2009

CS: Clustered Sparse Signals, Matrix Completion code, FRI, Deterministic CS, Union of Frame-Generated Subspaces

Here is the new batch today. Most were found on the interwebs while others come from Rice.

Recovery of Clustered Sparse Signals from Compressive Measurements by Volkan Cevher, Piotr Indyk, Chinmay Hegde and Richard G. Baraniuk. The abstract reads:

We introduce a new signal model, called (K,C)-sparse, to capture K-sparse signals in N dimensions whose nonzero coefficients are contained within at most C clusters, with $C < K \ll N$. In contrast to the existing work in the sparse approximation and compressive sensing literature on block sparsity, no prior knowledge of the locations and sizes of the clusters is assumed. We prove that $O(K + C\log(N/C))$ random projections are sufficient for (K,C)-model sparse signal recovery based on subspace enumeration. We also provide a robust polynomial time recovery algorithm for (K,C)-model sparse signals with provable estimation guarantees.
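
For intuition, here is a minimal Python sketch of what a (K,C)-model sparse signal looks like. The generator (cluster sizes and positions) is illustrative only and is not taken from the paper:

```python
import numpy as np

def kc_sparse_signal(N, K, C, seed=0):
    """Illustrative (K,C)-sparse signal: K nonzeros grouped in exactly C clusters."""
    rng = np.random.default_rng(seed)
    sizes = rng.multinomial(K - C, np.ones(C) / C) + 1   # each cluster gets >= 1 nonzero
    x = np.zeros(N)
    starts = np.sort(rng.choice(N - K, size=C, replace=False))
    offset = 0                        # shift clusters right so they never touch
    for s, sz in zip(starts, sizes):
        x[s + offset : s + offset + sz] = rng.standard_normal(sz)
        offset += sz
    return x

x = kc_sparse_signal(N=256, K=16, C=4)
print(np.count_nonzero(x))            # 16 nonzeros, in 4 separated clusters
```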

I mentioned this paper earlier: Matrix Completion from a Few Entries by Raghunandan Keshavan, Andrea Montanari, and Sewoong Oh. The software is now available here.
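
As a rough illustration of the approach, here is a hedged Python sketch of the spectral step that the Keshavan-Montanari-Oh algorithm builds on: rescale the zero-filled observed matrix and keep its top-r SVD factors. The actual OptSpace code also trims over-represented rows/columns and refines the factors by gradient descent, neither of which is shown here:

```python
import numpy as np

def spectral_completion(M_obs, mask, r):
    """One-shot spectral estimate: rescale the zero-filled observations by the
    sampling rate and keep the rank-r SVD truncation. (A sketch only.)"""
    p = mask.mean()                       # fraction of revealed entries
    U, s, Vt = np.linalg.svd(M_obs / p, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]    # rank-r reconstruction

# Tiny demo on a random rank-2 matrix with roughly half the entries revealed.
rng = np.random.default_rng(1)
M = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 50))
mask = rng.random(M.shape) < 0.5
M_hat = spectral_completion(M * mask, mask, r=2)
print(np.linalg.norm(M_hat - M) / np.linalg.norm(M))   # relative error
```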



A Method for Generalized Sampling and Reconstruction of Finite-Rate-of-Innovation Signals by Chandra Sekhar Seelamantula, Michael Unser. The abstract reads:
We address the problem of generalized sampling and reconstruction of finite-rate-of-innovation signals. Specifically, we consider the problem of sampling streams of Dirac impulses and propose a two-channel method that enables fast, local reconstruction under some suitable conditions. We also specify some acquisition kernels and give the associated reconstruction formulas. It turns out that these kernels can also be combined into one kernel, which can be employed in the single-channel sampling scenario. The single-kernel approach carries over all the advantages of the two-channel counterpart. Simulation results are presented to validate the theoretical calculations.
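
The paper's specific two-channel kernels aside, the standard engine for recovering a stream of Diracs at its rate of innovation is the annihilating-filter (Prony) method. Here is a minimal, noiseless Python sketch of it; the moments and the seed are made up for illustration:

```python
import numpy as np

# Recover K Dirac locations from 2K noiseless "moments" s_m = sum_k a_k u_k^m
# via the annihilating filter: a length-(K+1) filter h whose roots are the u_k
# satisfies sum_k h_k s_{m-k} = 0. (Illustration only, not the paper's kernels.)
K = 3
rng = np.random.default_rng(2)
t = np.sort(rng.random(K))                   # Dirac locations in [0, 1)
a = 1 + rng.random(K)                        # Dirac amplitudes
u = np.exp(-2j * np.pi * t)                  # locations mapped to the unit circle
s = np.array([np.sum(a * u**m) for m in range(2 * K)])

# Toeplitz system: s_m + sum_{k=1..K} h_k s_{m-k} = 0 for m = K..2K-1, h_0 = 1.
S = np.array([[s[m - k] for k in range(1, K + 1)] for m in range(K, 2 * K)])
h = np.concatenate(([1.0], np.linalg.solve(S, -s[K:2 * K])))
t_hat = np.sort(np.angle(np.roots(h)) / (-2 * np.pi) % 1)
print(np.allclose(t, t_hat))                 # True: exact recovery from 2K samples
```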

Structured Variable Selection with Sparsity-Inducing Norms by Rodolphe Jenatton, Jean-Yves Audibert, Francis Bach. The abstract reads:

We consider the empirical risk minimization problem for linear supervised learning, with regularization by structured sparsity-inducing norms. These are defined as sums of Euclidean norms on certain subsets of variables, extending the usual $\ell_1$-norm and the group $\ell_1$-norm by allowing the subsets to overlap. This leads to a specific set of allowed nonzero patterns for the solutions of such problems. We first explore the relationship between the groups defining the norm and the resulting nonzero patterns, providing both forward and backward algorithms to go back and forth from groups to patterns. This allows the design of norms adapted to specific prior knowledge expressed in terms of nonzero patterns. We also present an efficient active set algorithm, and analyze the consistency of variable selection for least-squares linear regression in low and high-dimensional settings.

Found on the Rice Compressive Sensing repository:

Compressed Sensing aims to capture attributes of a sparse signal using very few measurements. Candes and Tao introduced the Restricted Isometry Property (RIP), showed that this condition is sufficient for sparse reconstruction, and showed that random matrices, where the entries are generated by an iid Gaussian or Bernoulli process, satisfy the RIP with high probability. This approach treats all k-sparse signals as equally likely, in contrast to mainstream signal processing where the filtering is deterministic, and the signal is described probabilistically. This paper provides weak conditions that are sufficient to show that a deterministic sensing matrix satisfies the Statistical Restricted Isometry Property (StRIP), and to guarantee the uniqueness of k-sparse representations. The columns of the sensing matrix are required to form a group under pointwise multiplication, and it is this property that provides the concentration inequalities of McDiarmid and Bernstein with the leverage necessary to guarantee uniqueness of sparse representations. We provide a large class of deterministic sensing matrices for which Basis Pursuit is able to recover sparse Steinhaus signals. The new framework encompasses many families of deterministic sensing matrices, including those formed from discrete chirps, Delsarte-Goethals codes, and Extended BCH codes. In these cases it provides theoretical guarantees on the performance of nonlinear reconstruction algorithms with complexity that is only quadratic in the dimension of the measurement domain.
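
As a toy illustration of the statistical (rather than uniform) isometry idea, the Python sketch below builds a discrete-chirp sensing matrix and checks empirically that it nearly preserves the norm of random sparse signals. This is only a numerical sanity check with made-up parameters (Gaussian rather than Steinhaus coefficients), not the paper's construction or proof:

```python
import numpy as np

# Columns phi_{r,m}(t) = exp(2*pi*i*(r*t^2 + m*t)/p) / sqrt(p), p prime,
# give a deterministic p x p^2 sensing matrix with unit-norm columns.
p = 31
t = np.arange(p)
Phi = np.array([np.exp(2j * np.pi * (r * t**2 + m * t) / p) / np.sqrt(p)
                for r in range(p) for m in range(p)]).T

rng = np.random.default_rng(3)
ratios = []
for _ in range(200):
    x = np.zeros(p * p)
    supp = rng.choice(p * p, size=5, replace=False)
    x[supp] = rng.standard_normal(5)      # random 5-sparse signal
    ratios.append(np.linalg.norm(Phi @ x)**2 / np.linalg.norm(x)**2)
print(np.mean(ratios), np.std(ratios))    # mean concentrates around 1
```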

Sampling in a union of frame generated subspaces by Magali Anastasio and Carlos Cabrelli. The abstract reads:
A new paradigm in sampling theory has been developed recently by Lu and Do. In this new approach the classical linear model is replaced by a non-linear, but structured, model consisting of a union of subspaces. This is the natural approach for the new theory of compressed sampling and the representation of sparse signals and signals with finite rate of innovation. In this article we extend the theory of Lu and Do to the case where the subspaces in the union are shift-invariant spaces. We describe the subspaces by means of frame generators instead of orthonormal bases. We show that the one-to-one and stability conditions for the sampling operator are valid for this more general case.
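
In finite dimensions the one-to-one/stability condition has a concrete flavor: the sampling operator must be bounded below on every pairwise sum of subspaces in the union. A hypothetical Python sketch, with the subspaces described by redundant frame generators in the spirit of the paper (all dimensions and matrices are made up):

```python
import numpy as np

def orth_basis(M, tol=1e-10):
    """Orthonormal basis of the column span of M, robust to redundant columns."""
    U, s, _ = np.linalg.svd(M, full_matrices=False)
    return U[:, s > tol]

rng = np.random.default_rng(4)
N, m = 20, 12
A = rng.standard_normal((m, N))                          # sampling operator
F1 = rng.standard_normal((N, 3))
F1 = np.hstack([F1, F1 @ rng.standard_normal((3, 2))])   # redundant frame of V1
F2 = rng.standard_normal((N, 4))                         # frame generators of V2

B = orth_basis(np.hstack([F1, F2]))                      # basis of V1 + V2
s = np.linalg.svd(A @ B, compute_uv=False)
# 0 < alpha <= beta means A is stably one-to-one on the union {V1, V2},
# since any difference x1 - x2 with x1 in V1, x2 in V2 lives in V1 + V2.
print(f"alpha = {s[-1]**2:.3f}, beta = {s[0]**2:.3f}")
```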


Image Credit: NASA/JPL/Space Science Institute, Saturn as seen two days ago by Cassini.

3 comments:

Sina said...

Hi Igor,

Just as a short comment, this new result shows that the probability of deviation from the StRIP property for the aforementioned matrices is actually "exponentially small". This is much more consistent with the empirical results of the CISS paper, and provides a better understanding of what is behind the deterministic coding matrices.

Anonymous said...

Hi Igor,

We made available a C version of the matrix completion algorithm at the
same URL:
http://www.stanford.edu/~raghuram/recon/index.html

It is pretty fast.

Thanks,
Andrea Montanari

Igor said...

Thanks Andrea,

I will feature this tomorrow. Thanks for the tip.

Cheers,

Igor.
