
Wednesday, March 31, 2010

CS: Random Matrix Theory, Randomized Kaczmarz Solver, Streaming compressive sampling, LASSO and Dantzig Selector


Muthu is looking for a postdoc. Today we have a mostly theoretical entry ending with a real application:

We discuss applications of some concepts of Compressed Sensing in the recent work on invertibility of random matrices due to Rudelson and the author. We sketch an argument leading to the optimal bound (N^(-1/2)) on the median of the smallest singular value of an N x N matrix with random independent entries. We highlight the parts of the argument where sparsity ideas played a key role.
The attendant presentation is here. Also from Roman Vershynin, here are the slides of a presentation entitled: Random matrix theory from the functional analytic perspective.
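As a quick sanity check on that N^(-1/2) scaling, here is a small numerical experiment of my own (not from the paper or the slides) that estimates the median of the smallest singular value of N x N Gaussian matrices and checks that sqrt(N) times that median stays roughly constant; the trial counts and matrix sizes are arbitrary choices:

import numpy as np

rng = np.random.default_rng(0)

def median_smin(N, trials=200):
    # median of the smallest singular value over random N x N Gaussian matrices
    smins = [np.linalg.svd(rng.standard_normal((N, N)), compute_uv=False)[-1]
             for _ in range(trials)]
    return np.median(smins)

for N in (50, 100, 200, 400):
    m = median_smin(N)
    print(f"N={N:4d}  median s_min={m:.4f}  sqrt(N)*median={np.sqrt(N)*m:.3f}")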

The connection between the ART algorithm and the Kaczmarz algorithm made me re-read: Randomized Kaczmarz Solver for Noisy Linear Systems by Deanna Needell. The abstract reads:
The Kaczmarz method is an iterative algorithm for solving systems of linear equations Ax = b. Theoretical convergence rates for this algorithm were largely unknown until recently when work was done on a randomized version of the algorithm. It was proved that for overdetermined systems, the randomized Kaczmarz method converges with expected exponential rate, independent of the number of equations in the system. Here we analyze the case where the system Ax = b is corrupted by noise, so we consider the system Ax ≈ b + r where r is an arbitrary error vector. We prove that in this noisy version, the randomized method reaches an error threshold dependent on the matrix A with the same rate as in the error-free case. We provide examples showing our results are sharp in the general context.
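For readers who have not seen it, here is a minimal sketch of the randomized Kaczmarz iteration the abstract refers to, with rows sampled proportionally to their squared norms; the problem sizes, noise level and iteration count below are illustrative choices of mine, not values from the paper:

import numpy as np

def randomized_kaczmarz(A, b, iters=5000, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms2 = np.sum(A**2, axis=1)
    probs = row_norms2 / row_norms2.sum()
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)                     # pick row i with probability ~ ||a_i||^2
        x += (b[i] - A[i] @ x) / row_norms2[i] * A[i]  # project onto the hyperplane a_i^T x = b_i
    return x

# overdetermined noisy system Ax ~ b + r: the iterates approach the solution
# up to a noise-dependent error threshold
rng = np.random.default_rng(1)
A = rng.standard_normal((300, 50))
x_true = rng.standard_normal(50)
b = A @ x_true + 0.01 * rng.standard_normal(300)
x_hat = randomized_kaczmarz(A, b)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))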
Also of interest: Noisy Signal Recovery via Iterative Reweighted L1-Minimization by Deanna Needell. The abstract reads:

Compressed sensing has shown that it is possible to reconstruct sparse high dimensional signals from few linear measurements. In many cases, the solution can be obtained by solving an ℓ1-minimization problem, and this method is accurate even in the presence of noise. Recently, a modified version of this method, reweighted ℓ1-minimization, has been suggested. Although no provable results have yet been attained, empirical studies have suggested the reweighted version outperforms the standard method. Here we analyze the reweighted ℓ1-minimization method in the noisy case, and provide provable results showing an improvement in the error bound over the standard bounds.
The attendant slides presenting this paper are here.
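For concreteness, here is a hedged sketch of the reweighted ℓ1 scheme in its usual form: solve a weighted ℓ1 problem, reweight each coordinate by 1/(|x_i| + eps), and repeat. It uses cvxpy purely for illustration, and eps, the noise bound and the number of outer iterations are arbitrary choices on my part rather than anything prescribed by the paper:

import numpy as np
import cvxpy as cp

def reweighted_l1(A, b, noise_level, outer_iters=4, eps=0.1):
    n = A.shape[1]
    w = np.ones(n)
    x_val = np.zeros(n)
    for _ in range(outer_iters):
        x = cp.Variable(n)
        objective = cp.Minimize(cp.norm1(cp.multiply(w, x)))   # weighted l1 norm
        constraints = [cp.norm2(A @ x - b) <= noise_level]     # noise-aware data-fit constraint
        cp.Problem(objective, constraints).solve()
        x_val = x.value
        w = 1.0 / (np.abs(x_val) + eps)                        # small entries get penalized more next round
    return x_val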

Next up is an abstract on streaming compressive sampling:
Compressive sampling (CS) has emerged as a significant signal processing framework to acquire and reconstruct sparse signals at rates significantly below the Nyquist rate. However, most of the CS development to date has focused on finite-length signals and representations. In this paper we discuss a streaming CS framework and greedy reconstruction algorithm, the Streaming Greedy Pursuit (SGP), to reconstruct signals with sparse frequency content. Our proposed sampling framework and the SGP are explicitly intended for streaming applications and signals of unknown length. The measurement framework we propose is designed to be causal and implementable using existing hardware architectures. Furthermore, our reconstruction algorithm provides specific computational guarantees, which makes it appropriate for real-time system implementations. Our experimental results on very long signals demonstrate the good performance of the SGP and validate our approach.
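To make the setting concrete, here is a toy block-wise illustration, emphatically not the SGP itself: each incoming block of a frequency-sparse stream is measured with a random matrix and recovered by plain OMP in a DFT dictionary. Block length, measurement count and sparsity are arbitrary assumptions on my part:

import numpy as np

def omp(Phi, y, k):
    # plain orthogonal matching pursuit: greedily pick k columns of Phi
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(Phi.conj().T @ residual))))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x = np.zeros(Phi.shape[1], dtype=complex)
    x[support] = coef
    return x

rng = np.random.default_rng(0)
N, M, k = 256, 64, 3                       # block length, measurements per block, sparsity
F = np.fft.ifft(np.eye(N)) * np.sqrt(N)    # inverse DFT as the sparsifying dictionary
for block in range(4):                     # pretend these blocks arrive as a stream
    coeffs = np.zeros(N, dtype=complex)
    coeffs[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
    signal = F @ coeffs                    # frequency-sparse block
    Phi = rng.standard_normal((M, N)) / np.sqrt(M)
    y = Phi @ signal                       # sub-Nyquist measurements of this block
    estimate = omp(Phi @ F, y, k)
    err = np.linalg.norm(estimate - coeffs) / np.linalg.norm(coeffs)
    print(f"block {block}: relative coefficient error {err:.2e}")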

M. Salman Asif also presented On the LASSO and Dantzig Selector Equivalence at CISS'10.
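As a reminder of the two programs involved, here is a small cvxpy comparison using their generic formulations (not the specific equivalence conditions of the talk); the matrix sizes and the parameter lam are arbitrary illustrative choices:

import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
m, n, k, lam = 80, 200, 5, 0.01
A = rng.standard_normal((m, n)) / np.sqrt(m)
x0 = np.zeros(n)
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
b = A @ x0 + 0.01 * rng.standard_normal(m)

# LASSO: min 0.5 * ||Ax - b||_2^2 + lam * ||x||_1
x_l = cp.Variable(n)
cp.Problem(cp.Minimize(0.5 * cp.sum_squares(A @ x_l - b) + lam * cp.norm1(x_l))).solve()

# Dantzig selector: min ||x||_1  s.t.  ||A^T (Ax - b)||_inf <= lam
x_d = cp.Variable(n)
cp.Problem(cp.Minimize(cp.norm1(x_d)),
           [cp.norm_inf(A.T @ (A @ x_d - b)) <= lam]).solve()

print("||x_lasso - x_dantzig||_2 =", np.linalg.norm(x_l.value - x_d.value))

For suitably matched parameters the two solutions can coincide on sparse problems, which is the kind of correspondence the title of the talk refers to.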


Finally, using random physical features to build an anti-counterfeiting device, mmuhh...

In this paper, we propose a secure anti-counterfeiting solution based on the extraction of a unique identifier out of the random profile of laser marks that are engraved on the surface or in the bulk of physical objects. Given today's technology, the 3D profile of a laser mark can be considered uncloneable. Indeed, reproducing a mark would be so expensive that it thwarts any attempt to build a twin of a given mark. Actually, we rely on the resolution difference that can be achieved between the engraving, which can be seen as pretty coarse, and the much more precise reading process. The solution we propose exploits these physical features in the design of a comprehensive anti-counterfeiting system based on existing tools. Experimental results demonstrate the feasibility of using the framework for real applications.
