Rick Chartrand, Nonconvex compressive sensing and reconstruction of gradient-sparse images: random vs. tomographic Fourier sampling. The abstract reads:
Previous compressive sensing papers have considered the example of recovering an image with sparse gradient from a surprisingly small number of samples of its Fourier transform. The samples were taken along radial lines, this being equivalent to a tomographic reconstruction problem. The theory of compressive sensing, however, considers random sampling instead. We perform numerical experiments to compare the two approaches, in terms of the number of samples necessary for exact recovery, algorithmic performance, and robustness to noise. We use a nonconvex approach, this having previously been shown to allow reconstruction with fewer measurements and greater robustness to noise, as confirmed by our results here.
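The two sampling patterns being compared, radial (tomographic) lines versus uniformly random locations in the Fourier plane, are easy to sketch. The helper names below are illustrative, not from the paper; only numpy is used:

```python
import numpy as np

def radial_mask(n, num_lines):
    """Boolean n x n mask sampling the 2-D Fourier plane along radial
    lines through the center, as in tomographic Fourier sampling."""
    mask = np.zeros((n, n), dtype=bool)
    center = n // 2
    r = np.arange(-center, n - center)
    for theta in np.linspace(0, np.pi, num_lines, endpoint=False):
        x = np.clip(np.round(center + r * np.cos(theta)).astype(int), 0, n - 1)
        y = np.clip(np.round(center + r * np.sin(theta)).astype(int), 0, n - 1)
        mask[x, y] = True
    return mask

def random_mask(n, num_samples, seed=0):
    """Boolean n x n mask sampling num_samples Fourier coefficients
    chosen uniformly at random, as compressive sensing theory assumes."""
    rng = np.random.default_rng(seed)
    mask = np.zeros(n * n, dtype=bool)
    idx = rng.choice(n * n, size=num_samples, replace=False)
    mask[idx] = True
    return mask.reshape(n, n)

n = 64
rad = radial_mask(n, 22)
rnd = random_mask(n, int(rad.sum()))  # match the radial sample budget
print(rad.sum(), rnd.sum())
```

Either mask, multiplied against the 2-D FFT of an image, gives the measurement set from which a (convex or nonconvex) gradient-sparsity reconstruction would proceed; the paper's experiments compare recovery from the two masks at equal sample counts.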
In a previous post, I mentioned that Rick Chartrand and Valentina Staneva had just submitted a paper on the subject of Restricted isometry properties and nonconvex compressive sensing. It is now available.
In another, more theoretical, area, Mark Rudelson and Roman Vershynin just released The Littlewood-Offord Problem and invertibility of random matrices. From the commentary section:
Here we prove two basic conjectures on the invertibility of random matrices. One of them has been a prediction of von Neumann and his associates, who speculated that the condition number of a random n by n matrix with i.i.d. entries should be linear in n (with high probability). The condition number is the ratio of the largest to the smallest singular values of the matrix, or equivalently the product of the norms of the matrix and its inverse. [...] Thus the norms of a random matrix and of its inverse are both of order n^{1/2} with high probability. This confirms von Neumann's prediction in full generality.
The abstract reads:
We prove two basic conjectures on the distribution of the smallest singular value of random n×n matrices with independent entries. Under minimal moment assumptions, we show that the smallest singular value is of order n^(−1/2), which is optimal for Gaussian matrices. Moreover, we give an optimal estimate on the tail probability. This comes as a consequence of a new and essentially sharp estimate in the Littlewood-Offord problem: for i.i.d. random variables X_k and real numbers a_k, determine the probability p that the sum Σ_k a_k X_k lies near some number v. For arbitrary coefficients a_k of the same order of magnitude, we show that they essentially lie in an arithmetic progression of length 1/p.
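The condition-number prediction is easy to check numerically: for a Gaussian n×n matrix, the largest singular value concentrates near 2·n^{1/2} and the smallest near n^{−1/2}, so their ratio grows linearly in n. A quick sanity check (my own illustration, not the authors' code):

```python
import numpy as np

rng = np.random.default_rng(1)
for n in [100, 200, 400, 800]:
    A = rng.standard_normal((n, n))          # i.i.d. Gaussian entries
    s = np.linalg.svd(A, compute_uv=False)   # singular values, descending
    cond = s[0] / s[-1]                      # condition number
    # cond / n should stay of order 1 as n grows, per von Neumann's prediction
    print(n, s[0] / np.sqrt(n), cond / n)
```

The first printed ratio hovers near 2 (the largest singular value is about 2·n^{1/2}), while cond/n fluctuates but does not grow with n.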
Finally, Sanjoy Dasgupta and Yoav Freund describe the intriguing Random projection trees and low dimensional manifolds. The abstract reads:
We present a simple variant of the k-d tree which automatically adapts to intrinsic low dimensional structure in data without having to explicitly learn this structure.
There is also a companion NIPS paper (Yoav Freund, Sanjoy Dasgupta, Mayank Kabra, Nakul Verma, Learning the structure of manifolds using random projections). I wonder when a Matlab code will be made available.
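The core idea is simple: instead of splitting on coordinate axes as a k-d tree does, split each cell by the median projection onto a random direction. A minimal sketch of that median-split variant (the paper's RP-tree also uses a randomized split offset and a spherical-cell rule, which this toy version omits):

```python
import numpy as np

def rptree_split(X, rng):
    """Split points by the median of their projections onto a random unit direction."""
    u = rng.standard_normal(X.shape[1])
    u /= np.linalg.norm(u)
    proj = X @ u
    thresh = np.median(proj)
    return X[proj <= thresh], X[proj > thresh]

def build_rptree(X, rng, min_size=10):
    """Recursively partition the data; leaves hold at most min_size-ish point sets."""
    if len(X) <= min_size:
        return {"leaf": X}
    left, right = rptree_split(X, rng)
    if len(left) == 0 or len(right) == 0:  # degenerate split (e.g. duplicate points)
        return {"leaf": X}
    return {"left": build_rptree(left, rng, min_size),
            "right": build_rptree(right, rng, min_size)}

def leaf_points(node):
    """Total number of points stored in the leaves under this node."""
    if "leaf" in node:
        return len(node["leaf"])
    return leaf_points(node["left"]) + leaf_points(node["right"])

rng = np.random.default_rng(0)
# Data with low intrinsic dimension: a noisy 1-D circle embedded in R^20
t = rng.uniform(0, 1, size=(500, 1))
X = np.hstack([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t),
               0.01 * rng.standard_normal((500, 18))])
tree = build_rptree(X, rng)
print(leaf_points(tree))  # every point lands in exactly one leaf
```

The paper's result is that cell diameters in such a tree shrink at a rate governed by the data's intrinsic dimension (here, 1), not the ambient dimension (here, 20).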
Credit Photo: SPI coded aperture detector, one of the main instruments on the ESA INTEGRAL satellite. From F. Sanchez, R. Chato, J.L. Gasent, J. Rodrigo, T. Velasco, A coded mask for gamma-ray astronomy: design and calibration, Nuclear Instruments and Methods in Physics Research A 500 (2003) 253–262.