Let's put things in perspective. Since the beginning of this week, we have noted that random projections can be used directly in sensing devices (Transmission of quantum entanglement through a random medium, Terahertz compressive imaging with metamaterial spatial light modulators) because the unknown is sparse or compressible.
We have also gotten to the point where some of the solvers are very fast (as opposed to l1 minimization), as they use short iterative schemes with very few matrix-vector multiplies (the (Sw)AMP solvers in A Second Inflection Point in Genome Sequencing? and then ...).
Finally, we noticed that random projections could be used to perform dimension reduction on more complex objects than vectors while still keeping the relevant information at hand (the RandNLA theme, as exemplified in Randomized Numerical Linear Algebra for Large Scale Data Analysis and libSkylark: Sketching based Matrix computations for Machine Learning). In effect, the general view is that information can be kept with random projections when the unknown is either sparse or ... not. Information can even be decoded quickly when the unknown is sparse. Can we do better? Can we relax some of these conditions?
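Before answering, here is a minimal numpy sketch of the dimension-reduction point above. It is not taken from any of the papers mentioned; it just illustrates the Johnson-Lindenstrauss-type fact that a Gaussian random projection approximately preserves pairwise distances (the dimensions below are arbitrary choices).

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 100, 10000, 300   # points, ambient dimension, sketch dimension
X = rng.standard_normal((n, d))

# Gaussian random projection, scaled so that squared norms are
# preserved in expectation.
A = rng.standard_normal((k, d)) / np.sqrt(k)
Y = X @ A.T                 # each row is now a k-dimensional sketch

# Distances between sketched points stay close to the originals,
# with relative error on the order of 1/sqrt(k).
orig = np.linalg.norm(X[0] - X[1])
proj = np.linalg.norm(Y[0] - Y[1])
print(f"original distance {orig:.2f}, sketched distance {proj:.2f}")
```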
Maybe
Today, we have an AMP solver (one of the fastest ways to go about reconstruction in the sparsity-seeking solver area) that aims at reconstructing signals that are not sparse but rather have unknown statistics (after they have been multiplexed with Gaussian measurement matrices). I can't wait to see an implementation of it, but the world got bigger (we featured a similar solver recently).
Compressed Sensing via Universal Denoising and Approximate Message Passing by Yanting Ma, Junan Zhu and Dror Baron:
We study compressed sensing (CS) signal reconstruction problems where an input signal is measured via matrix multiplication under additive white Gaussian noise. Our signals are assumed to be stationary and ergodic, but the input statistics are unknown; the goal is to provide reconstruction algorithms that are universal to the input statistics. We present a novel algorithm that combines: (i) the approximate message passing (AMP) CS reconstruction framework, which converts the matrix channel recovery problem into scalar channel denoising; (ii) a universal denoising scheme based on context quantization, which partitions the stationary ergodic signal denoising into independent and identically distributed (i.i.d.) subsequence denoising; and (iii) a density estimation approach that approximates the probability distribution of an i.i.d. sequence by fitting a Gaussian mixture (GM) model. In addition to the algorithmic framework, we provide three contributions: (i) numerical results showing that state evolution holds for non-separable Bayesian sliding-window denoisers; (ii) a universal denoiser that does not require the input signal to be bounded; and (iii) we modify the GM learning algorithm, and extend it to an i.i.d. denoiser. Our universal CS recovery algorithm compares favorably with existing reconstruction algorithms in terms of both reconstruction quality and runtime, despite not knowing the input statistics of the stationary ergodic signal.
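For readers curious about the structure that makes AMP so fast, here is a minimal numpy sketch. To be clear, this is not the authors' algorithm: their universal denoiser (context quantization plus Gaussian mixture fitting) is replaced by a plain soft-thresholding denoiser, and the test signal is an i.i.d. sparse one, just to show the skeleton: two matrix-vector multiplies per iteration, a scalar denoising step, and the Onsager correction term. The denoiser is the swappable part.

```python
import numpy as np

def soft_threshold(u, tau):
    # Placeholder scalar denoiser; the paper plugs a universal
    # GM-based denoiser into this slot instead.
    return np.sign(u) * np.maximum(np.abs(u) - tau, 0.0)

def amp(y, A, n_iter=30, alpha=1.5):
    # Minimal AMP loop for y = A x + noise.
    m, n = A.shape
    x = np.zeros(n)
    z = y.copy()
    for _ in range(n_iter):
        u = x + A.T @ z                       # effective scalar channel
        tau = alpha * np.sqrt(np.mean(z**2))  # noise-level estimate
        x = soft_threshold(u, tau)
        # Onsager correction: (z / (m/n)) times the mean denoiser
        # derivative, which for soft thresholding is the fraction
        # of surviving (nonzero) entries.
        z = y - A @ x + z * np.count_nonzero(x) / m
    return x

# Toy problem: k-sparse signal, Gaussian measurement matrix.
rng = np.random.default_rng(1)
n, m, k = 1000, 400, 40
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x0 + 0.01 * rng.standard_normal(m)
x_hat = amp(y, A)
print("relative error:", np.linalg.norm(x_hat - x0) / np.linalg.norm(x0))
```

Swapping `soft_threshold` for a denoiser estimated from the data itself is, roughly speaking, where the universality of the paper comes in.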