Here is another instance of The Great Convergence in action: today, MMV reconstruction with a deep architecture. From the graphs, it looks like the deep architecture does not gain much in terms of the phase transition, although it does do better than the greedy algorithm.
Exploiting Correlations Among Channels in Distributed Compressive Sensing With Convolutional Deep Stacking Networks by Hamid Palangi, Rabab Ward, Li Deng
This paper addresses the compressive sensing with Multiple Measurement Vectors (MMV) problem, where the correlation amongst the different sparse vectors (channels) is used to improve the reconstruction performance. We propose the use of Convolutional Deep Stacking Networks (CDSN), where the correlations amongst the channels are captured by a moving window containing the “residuals” of different sparse vectors. We develop a greedy algorithm that exploits the structure captured by the CDSN to reconstruct the sparse vectors. Using a natural image dataset, we compare the performance of the proposed algorithm with two types of reconstruction algorithms: Simultaneous Orthogonal Matching Pursuit (SOMP), which is a greedy solver, and model-based Bayesian approaches that also exploit the correlation among channels. We show experimentally that our proposed method outperforms these popular methods and is almost as fast as the greedy methods.
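For readers unfamiliar with the greedy baseline mentioned in the abstract, here is a minimal NumPy sketch of SOMP (not the paper's CDSN method): all channels share one support, so at each step the atom most correlated with the residuals of every channel jointly is added, followed by a joint least-squares fit. The problem sizes and support below are made up for illustration.

```python
import numpy as np

def somp(A, Y, k):
    """Simultaneous Orthogonal Matching Pursuit: recover a row-sparse X
    from Y = A @ X by greedily growing one support shared across all
    channels (the columns of Y)."""
    m, n = A.shape
    support = []
    R = Y.copy()                      # residuals, one column per channel
    for _ in range(k):
        # Score each atom by its total correlation with all channel residuals.
        scores = np.linalg.norm(A.T @ R, axis=1)
        scores[support] = -np.inf     # never pick the same atom twice
        support.append(int(np.argmax(scores)))
        # Joint least-squares fit on the current support, then refresh residuals.
        X_s, *_ = np.linalg.lstsq(A[:, support], Y, rcond=None)
        R = Y - A[:, support] @ X_s
    X = np.zeros((n, Y.shape[1]))
    X[support] = X_s
    return X, sorted(support)

# Small demo: 3 channels sharing a support of size 2 (indices chosen arbitrarily).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 50))
A /= np.linalg.norm(A, axis=0)        # unit-norm columns
X_true = np.zeros((50, 3))
X_true[7]  = [1.0, -2.0, 1.5]
X_true[31] = [2.0,  1.0, -1.0]
Y = A @ X_true                        # noiseless measurements
X_hat, supp = somp(A, Y, k=2)
```

The key difference from running plain OMP per channel is the joint scoring step: summing correlation energy across channels is what exploits the shared support, which is also the structure the paper's CDSN tries to capture more finely.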
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.