
Thursday, January 01, 2009

CS: Bayesian Compressive Sensing via Belief Propagation

Dror Baron just let me know of the release of a new paper by him, Shriram Sarvotham, and Rich Baraniuk entitled Bayesian Compressive Sensing via Belief Propagation, whose abstract reads:
    This project presents an O(N log^2(N)) decoder using two contributions. First, we use CS-LDPC encoding matrices, which consist mostly of zeros, where nonzero entries are {-1,+1}. Second, we incorporate a prior on the signal model (and possibly a measurement noise model) via belief propagation (BP), a well-known technique for performing approximate Bayesian inference. Our specific example mostly uses a two-state mixture Gaussian signal model, but other models can be incorporated in the code.
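
As a quick illustration of the two-state mixture Gaussian signal model mentioned in the abstract, here is a minimal sketch in Python/numpy (my own illustrative parameters, not the paper's Matlab code): each coefficient is "large" with small probability and "small" otherwise, which yields an approximately sparse signal.

    import numpy as np

    rng = np.random.default_rng(0)

    N = 1000            # signal length (illustrative)
    p_large = 0.05      # probability a coefficient is in the "large" state
    sigma_large = 10.0  # std of the large-coefficient Gaussian
    sigma_small = 0.1   # std of the small-coefficient Gaussian

    # Each coefficient is drawn from one of two zero-mean Gaussians,
    # selected by a Bernoulli state variable.
    state = rng.random(N) < p_large
    x = np.where(state,
                 rng.normal(0.0, sigma_large, N),
                 rng.normal(0.0, sigma_small, N))
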
Dror also kindly provided me with some context for the paper, along with the attendant Matlab code implementing the CS-BP algorithm and a detailed listing of how the functions should be used. Here is the context:
Linear programming CS decoding received much initial attention, yet requires at least quadratic (or even cubic) computation, which is excessive for decoding large signals such as million-pixel images. Decoding is computationally challenging because (i) we need to take products of the measurement matrix with the signal vector, and (ii) many algorithms iterate over the data numerous times.
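
To put that cost in perspective, here is a back-of-the-envelope sketch in Python (the sizes are my own illustrative picks, not figures from the paper): one dense matrix-vector product already costs on the order of M*N operations, and an iterative decoder pays that price on every sweep.

    # Rough operation count for dense CS decoding of a large signal.
    M, N = 200_000, 1_000_000      # illustrative: measurements of a million-pixel image
    flops_per_product = 2 * M * N  # one dense y = A @ x is ~2*M*N multiply-adds
    iterations = 100               # iterative decoders sweep the data many times
    print(f"~{flops_per_product * iterations:.1e} operations")  # ~4.0e+13
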

To reduce computation, we propose to use sparse CS encoding matrices that offer fast matrix-vector multiplication. In a previous paper, we used these matrices for decoding noiseless measurements of strictly sparse data. The current paper considers problems where the data is not strictly sparse and measurements may be noisy. To approach these problems, we use a Bayesian setting. We focus on a two-state mixture Gaussian signal model, but more sophisticated signal and noise models can be incorporated. Our decoder relies on belief propagation (BP), a technique that is commonly used for signal estimation in various signal processing applications and for channel decoding of turbo and low density parity check (LDPC) codes. The CS-BP decoder represents the sparse CS encoding matrix by a sparse bipartite graph. In each iteration, nodes that correspond to signal coefficients and measurements estimate their corresponding conditional probabilities and then convey this information to neighboring nodes. Because the encoding matrix is sparse, the bipartite graph is also sparse, and so each CS-BP iteration runs in O(N log(N)) time, where the number of iterations is logarithmic in N. Numerical results are encouraging, and the Matlab code is available online.
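To give a concrete feel for the encoding step Dror describes, here is a minimal sketch in Python with numpy/scipy (again with my own illustrative parameters, and not a substitute for the authors' Matlab code): a sparse {-1,+1} CS-LDPC-style matrix with a few nonzeros per column, applied to a two-state mixture Gaussian signal, so that each measurement product touches only the nonzero entries.

    import numpy as np
    from scipy import sparse

    rng = np.random.default_rng(1)
    N, M, L = 1000, 400, 8   # signal length, measurements, nonzeros per column (illustrative)

    # Two-state mixture Gaussian signal (same model as in the earlier sketch).
    state = rng.random(N) < 0.05
    x = np.where(state, rng.normal(0.0, 10.0, N), rng.normal(0.0, 0.1, N))

    # CS-LDPC-style encoding matrix: mostly zeros, nonzeros drawn from {-1, +1},
    # with L nonzeros per column, so the corresponding bipartite graph is sparse.
    rows = np.concatenate([rng.choice(M, L, replace=False) for _ in range(N)])
    cols = np.repeat(np.arange(N), L)
    vals = rng.choice([-1.0, 1.0], size=N * L)
    Phi = sparse.csr_matrix((vals, (rows, cols)), shape=(M, N))

    # Noisy measurements: the sparse product costs O(N*L) rather than O(M*N),
    # which is what keeps each CS-BP iteration cheap.
    y = Phi @ x + 0.1 * rng.normal(size=M)

The BP decoding itself, i.e. the message passing between coefficient and measurement nodes on that bipartite graph, is what the downloadable Matlab code implements.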

Thanks, Dror!

Credit Photo: NASA/JPL-Caltech/Cornell, the road not taken by Opportunity.
