Friday, December 16, 2016

Onsager-Corrected Deep Networks for Sparse Linear Inverse Problems - TensorFlow implementation -

 
 
Phil, who had probably read the previous post on Interpretable Recurrent Neural Networks Using Sequential Sparse Recovery (where I wondered about AMP), just sent me the following:
 
Hi Igor,

I wanted to mention that we have some recent work on interpretable feed-forward networks based on the vector approximate message passing (VAMP) algorithm:

https://arxiv.org/abs/1612.01183

Some slides can be found here:

http://www2.ece.ohio-state.edu/~schniter/pdf/itw16_lvamp_slides.pdf

and a TensorFlow implementation can be found here:

https://github.com/mborgerding/onsager_deep_learning

Thanks for maintaining such a great blog.

Cheers,
Phil
Awesome, thank you Phil! Here is the paper: Onsager-Corrected Deep Networks for Sparse Linear Inverse Problems by Mark Borgerding and Philip Schniter

Deep learning has gained great popularity due to its widespread success on many inference problems. We consider the application of deep learning to the sparse linear inverse problem encountered in compressive sensing, where one seeks to recover a sparse signal from a few noisy linear measurements. In this paper, we propose two novel neural-network architectures that decouple prediction errors across layers in the same way that the approximate message passing (AMP) algorithms decouple them across iterations: through Onsager correction. We show numerically that our "learned AMP" network significantly improves upon Gregor and LeCun's "learned ISTA" when both use soft-thresholding shrinkage. We then show that additional improvements result from jointly learning the shrinkage functions together with the linear transforms. Finally, we propose a network design inspired by an unfolding of the recently proposed "vector AMP" (VAMP) algorithm, and show that it outperforms all previously considered networks. Interestingly, the linear transforms and shrinkage functions prescribed by VAMP coincide with the values learned through backpropagation, yielding an intuitive explanation for the design of this deep network.
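To make the Onsager correction mentioned in the abstract concrete, here is a minimal NumPy sketch of the classical AMP iteration with soft-thresholding shrinkage that the "learned AMP" network unfolds. This is an illustrative sketch of vanilla AMP, not the paper's learned network: the threshold scaling `alpha` and the iteration count are assumptions, whereas in LAMP such parameters (and the linear transforms) would be learned by backpropagation.

```python
import numpy as np

def soft(r, tau):
    # Soft-thresholding shrinkage: eta(r; tau) = sign(r) * max(|r| - tau, 0)
    return np.sign(r) * np.maximum(np.abs(r) - tau, 0.0)

def amp(y, A, alpha=1.1, iters=50):
    """Vanilla AMP for y = A x + w with a soft-threshold denoiser.

    The term b * z is the Onsager correction, which decouples the
    prediction errors across iterations. alpha (threshold scaling)
    is a hand-tuned assumption here; LAMP would learn it per layer.
    """
    m, n = A.shape
    x = np.zeros(n)   # current signal estimate
    z = y.copy()      # corrected residual
    for _ in range(iters):
        # Onsager correction coefficient for the soft-threshold
        # denoiser: (1/m) * number of nonzeros in the estimate
        b = np.count_nonzero(x) / m
        z = y - A @ x + b * z
        # Estimate the effective noise level from the residual
        sigma = np.linalg.norm(z) / np.sqrt(m)
        # Landweber-style step followed by shrinkage
        x = soft(x + A.T @ z, alpha * sigma)
    return x
```

Dropping the `b * z` term turns this into plain ISTA, whose per-iteration errors are correlated across iterations; the abstract's point is that carrying the correction over into the network layers is what distinguishes LAMP from learned ISTA.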
Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there !
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
