Friday, September 04, 2015

Learning Deep $\ell_0$ Encoders

Back in 2010, Karol Gregor and Yann LeCun looked at Learning Fast Approximations of Sparse Coding: that is, designing a neural network that could produce sparse coding factorizations. In other words, take the iterative structure of a sparsity-seeking solver and cast it in the framework of a deep network architecture, the novelty being that backpropagation allows one to design the elements of the solver piece by piece. Today, we get an insight into how such a design can use $\ell_0$ minimization, rather than the usual $\ell_1$ relaxation, to sharpen these approximations in compressive sensing and sparse coding. Let us note the use of the HELU activation function. In the end, though, the acid test of computing the phase transitions of these solvers would go a long way toward enabling comparisons between techniques. For people with a Machine Learning background, phase transitions have been the only way the compressive sensing community has had to tell good solvers from less optimal ones.
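To make the unfolding idea concrete, here is a minimal sketch in plain NumPy, with a made-up dictionary and sizes, of how a truncated iterative hard thresholding (IHT) solver for the $\ell_0$ regularized problem $\min_z \|x - Dz\|_2^2 + \lambda \|z\|_0$ becomes a fixed-depth feed-forward pass: each iteration is a layer whose weights `W` and `S` would be learned by backpropagation, and the hard thresholding step plays the role that the paper's HELU activation approximates continuously. The function names and dimensions here are illustrative, not the authors' code.

```python
import numpy as np

def hard_threshold(u, theta):
    """Zero out entries with magnitude <= theta (the l0 'activation').
    The paper's HELU is a continuous, trainable stand-in for this step."""
    return u * (np.abs(u) > theta)

def unrolled_iht_encoder(x, W, S, theta, num_layers=2):
    """Run a fixed number of IHT iterations as feed-forward layers.

    One IHT iteration for min ||x - D z||^2 + lambda ||z||_0 reads
        z <- h_theta(W x + S z),
    with W = D^T / L and S = I - D^T D / L for step size 1/L.
    Once unrolled, W, S, and theta become trainable layer parameters.
    """
    b = W @ x                        # the input is encoded once
    z = hard_threshold(b, theta)     # first layer
    for _ in range(num_layers - 1):  # subsequent layers reuse b
        z = hard_threshold(b + S @ z, theta)
    return z

# Toy example with a random (hypothetical) dictionary.
rng = np.random.default_rng(0)
n, m = 64, 128                        # signal and code dimensions
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)        # unit-norm atoms
L = np.linalg.norm(D, 2) ** 2         # Lipschitz constant of the gradient
W = D.T / L
S = np.eye(m) - (D.T @ D) / L
x = D @ hard_threshold(rng.standard_normal(m), 2.0)  # sparse-ish signal
z = unrolled_iht_encoder(x, W, S, theta=0.1, num_layers=2)
print("nonzeros after two layers:", np.count_nonzero(z))
```

In the learned version, W, S, and the threshold are freed from these analytic values and trained end to end on data, which is where the improvement over a fixed, hand-set iteration comes from.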
  
As in the 2010 paper, one can see a major improvement from a two-iteration solver compared to a traditional one.

Learning Deep $\ell_0$ Encoders by Zhangyang Wang, Qing Ling, Thomas S. Huang

Despite its nonconvex, intractable nature, $\ell_0$ sparse approximation is desirable in many theoretical and application cases. We study the $\ell_0$ sparse approximation problem with the tool of deep learning, by proposing Deep $\ell_0$ Encoders. Two typical forms, the $\ell_0$ regularized problem and the $M$-sparse problem, are investigated. Based on solid iterative algorithms, we model them as feed-forward neural networks, by introducing novel neurons and pooling functions. The deep encoders enjoy faster inference, larger learning capacity, and better scalability compared to conventional sparse coding solutions. Furthermore, when applying them to classification and clustering, the models can be conveniently optimized from end to end, using task-driven losses. Numerical results demonstrate the impressive performance of the proposed encoders.
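For the $M$-sparse form mentioned in the abstract, $\min_z \|x - Dz\|_2^2$ subject to $\|z\|_0 \le M$, the iterative step projects onto the $M$-sparse set by keeping only the $M$ largest-magnitude coefficients; this selection step is presumably what the pooling function in the unrolled network corresponds to. A minimal sketch of that projection, under that assumption:

```python
import numpy as np

def keep_top_m(z, M):
    """Keep the M largest-magnitude entries of z, zero out the rest.
    This is the projection onto {z : ||z||_0 <= M} used by iterative
    hard thresholding for the M-sparse problem; unrolled, it acts as
    a pooling-like selection layer."""
    out = np.zeros_like(z)
    idx = np.argsort(np.abs(z))[-M:]   # indices of the M largest magnitudes
    out[idx] = z[idx]
    return out

z = np.array([0.1, -2.0, 0.5, 3.0, -0.05])
print(keep_top_m(z, 2))   # -> [ 0. -2.  0.  3.  0.]
```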


