
Friday, June 22, 2018

Fast Convex Pruning of Deep Neural Networks - implementation -

Ali just sent me the following:

Hi Igor,

Just wanted to share our recent paper on pruning neural networks, which makes a strong connection with the compressed sensing literature:

- Paper: "Fast convex pruning of deep neural networks", https://arxiv.org/abs/1806.06457
- Code + implementation instructions: https://dnntoolbox.github.io/Net-Trim/

Thanks,
-Ali

Thanks Ali!



Fast Convex Pruning of Deep Neural Networks by Alireza Aghasi, Afshin Abdi, Justin Romberg
We develop a fast, tractable technique called Net-Trim for simplifying a trained neural network. The method is a convex post-processing module, which prunes (sparsifies) a trained network layer by layer, while preserving the internal responses. We present a comprehensive analysis of Net-Trim from both the algorithmic and sample complexity standpoints, centered on a fast, scalable convex optimization program. Our analysis includes consistency results between the initial and retrained models before and after Net-Trim application and guarantees on the number of training samples needed to discover a network that can be expressed using a certain number of nonzero terms. Specifically, if there is a set of weights that uses at most s terms that can re-create the layer outputs from the layer inputs, we can find these weights from O(s log(N/s)) samples, where N is the input size. These theoretical results are similar to those for sparse regression using the Lasso, and our analysis uses some of the same recently developed tools (namely recent results on the concentration of measure and convex analysis). Finally, we propose an algorithmic framework based on the alternating direction method of multipliers (ADMM), which allows a fast and simple implementation of Net-Trim for network pruning and compression.





Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!
Liked this entry? Subscribe to Nuit Blanche's feed; there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
