Saturday, December 17, 2016

Saturday Morning Video: Deep Compression, DSD Training and EIE: Deep Neural Network Model Compression, Regularization and Hardware Acceleration by Song Han

Here are some videos by Song Han on the topic of Mapping Deep Learning to Hardware:

  

Deep Compression, DSD Training and EIE: Deep Neural Network Model Compression, Regularization and Hardware Acceleration 

Neural networks are both computationally and memory intensive, making them difficult to deploy on mobile phones and embedded systems with limited hardware resources. To address this limitation, this talk first introduces "Deep Compression", which can compress deep neural networks by 10x-49x without loss of prediction accuracy [1][2][5]. The talk then describes DSD, the "Dense-Sparse-Dense" training method that regularizes CNNs/RNNs/LSTMs to improve the prediction accuracy of a wide range of neural networks at the same model size [3]. Finally, the talk discusses EIE, the "Efficient Inference Engine", which works directly on the deep-compressed DNN model and accelerates inference by exploiting weight sparsity, activation sparsity and weight sharing; it is 13x faster and 3,000x more energy efficient than a TitanX GPU [4].

References:
[1] Han et al., Learning both Weights and Connections for Efficient Neural Networks (NIPS'15)
[2] Han et al., Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding (ICLR'16, best paper award)
[3] Han et al., DSD: Regularizing Deep Neural Networks with Dense-Sparse-Dense Training (submitted to NIPS'16)
[4] Han et al., EIE: Efficient Inference Engine on Compressed Deep Neural Network (ISCA'16)
[5] Iandola, Han et al., SqueezeNet: AlexNet-level accuracy with 50x fewer parameters and less than 0.5MB model size (submitted to ECCV'16)
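To make the first two stages of the Deep Compression pipeline concrete, here is a minimal NumPy sketch of magnitude pruning followed by k-means weight sharing. The 90% pruning ratio, the 64x64 layer size, and the 16-entry (4-bit) codebook are illustrative assumptions for this sketch, not the paper's exact per-layer settings.

```python
import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=(64, 64)).astype(np.float32)  # a toy dense layer

# Stage 1: magnitude pruning -- zero out the smallest 90% of weights.
threshold = np.quantile(np.abs(weights), 0.9)
mask = np.abs(weights) >= threshold
pruned = weights * mask

# Stage 2: weight sharing -- cluster the surviving weights into 2^4 = 16
# shared values via a simple 1-D Lloyd (k-means) iteration.
surviving = pruned[mask]
centroids = np.linspace(surviving.min(), surviving.max(), 16)
for _ in range(10):
    assignments = np.argmin(np.abs(surviving[:, None] - centroids[None, :]), axis=1)
    for k in range(16):
        members = surviving[assignments == k]
        if members.size:
            centroids[k] = members.mean()

# Each surviving weight is now a 4-bit index into the 16-entry codebook;
# the third stage (Huffman coding of those indices) is omitted here.
quantized = pruned.copy()
quantized[mask] = centroids[assignments]

print(f"sparsity: {1.0 - mask.mean():.2f}, "
      f"distinct nonzero values: {np.unique(quantized[mask]).size}")
```

After this step the layer needs only the small codebook plus 4-bit indices for the ~10% of weights that survive, which is where the 10x-49x compression comes from.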
Here are two earlier presentations on the same topic:

and the attendant slides:
 



EIE: Efficient Inference Engine on Compressed Deep Neural Network
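The core computation EIE accelerates is a matrix-vector product in which both the weights and the activations are sparse, so zero operands are skipped entirely rather than multiplied. Below is a toy NumPy sketch of that idea using column-wise (CSC-style) weight storage; the sizes, sparsity levels, and storage layout are illustrative assumptions, not EIE's actual hardware format.

```python
import numpy as np

rng = np.random.default_rng(1)

# A sparse weight matrix (~90% zeros), stored column-wise so that each
# nonzero input activation touches only the nonzeros of its column.
W = rng.normal(size=(32, 64)).astype(np.float32)
W[np.abs(W) < np.quantile(np.abs(W), 0.9)] = 0.0

col_indices = [np.flatnonzero(W[:, j]) for j in range(W.shape[1])]
col_values = [W[idx, j] for j, idx in enumerate(col_indices)]

# A ReLU-style activation vector: roughly half the entries are zero.
a = np.maximum(rng.normal(size=64), 0.0).astype(np.float32)

# Sparse-weight, sparse-activation matvec: zero activations are skipped
# outright, and only nonzero weights in each visited column are touched.
y = np.zeros(W.shape[0], dtype=np.float32)
for j in np.flatnonzero(a):
    y[col_indices[j]] += col_values[j] * a[j]
```

The result matches the dense product `W @ a`, but the work done is proportional to the number of nonzero weight-activation pairs, which is the source of EIE's speed and energy advantage over dense GPU inference.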


Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
