Quoc Le's lectures on Deep Learning:
- Part 1: Nonlinear Classifiers and The Backpropagation Algorithm
- Part 2: Autoencoders, Convolutional Neural Networks and Recurrent Neural Networks
A Primer on Neural Network Models for Natural Language Processing by Yoav Goldberg
Over the past few years, neural networks have re-emerged as powerful machine-learning models, yielding state-of-the-art results in fields such as image recognition and speech processing. More recently, neural network models have also started to be applied to textual natural language signals, again with very promising results. This tutorial surveys neural network models from the perspective of natural language processing research, in an attempt to bring natural-language researchers up to speed with the neural techniques. The tutorial covers input encoding for natural language tasks, feed-forward networks, convolutional networks, recurrent networks and recursive networks, as well as the computation graph abstraction for automatic gradient computation.
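The computation-graph abstraction the tutorial ends on is the idea underlying automatic gradient computation: each operation records its inputs and a local gradient rule, and backpropagation replays the graph in reverse to apply the chain rule. Below is a minimal sketch of that idea in plain Python; it is my own illustration, not code from the tutorial, and the `Node`, `add`, `mul`, and `tanh` names are hypothetical helpers:

```python
# Minimal computation-graph sketch (illustrative, not from the tutorial).
# Each Node stores its value and a closure that pushes its gradient
# back to its inputs; backward() runs backpropagation over the graph.

import math

class Node:
    def __init__(self, value, parents=()):
        self.value = value
        self.grad = 0.0
        self.parents = parents
        self.backward_fn = None  # set by the operation that created the node

    def backward(self):
        # Topologically order the graph, then apply the chain rule in reverse.
        order, seen = [], set()
        def visit(node):
            if node not in seen:
                seen.add(node)
                for p in node.parents:
                    visit(p)
                order.append(node)
        visit(self)
        self.grad = 1.0
        for node in reversed(order):
            if node.backward_fn:
                node.backward_fn()

def add(a, b):
    out = Node(a.value + b.value, (a, b))
    def backward_fn():  # d(a+b)/da = d(a+b)/db = 1
        a.grad += out.grad
        b.grad += out.grad
    out.backward_fn = backward_fn
    return out

def mul(a, b):
    out = Node(a.value * b.value, (a, b))
    def backward_fn():  # d(a*b)/da = b, d(a*b)/db = a
        a.grad += b.value * out.grad
        b.grad += a.value * out.grad
    out.backward_fn = backward_fn
    return out

def tanh(a):
    out = Node(math.tanh(a.value), (a,))
    def backward_fn():  # d tanh(a)/da = 1 - tanh(a)^2
        a.grad += (1.0 - out.value ** 2) * out.grad
    out.backward_fn = backward_fn
    return out

# One-neuron example: y = tanh(w * x + b); backward() fills in dy/dw, dy/dx, dy/db.
x, w, b = Node(0.5), Node(-1.2), Node(0.3)
y = tanh(add(mul(w, x), b))
y.backward()
print(y.value, w.grad, x.grad, b.grad)
```

Libraries like Theano or Torch build exactly this kind of graph for you; the point of the sketch is just that gradient computation falls out of recording local derivative rules per operation.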
h/t Ben Hamner
Deep learning tutorial: autoencoders, convolutional neural nets, recurrent neural nets http://t.co/mCpEzLGA6s pic.twitter.com/uIyUccun5r
— Ben Hamner (@benhamner) October 18, 2015