Yes, this is an issue, and the blogs are helping us see through some of it:
Major recurring theme of deep learning twitter is how even those 100% dedicated to the field can't keep up with progress. https://t.co/9MzFQFuHyT— tarantula web (@beaucronin) March 22, 2017
Jort and his team have released AudioSet.
Bob
Muthu
Laurent
- Co-simulation, state-of-the-art by Claudio Gomes
- BARCHAN: Blob Alignment for Robust CHromatographic ANalysis (GCxGC)
- Signal and image classification with invariant descriptors (scattering transforms): Internship
Mitya
Felix
- Notes on “Deformable Convolutional Networks”
- Speed/accuracy trade-offs for modern convolutional object detectors
- Deep Nets Don’t Learn via Memorization
Adrian
- Convolutional neural networks, Part 3
- Convolutional neural networks, Part 2
- Convolutional neural networks, Part 1
- Recurrent Neural Network models
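The convolution operation running through Adrian's CNN series can be sketched in a few lines of NumPy. This is a toy, loop-based illustration under my own naming (`conv2d_valid` is a hypothetical helper, not from any of the linked posts); real conv layers use vectorized, batched implementations:

```python
import numpy as np

def conv2d_valid(image, kernel):
    """Slide the kernel over the image ('valid' mode, no padding) and
    return the map of elementwise dot products, as a CNN layer does
    (cross-correlation, i.e. no kernel flip)."""
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

# A horizontal-difference filter responds at the vertical edge of a toy image.
image = np.zeros((5, 5))
image[:, 2:] = 1.0                 # left half dark, right half bright
kernel = np.array([[1.0, -1.0]])   # 1x2 difference filter
response = conv2d_valid(image, kernel)
```

Each output row comes out as `[0., -1., 0., 0.]`: the single nonzero response marks where the dark-to-bright transition sits.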
Ferenc
Francois
- Running Jupyter notebooks on GPU on AWS: a starter guide
- Introducing Keras 2
- Building powerful image classification models using very little data
- Building Autoencoders in Keras
- Keras as a simplified interface to TensorFlow: tutorial
Terry
Here is an 'old' blog entry from Dustin on some of Yves' work in compressed sensing:
Dustin
Vladimir
Sebastien
Andrew
Sebastien
- New journal: Mathematical Statistics and Learning
- STOC 2017 accepted papers
- Geometry of linearized neural networks
Andrew
- Deep Learning without Backpropagation Tutorial: DeepMind's Synthetic Gradients
- Building Safe A.I., A Tutorial for Encrypted Deep Learning
- Tutorial: Deep Learning in PyTorch, An Unofficial Startup Guide.
- Grokking Deep Learning: Anyone Can Learn to Code and Understand Deep Learning
- How to Code and Understand DeepMind's Neural Stack Machine: Learning to Transduce with Unbounded Memory
Zac
This image was taken by Front Hazcam: Right B (FHAZ_RIGHT_B) onboard NASA's Mars rover Curiosity on Sol 1644 (2017-03-22 09:54:03 UTC). Image Credit: NASA/JPL-Caltech