Tuesday, April 07, 2015

A Probabilistic Theory of Deep Learning

Rich just sent me the following:

hi igor -

we have a new paper on a theory of deep learning that answers some questions about why they work. It might be interesting to the Nuit Blanche readership. also would appreciate your comments. thanks!

http://arxiv.org/abs/1504.00641

richb

Richard G. Baraniuk
Victor E. Cameron Professor of Electrical and Computer Engineering
Founder and Director, OpenStax
Rice University 

Thanks, Rich. I note from the paper:


In contrast, learning from natural videos should result in an accelerated learning process, as typically only a few nuisance variables change from frame to frame. This property should enable substantial acceleration in learning, as inference about which nuisance variables have changed will be faster and more accurate (54).
Without further ado: A Probabilistic Theory of Deep Learning by Ankit B. Patel, Tan Nguyen, Richard G. Baraniuk

A grand challenge in machine learning is the development of computational algorithms that match or outperform humans in perceptual inference tasks that are complicated by nuisance variation. For instance, visual object recognition involves the unknown object position, orientation, and scale, while speech recognition involves the unknown voice pronunciation, pitch, and speed. Recently, a new breed of deep learning algorithms has emerged for high-nuisance inference tasks that routinely yield pattern recognition systems with near- or super-human capabilities. But a fundamental question remains: Why do they work? Intuitions abound, but a coherent framework for understanding, analyzing, and synthesizing deep learning architectures has remained elusive. We answer this question by developing a new probabilistic framework for deep learning based on the Deep Rendering Model: a generative probabilistic model that explicitly captures latent nuisance variation. By relaxing the generative model to a discriminative one, we can recover two of the current leading deep learning systems, deep convolutional neural networks and random decision forests, providing insights into their successes and shortcomings, as well as a principled route to their improvement.
Addressing the success of both deep neural nets and random forests in the same generic framework is something we've been yearning for here on Nuit Blanche. I love the fact that they used rendered images to feature the "nuisance" parameters.
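
To make the rendering idea concrete, here is a small toy sketch of my own (it is not code from the paper, and the templates, sizes, and noise level are made up for illustration): a 1-D "image" is generated by placing a class template at an unknown position, the nuisance variable, plus Gaussian noise. Max-marginalizing the position out of the Gaussian likelihood then amounts to correlating the image with each class template and taking a max over positions, i.e. the convolution and max-pooling operations that the paper recovers from its rendering model.

import numpy as np

rng = np.random.default_rng(0)

def render(template, position, size, noise=0.1):
    # Generative model: a class template placed at an unknown position
    # (the nuisance variable), plus isotropic Gaussian noise.
    image = noise * rng.standard_normal(size)
    image[position:position + len(template)] += template
    return image

def max_marginal_score(image, template):
    # For isotropic Gaussian noise, up to terms independent of the class,
    # max_g log p(image | c, g) = max_g <image, template at g> - |template|^2/2.
    scores = np.correlate(image, template, mode="valid")  # "convolution"
    return scores.max() - 0.5 * template @ template       # "max-pooling"

# Two made-up class templates (purely illustrative).
templates = {
    "edge": np.array([-1.0, -1.0, 1.0, 1.0]),
    "bump": np.array([0.0, 1.0, 1.0, 0.0]),
}

# Render an "edge" at a position the classifier does not know about...
image = render(templates["edge"], position=13, size=32)

# ...and classify by the best max-marginal score.
scores = {c: max_marginal_score(image, t) for c, t in templates.items()}
print(max(scores, key=scores.get))  # prints "edge"

As I understand the paper, this single max-over-nuisance step is what gets stacked across layers and across nuisance types in the Deep Rendering Model, and that stacking is how the full deep convolutional network falls out of the inference.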
Join the CompressiveSensing subreddit or the Google+ Community and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
