Sunday, December 07, 2014

Sunday Morning Insight: An exact mapping between the Variational Renormalization Group and Deep Learning

Here is renormalization as defined on Wikipedia:

In more technical terms, let us assume that we have a theory described by a certain function Z of the state variables \{s_i\} and a certain set of coupling constants \{J_k\}. This function may be a partition function, an action, a Hamiltonian, etc. It must contain the whole description of the physics of the system.
Now we consider a certain blocking transformation of the state variables \{s_i\} \to \{\tilde s_i\}; the number of \tilde s_i must be lower than the number of s_i. Now let us try to rewrite the Z function only in terms of the \tilde s_i. If this is achievable by a certain change in the parameters, \{J_k\} \to \{\tilde J_k\}, then the theory is said to be renormalizable.
Renormalization is therefore a way of keeping certain functionals invariant even after a nonlinear transformation. Today's paper examines this technique with an eye toward Deep Learning architectures.
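As a concrete illustration of such a blocking transformation, consider the textbook decimation step for the 1D nearest-neighbor Ising chain, the same model the paper uses: summing out every other spin leaves a chain of the same form, with a renormalized coupling satisfying tanh(J') = tanh(J)^2. Below is a minimal numerical check of that identity (a sketch for illustration; the function name and test value are mine, not from the paper):

```python
import numpy as np

# Blocking step for the 1D nearest-neighbor Ising chain (decimation):
# summing out the middle spin s2 in exp(J*s1*s2 + J*s2*s3) leaves an
# effective nearest-neighbor coupling J' between s1 and s3 with
#   tanh(J') = tanh(J)**2   (up to a spin-independent constant C).
# This is the classic exact example of the coupling flow {J_k} -> {J~_k}.

def decimate(J):
    """Return (J_new, C) such that, for all s1, s3 in {-1, +1},
    sum_{s2=+-1} exp(J*(s1*s2 + s2*s3)) == C * exp(J_new*s1*s3)."""
    J_new = np.arctanh(np.tanh(J) ** 2)
    C = 2.0 * np.cosh(2.0 * J) * np.exp(-J_new)  # fixes the constant
    return J_new, C

J = 0.7  # illustrative coupling
J_new, C = decimate(J)
for s1 in (-1, 1):
    for s3 in (-1, 1):
        lhs = sum(np.exp(J * (s1 * s2 + s2 * s3)) for s2 in (-1, 1))
        rhs = C * np.exp(J_new * s1 * s3)
        assert np.isclose(lhs, rhs)
print(f"J = {J:.3f} renormalizes to J' = {J_new:.3f}")
```

The partition function is left exactly invariant here; only the coupling changes, which is the sense in which the 1D Ising model is renormalizable.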



An exact mapping between the Variational Renormalization Group and Deep Learning by Pankaj Mehta, David J. Schwab
Deep learning is a broad set of techniques that uses multiple layers of representation to automatically learn relevant features directly from structured data. Recently, such techniques have yielded record-breaking results on a diverse set of difficult machine learning tasks in computer vision, speech recognition, and natural language processing. Despite the enormous success of deep learning, relatively little is understood theoretically about why these techniques are so successful at feature learning and compression. Here, we show that deep learning is intimately related to one of the most important and successful techniques in theoretical physics, the renormalization group (RG). RG is an iterative coarse-graining scheme that allows for the extraction of relevant features (i.e. operators) as a physical system is examined at different length scales. We construct an exact mapping from the variational renormalization group, first introduced by Kadanoff, to deep learning architectures based on Restricted Boltzmann Machines (RBMs). We illustrate these ideas using the nearest-neighbor Ising model in one and two dimensions. Our results suggest that deep learning algorithms may be employing a generalized RG-like scheme to learn relevant features from data.
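To make the RBM side of the mapping concrete, here is a minimal numpy sketch of an RBM's joint energy and its marginal free energy over the hidden units; in the paper's picture, the visible units play the role of the fine-grained spins \{s_i\} and marginalizing out the hidden ("coarse-grained") units is the analogue of the blocking step. The sizes, seed, and random weights below are illustrative placeholders, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Minimal RBM, the building block of the deep architectures in the paper.
# Visible units v ~ fine-grained spins; hidden units h ~ coarse-grained spins.
n_visible, n_hidden = 8, 4            # illustrative sizes
W = 0.1 * rng.standard_normal((n_visible, n_hidden))
a = np.zeros(n_visible)               # visible biases
b = np.zeros(n_hidden)                # hidden biases

def energy(v, h):
    """Joint RBM energy E(v, h) = -a.v - b.h - v.W.h."""
    return -(a @ v + b @ h + v @ W @ h)

def free_energy(v):
    """Marginal free energy F(v) = -log sum_h exp(-E(v, h)),
    available in closed form because the hidden units are independent
    given v -- the analogue of summing out the coarse spins."""
    return -(a @ v + np.sum(np.log1p(np.exp(b + v @ W))))

v = rng.integers(0, 2, n_visible)     # a binary configuration
# brute-force check of the closed form against explicit summation over h
Z_h = sum(np.exp(-energy(v, np.array(h)))
          for h in np.ndindex(*(2,) * n_hidden))
assert np.isclose(-np.log(Z_h), free_energy(v))
print(f"F(v) = {free_energy(v):.4f}")
```

Stacking such layers, so that each layer's hidden units become the next layer's visible units, is what the paper identifies with iterating the variational RG transformation.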

