Friday, May 05, 2017

Spectral Ergodicity in Deep Learning Architectures via Surrogate Random Matrices - implementation -

Mehmet just sent me the following:
Hello Igor,

Hope you are all well.

The recent manuscript [1,3] might be of interest to Nuit Blanche readers. It develops an ergodicity measure for spectral analysis using KL divergence and applies it to circular ensembles. The software package is also available, along with a Python notebook.

Please feel free to use the material in your blog.

Many regards,


[3] Spectral Ergodicity in Deep Learning Architectures via Surrogate Random Matrices by Mehmet Süzen, Cornelius Weber, Joan J. Cerdà
Using random matrix ensembles mimicking weight matrices from deep and recurrent neural networks, we investigate how increasing connectivity leads to higher accuracy in learning, with a related measure on eigenvalue spectra. For this purpose, we quantify spectral ergodicity based on the Thirumalai-Mountain (TM) metric and Kullback-Leibler (KL) divergence. As a case study, circular random matrix ensembles of different sizes, i.e., the circular unitary ensemble (CUE), circular orthogonal ensemble (COE), and circular symplectic ensemble (CSE), are generated. Eigenvalue spectra are computed, along with the approach to spectral ergodicity with increasing connectivity size. As a result, it is argued that the success of deep learning architectures can conceptually be attributed to spectral ergodicity, as this property prominently decreases with increasing connectivity in surrogate weight matrices.
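To make the idea concrete, here is a minimal sketch of the kind of computation the abstract describes: sample Haar-random unitaries as CUE surrogates for weight matrices, histogram their eigenphase densities, and compare ensembles via a TM-style fluctuation metric and a symmetrised KL divergence. This is not the authors' released package; the function names and the exact form of the TM metric here are my own simplifications, using only NumPy and SciPy.

```python
import numpy as np
from scipy.stats import unitary_group, entropy

np.random.seed(0)  # seed the global state used by scipy's samplers

def cue_phase_densities(dim, n_samples, bins=64):
    """Eigenphase histograms for an ensemble of Haar-random unitaries
    (CUE surrogates for network weight matrices)."""
    edges = np.linspace(-np.pi, np.pi, bins + 1)
    densities = []
    for _ in range(n_samples):
        u = unitary_group.rvs(dim)                 # Haar-distributed unitary
        phases = np.angle(np.linalg.eigvals(u))    # eigenvalues lie on the unit circle
        h, _ = np.histogram(phases, bins=edges, density=True)
        densities.append(h)
    return np.array(densities)

def tm_metric(densities):
    """TM-style fluctuation metric (simplified): mean squared deviation of
    each matrix's spectral density from the ensemble-averaged density.
    It should shrink as the matrices grow, i.e. approach spectral ergodicity."""
    return np.mean((densities - densities.mean(axis=0)) ** 2)

def symmetrised_kl(d_a, d_b, eps=1e-12):
    """Symmetrised KL divergence between two ensemble-averaged densities."""
    p = d_a.mean(axis=0) + eps
    q = d_b.mean(axis=0) + eps
    return entropy(p, q) + entropy(q, p)

d16 = cue_phase_densities(dim=16, n_samples=50)
d64 = cue_phase_densities(dim=64, n_samples=50)
print("TM metric, N=16:", tm_metric(d16))
print("TM metric, N=64:", tm_metric(d64))
print("Symmetrised KL(16 vs 64):", symmetrised_kl(d16, d64))
```

Run with increasing `dim` and the fluctuation metric decreases, which is the qualitative behaviour the paper associates with spectral ergodicity in larger (more connected) surrogate weight matrices.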

