Friday, July 31, 2015

The Spectral Norm of Random Inner-Product Kernel Matrices

When dealing with covariance matrices, one always wonders whether thresholding the coefficients will yield a matrix with similar properties. Today, we have some answers in that area:

We study the spectra of p×p random matrices K with off-diagonal (i,j) entry equal to n^{-1/2} k(X_i^T X_j / n^{1/2}), where the X_i's are the rows of a p×n matrix with i.i.d. entries and k is a scalar function. It is known that under mild conditions, as n and p increase proportionally, the empirical spectral measure of K converges to a deterministic limit μ. We prove that if k is a polynomial and the distribution of the entries of X_i is symmetric and satisfies a general moment bound, then K is the sum of two components, the first with spectral norm converging to ||μ|| (the maximum absolute value of the support of μ) and the second a perturbation of rank at most two. In certain cases, including when k is an odd polynomial function, the perturbation is 0 and the spectral norm of K converges to ||μ||. If the entries of X_i are Gaussian, we also prove that ||K|| converges to ||μ|| for a large class of odd non-polynomial functions k. In general, the perturbation may contribute spike eigenvalues to K outside of its limiting support, and we conjecture that they have deterministic limiting locations as predicted by a deformed GUE model. Our study of such matrices is motivated by the analysis of statistical thresholding procedures to estimate sparse covariance matrices from multivariate data, and our results imply an asymptotic approximation to the spectral norm error of such procedures when the population covariance is the identity.
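To make the construction concrete, here is a minimal numerical sketch (not the authors' code) that builds the kernel matrix K with off-diagonal entries n^{-1/2} k(X_i^T X_j / n^{1/2}) and computes its spectral norm. The odd polynomial kernel k(x) = x^3 - 3x and the dimensions n, p are illustrative choices, not taken from the paper:

```python
import numpy as np

# Illustrative sketch: build the p x p inner-product kernel matrix K with
# off-diagonal (i,j) entry n^{-1/2} k(X_i^T X_j / n^{1/2}) and zero diagonal.
rng = np.random.default_rng(0)
n, p = 800, 400                        # n and p grow proportionally (assumed sizes)
X = rng.standard_normal((p, n))        # rows X_i with i.i.d. (here Gaussian) entries

G = X @ X.T / np.sqrt(n)               # inner products X_i^T X_j / n^{1/2}
k = lambda x: x**3 - 3 * x             # an odd polynomial kernel (assumed choice)
K = k(G) / np.sqrt(n)                  # scale by n^{-1/2}
np.fill_diagonal(K, 0.0)               # only off-diagonal entries are specified

# Spectral norm of the symmetric matrix K = largest absolute eigenvalue.
spec_norm = np.abs(np.linalg.eigvalsh(K)).max()
print(spec_norm)
```

For odd kernels like this one, the abstract predicts no rank-two perturbation, so the printed value should approach the edge ||μ|| of the limiting spectral measure as n and p grow.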
