
Thursday, March 05, 2015

Phase Transitions in Sparse PCA

Another phase transition found for Sparse PCA, thanks to AMP solvers, that will be added to the Advanced Matrix Factorization Jungle page.


Phase Transitions in Sparse PCA by Thibault Lesieur, Florent Krzakala, Lenka Zdeborova

We study optimal estimation for sparse principal component analysis when the number of non-zero elements is small but on the same order as the dimension of the data. We employ the approximate message passing (AMP) algorithm and its state evolution to analyze the information-theoretically minimal mean-squared error and the one achieved by AMP in the limit of large sizes. For the special case of rank one and a large enough density of non-zeros, Deshpande and Montanari [1] proved that AMP is asymptotically optimal. We show that both for low density and for large rank the problem undergoes a series of phase transitions, suggesting the existence of a region of parameters where estimation is information-theoretically possible, but AMP (and presumably every other polynomial algorithm) fails. The analysis of the large rank limit is particularly instructive.
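For readers who want to get a feel for the algorithm, here is a minimal sketch of rank-one AMP. Everything below is my own assumption, not the authors' code: I take a spiked Wigner model Y = x x^T / sqrt(N) plus Gaussian noise of variance Delta, a Bernoulli-Gaussian prior (1-rho) delta_0 + rho N(0,1) for the sparse components, and a simplified (self-averaged) Onsager correction for the Gaussian output channel.

import numpy as np

def denoise(A, B, rho):
    # Posterior mean and variance of a scalar x with Bernoulli-Gaussian prior
    # (1-rho)*delta_0 + rho*N(0,1), given the pseudo-likelihood exp(B*x - A*x**2/2).
    v = 1.0 / (1.0 + A)                          # variance of the non-zero (Gaussian) branch
    mean = B * v                                 # mean of the non-zero branch
    log_zg = 0.5 * np.log(v) + 0.5 * B**2 * v    # log-evidence of the non-zero branch
    w = 1.0 / (1.0 + ((1.0 - rho) / rho) * np.exp(-log_zg))  # posterior prob. of being non-zero
    xhat = w * mean                              # posterior mean
    second = w * (mean**2 + v)                   # posterior second moment
    return xhat, second - xhat**2                # mean and variance

def amp_rank_one(Y, Delta, rho, iters=50, seed=0):
    # Rank-one AMP sketch for Y ~ x x^T / sqrt(N) + Gaussian noise of variance Delta.
    N = Y.shape[0]
    rng = np.random.default_rng(seed)
    S = Y / Delta                                # score matrix of the Gaussian output channel
    xhat = 1e-3 * rng.standard_normal(N)         # uninformative random start
    var = np.full(N, rho)
    xhat_old = np.zeros(N)
    for _ in range(iters):
        A = np.dot(xhat, xhat) / (N * Delta)     # scalar precision seen by the denoiser
        B = S @ xhat / np.sqrt(N) - (var.sum() / (N * Delta)) * xhat_old  # Onsager correction
        xhat_old = xhat
        xhat, var = denoise(A, B, rho)
    return xhat

The denoiser is just the scalar posterior mean under the assumed prior; swapping in another prior only changes that one function.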
 
From the conclusion:
 
... For signal distributions of zero mean we unveil an undetectability regime where no algorithm can do better at estimating the signal than random guessing. We observe a first-order (discontinuous) phase transition in regions of small density or large rank r. The existence of such a first-order phase transition is related to the existence of a region of parameters where AMP is asymptotically sub-optimal. In the large rank limit the emerging picture is particularly simple (at least for signal distributions of zero mean). The asymptotic MMSE is equal to the MMSE of the problem with known support, and the probability of false negatives in the support recovery is exponentially small as r → ∞. The MMSE is not reachable by AMP unless the noise variance is smaller than the variance of the signal squared. We expect that this hard region will stay computationally hard also for other polynomial algorithms. Proving such a result about an algorithmic barrier for a generic class of algorithms is a very interesting challenge.
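As a rough illustration of how such transitions are located in practice, here is the corresponding scalar state evolution, again under the same assumed model and prior as in the sketch above (so none of this is the authors' code); it reuses the denoise function defined there and tracks the overlap between the signal and the AMP estimate.

def state_evolution_mse(Delta, rho, iters=200, samples=200_000, seed=1):
    # Scalar state evolution matching the AMP sketch above: iterate the overlap
    # m_t = E[x * xhat] through the effective channel B = A*x + sqrt(A)*Z, with A = m_t/Delta.
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(samples) * (rng.random(samples) < rho)  # Bernoulli-Gaussian samples
    z = rng.standard_normal(samples)
    m = 1e-3 * rho                               # tiny initial overlap (uninformed start, as for AMP)
    for _ in range(iters):
        A = m / Delta
        B = A * x + np.sqrt(A) * z               # effective Gaussian observation of x
        xhat, _ = denoise(A, B, rho)             # same denoiser as in the AMP sketch
        m = max(np.mean(x * xhat), 1e-12)        # new overlap; guard against Monte Carlo noise
    return rho - m                               # per-component MSE (E[x**2] = rho for this prior)

Sweeping Delta at fixed small rho and comparing this uninformed start with an informed one (m initialized at rho) is one way to see the discontinuous jump and the gap between the MSE AMP reaches and the information-theoretic branch discussed in the conclusion.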
 
 
 
