
Monday, November 09, 2015

Guarantees of Riemannian Optimization for Low Rank Matrix Recovery

At last, the use of phase transitions to figure out whether manifold-based techniques can do well:

Guarantees of Riemannian Optimization for Low Rank Matrix Recovery by Ke Wei, Jian-Feng Cai, Tony F. Chan, Shingyu Leung

We establish theoretical recovery guarantees for a family of Riemannian optimization algorithms for low rank matrix recovery, the problem of recovering an $m\times n$ rank $r$ matrix from $p < mn$ linear measurements. The algorithms are first interpreted as iterative hard thresholding algorithms with subspace projections. Based on this connection, we prove that if the restricted isometry constant $R_{3r}$ of the sensing operator is less than $C_\kappa/\sqrt{r}$, where $C_\kappa$ depends on the condition number of the matrix, then the Riemannian gradient descent method and a restarted variant of the Riemannian conjugate gradient method are guaranteed to converge to the measured rank $r$ matrix, provided they are initialized by one step of hard thresholding. Empirical evaluation shows that the algorithms recover a low rank matrix from nearly the minimum number of measurements necessary.
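
To make the IHT-with-subspace-projection interpretation concrete, here is a minimal sketch (not the authors' code) of Riemannian gradient descent for this problem: the gradient of the least squares objective is projected onto the tangent space of the rank $r$ manifold at the current iterate, a step is taken along that direction with an exact line search, and the result is retracted to the manifold by hard thresholding (truncated SVD). The Gaussian sensing operator, problem sizes, and step rule below are illustrative assumptions.

import numpy as np

# Illustrative sketch only: Riemannian gradient descent for low rank
# matrix recovery, viewed as IHT with a tangent-space (subspace) projection.
# The sensing operator, sizes, and step rule are assumptions for the demo.

def svd_truncate(X, r):
    # Rank-r projection H_r via truncated SVD; also return the factors.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :r], s[:r], Vt[:r, :]

def riemannian_gd(A, y, m, n, r, n_iters=300):
    # One-step hard thresholding initialization: X_0 = H_r(A^*(y)).
    U, s, Vt = svd_truncate((A.T @ y).reshape(m, n), r)
    X = (U * s) @ Vt
    for _ in range(n_iters):
        # Negative gradient of 0.5*||y - A(X)||^2, i.e. A^*(y - A(X)).
        G = (A.T @ (y - A @ X.ravel())).reshape(m, n)
        # Project onto the tangent space at X = U diag(s) V^T:
        # P_T(G) = U U^T G + G V V^T - U U^T G V V^T.
        PG = U @ (U.T @ G) + (G @ Vt.T) @ Vt - U @ (U.T @ G @ Vt.T) @ Vt
        # Exact line search along PG for the quadratic objective.
        alpha = np.sum(PG * G) / np.sum((A @ PG.ravel()) ** 2)
        # Retract back to the rank-r manifold by hard thresholding.
        U, s, Vt = svd_truncate(X + alpha * PG, r)
        X = (U * s) @ Vt
    return X

# Demo: p < mn Gaussian measurements of a random rank-r matrix.
rng = np.random.default_rng(0)
m, n, r = 20, 20, 2
p = 5 * r * (m + n - r)              # a few times the rank-r degrees of freedom
A = rng.standard_normal((p, m * n)) / np.sqrt(p)
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
y = A @ M.ravel()
X_hat = riemannian_gd(A, y, m, n, r)
print("relative error:", np.linalg.norm(X_hat - M) / np.linalg.norm(M))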
 
