Friday, May 27, 2016

Riemannian stochastic variance reduced gradient on Grassmann manifold - implementation -

Bamdev just sent me the following:
 
Dear Igor,

I wish to share our recent technical report on "Riemannian stochastic variance reduced gradient on Grassmann manifold", available at http://arxiv.org/abs/1605.07367. In this paper, we extend the Euclidean SVRG algorithm to compact Riemannian manifolds. The results are encouraging.

Additionally, the code is available at https://bamdevmishra.com/codes/rsvrg/. We also provide a template file, Riemannian_svrg.m, that is compatible with the Manopt toolbox [1].


Regards,
Bamdev
[1] http://manopt.org
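
For readers who want to try the code, here is a minimal sketch of how the Riemannian_svrg.m template mentioned above might plug into a Manopt problem structure for a PCA-style cost. The problem fields (problem.M, problem.cost, problem.egrad) and grassmannfactory follow Manopt's standard convention; the exact interface of Riemannian_svrg is an assumption here, and the data Z and dimensions are illustrative only.

% Hypothetical usage sketch; the Riemannian_svrg interface is assumed.
n = 50; p = 5; N = 1000;                 % ambient dim, subspace rank, samples
Z = randn(n, N);                         % synthetic data, one sample per column

problem.M = grassmannfactory(n, p);                % search space Gr(n, p)
problem.cost  = @(X) -norm(Z' * X, 'fro')^2 / N;   % PCA cost to minimize
problem.egrad = @(X) -2 * (Z * (Z' * X)) / N;      % Euclidean gradient

X = Riemannian_svrg(problem);            % assumed call, per Manopt solver style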

Thanks, Bamdev! Here is the paper: Riemannian stochastic variance reduced gradient on Grassmann manifold by Hiroyuki Kasai, Hiroyuki Sato, Bamdev Mishra

Stochastic variance reduction algorithms have recently become popular for minimizing the average of a large but finite number of loss functions. In this paper, we propose a novel Riemannian extension of the Euclidean stochastic variance reduced gradient algorithm (R-SVRG) to a compact manifold search space. To this end, we develop the algorithm on the Grassmann manifold. The key challenges of averaging, adding, and subtracting multiple gradients are addressed with the notions of the logarithm map and parallel translation of vectors on the Grassmann manifold. We present a global convergence analysis of the proposed algorithm with decaying step sizes and, under some natural assumptions, a local convergence rate analysis with a fixed step size. The proposed algorithm is applied to a number of problems on the Grassmann manifold, such as principal component analysis, low-rank matrix completion, and Karcher mean computation. In all these cases, the proposed algorithm outperforms the standard Riemannian stochastic gradient descent algorithm.
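
To make the variance-reduced update described in the abstract concrete, here is a small self-contained MATLAB sketch of the R-SVRG iteration for PCA on the Grassmann manifold. It uses the standard thin-SVD formula for the exponential map and, for simplicity, a projection-based vector transport in place of the exact parallel translation used in the paper; the step size, data, and all variable names are illustrative assumptions, not the authors' implementation.

% R-SVRG sketch for PCA on Gr(n, p): f(X) = -(1/N) * sum_i ||z_i' X||^2.
n = 50; p = 5; N = 1000;                 % ambient dim, rank, sample count
Z = randn(n, N);                         % data, one sample per column
eta = 1e-3; epochs = 10; T = N;          % fixed step size, loop sizes

proj   = @(X, U) U - X * (X' * U);                        % tangent projection at X
rgrad  = @(X, i) proj(X, -2 * Z(:, i) * (Z(:, i)' * X));  % Riemannian grad of f_i
transp = @(Y, U) U - Y * (Y' * U);                        % transport by projection

[Xt, ~] = qr(randn(n, p), 0);            % random orthonormal initialization
for s = 1:epochs
    Xa = Xt;                             % anchor point for this epoch
    Gfull = proj(Xa, -2 * (Z * (Z' * Xa)) / N);   % full Riemannian gradient at Xa
    for t = 1:T
        i = randi(N);                    % sample one loss term uniformly
        % Variance-reduced direction: correct the stochastic gradient at Xt
        % with the anchor-point gradients transported from T_Xa to T_Xt.
        xi = rgrad(Xt, i) - transp(Xt, rgrad(Xa, i) - Gfull);
        Xt = grass_exp(Xt, -eta * xi);   % exponential-map update on Gr(n, p)
    end
    fprintf('epoch %d, cost %.4f\n', s, -norm(Z' * Xt, 'fro')^2 / N);
end

function Y = grass_exp(X, U)
% Exponential map on the Grassmann manifold via the thin SVD of the tangent U.
[W, S, V] = svd(U, 'econ');
s = diag(S);
Y = X * (V * diag(cos(s)) * V') + W * diag(sin(s)) * V';
[Y, ~] = qr(Y, 0);                       % re-orthonormalize for numerical safety
end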

 
 
 
