Low-Rank Quadratic Semidefinite Programming by Ganzhao Yuan, Zhenjie Zhang, Bernard Ghanem, Zhifeng Hao
Low rank matrix approximation is an attractive model in large scale machine learning problems, because it can not only reduce the memory and runtime complexity, but also provide a natural way to regularize parameters while preserving learning accuracy. In this paper, we address a special class of nonconvex quadratic matrix optimization problems, which require a low rank positive semidefinite solution. Despite their non-convexity, we exploit the structure of these problems to derive an efficient solver that converges to their local optima. Furthermore, we show that the proposed solution is capable of dramatically enhancing the efficiency and scalability of a variety of concrete problems, which are of significant interest to the machine learning community. These problems include the Top-k Eigenvalue Problem, Distance Learning and Kernel Learning. Extensive experiments on UCI benchmarks have shown the effectiveness and efficiency of our proposed method.
The implementation is here.
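As a rough illustration of the low-rank idea behind the abstract (not the authors' actual solver): a rank-k positive semidefinite constraint can be enforced by the factorization X = V Vᵀ with V of size n×k, and for the Top-k Eigenvalue Problem the optimal such X is built from the k leading eigenvectors. The sketch below, using NumPy, checks this on a random symmetric matrix; all variable names and the setup are illustrative assumptions.

```python
import numpy as np

# Illustrative sketch (not the paper's algorithm): a rank-k PSD variable
# X >= 0, rank(X) <= k can be written as X = V V^T with V in R^{n x k}.
# For the Top-k Eigenvalue Problem, max tr(A X) over such X with
# orthonormal V attains the sum of the k largest eigenvalues of A.

rng = np.random.default_rng(0)
n, k = 50, 3
B = rng.standard_normal((n, n))
A = (B + B.T) / 2              # random symmetric test matrix

# Reference value: sum of the k largest eigenvalues of A
w = np.linalg.eigvalsh(A)      # eigenvalues in ascending order
top_k_sum = w[-k:].sum()

# Low-rank factorized solution: V = top-k eigenvectors of A
_, U = np.linalg.eigh(A)
V = U[:, -k:]                  # n x k matrix with orthonormal columns
X = V @ V.T                    # rank-k PSD matrix

# tr(A X) matches the sum of the k largest eigenvalues
objective = np.trace(A @ X)
```

In the paper's setting the factorized problem is nonconvex in V, which is why a dedicated solver converging to local optima is needed; the eigendecomposition above only serves to verify the optimum on this small example.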