Tuesday, December 10, 2013

BILGO: Bilateral Greedy Optimization for Large Scale Semidefinite Programming - implementation -

BILGO: Bilateral Greedy Optimization for Large Scale Semidefinite Programming by Zhifeng Hao, Ganzhao Yuan, Bernard Ghanem
Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large scale, scalability and computational efficiency are considered desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyse a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.

An implementation of BILGO is here
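To give a feel for the bilateral update described in the abstract, here is a minimal, self-contained sketch in Python/NumPy of a BILGO-style iteration. It is not the authors' implementation: the objective f and its gradient are user-supplied, the toy problem (projecting a symmetric matrix onto the PSD cone) is chosen purely for illustration, and the 2-D coefficient search uses scipy.optimize.minimize rather than the specific line search prescribed in the paper.

```python
# Illustrative sketch of a BILGO-style bilateral update (not the authors' code).
# Assumption: we minimize a smooth convex f(X) over the PSD cone; f and grad_f
# are user-supplied; the 2-D coefficient search is a generic bound-constrained
# solver, which the paper does not prescribe.
import numpy as np
from scipy.optimize import minimize

def bilgo_sketch(f, grad_f, X0, n_iters=100, tol=1e-6):
    X = X0.copy()
    for _ in range(n_iters):
        G = grad_f(X)
        # Leading eigenvector of the descent direction -G gives the rank-1 atom.
        w, V = np.linalg.eigh(-(G + G.T) / 2.0)
        v = V[:, np.argmax(w)]
        R = np.outer(v, v)                      # rank-1 matrix v v^T
        # Bilateral step: X <- alpha * X + beta * v v^T, with alpha, beta >= 0
        # chosen to minimize f over the 2-D cone spanned by X and v v^T.
        obj = lambda ab: f(ab[0] * X + ab[1] * R)
        res = minimize(obj, x0=np.array([1.0, 0.0]),
                       bounds=[(0.0, None), (0.0, None)])
        alpha, beta = res.x
        X_new = alpha * X + beta * R
        if abs(f(X) - f(X_new)) < tol:          # simple stopping rule
            return X_new
        X = X_new
    return X

# Toy usage: project a symmetric matrix M onto the PSD cone,
# i.e. minimize f(X) = 0.5 * ||X - M||_F^2 over X >= 0.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.standard_normal((20, 20))
    M = (A + A.T) / 2.0
    f = lambda X: 0.5 * np.linalg.norm(X - M, "fro") ** 2
    grad_f = lambda X: X - M
    X_hat = bilgo_sketch(f, grad_f, np.zeros_like(M))
    print("objective:", f(X_hat))
```

Because both coefficients are kept non-negative and the rank-1 atom is itself PSD, every iterate stays in the PSD cone without an explicit projection, which is the appeal of this kind of bilateral rank-1 scheme.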

Join the CompressiveSensing subreddit or the Google+ Community and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
