
Thursday, December 17, 2015

L1-Regularized Distributed Optimization: A Communication-Efficient Primal-Dual Framework

Martin let me know of this preprint and attendant implementation this morning:

L1-Regularized Distributed Optimization: A Communication-Efficient Primal-Dual Framework by Virginia Smith, Simone Forte, Michael I. Jordan, Martin Jaggi
Despite the importance of sparsity in many big data applications, there are few existing methods for efficient distributed optimization of sparsely-regularized objectives. In this paper, we present a communication-efficient framework for L1-regularized optimization in distributed environments. By taking a non-traditional view of classical objectives as part of a more general primal-dual setting, we obtain a new class of methods that can be efficiently distributed and is applicable to common L1-regularized regression and classification objectives, such as Lasso, sparse logistic regression, and elastic net regression. We provide convergence guarantees for this framework and demonstrate strong empirical performance as compared to other state-of-the-art methods on several real-world distributed datasets.
The implementation is here.
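As a point of reference (this is generic notation, not necessarily the paper's own), the three L1-regularized objectives named in the abstract are commonly written as follows, with design matrix $A$ (rows $a_i$), labels $b$ or $y_i$, and regularization parameter $\lambda$:

% Illustrative sketch of the objectives mentioned in the abstract;
% notation (A, b, y_i, x, \lambda, \eta) is generic and may differ from the paper's.
\begin{align*}
  \text{Lasso:} \quad
    & \min_{x \in \mathbb{R}^d} \; \tfrac{1}{2}\|Ax - b\|_2^2 + \lambda \|x\|_1 \\
  \text{Sparse logistic regression:} \quad
    & \min_{x \in \mathbb{R}^d} \; \sum_{i=1}^{n} \log\!\bigl(1 + e^{-y_i\, a_i^\top x}\bigr) + \lambda \|x\|_1 \\
  \text{Elastic net:} \quad
    & \min_{x \in \mathbb{R}^d} \; \tfrac{1}{2}\|Ax - b\|_2^2
      + \lambda \Bigl( \eta \|x\|_1 + \tfrac{1-\eta}{2}\|x\|_2^2 \Bigr)
\end{align*}

The framework's contribution, as stated in the abstract, is to handle this family of non-smooth objectives in a distributed setting with limited communication by viewing them through a more general primal-dual lens.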

Thanks, Martin!
