Volkan just sent me the following:
Hi Igor,

I wanted to follow up on our preprint: http://arxiv.org/abs/1206.1529. In the paper, we consider projecting sparsely onto the simplex -- a constraint that mars the application of the ell1-norm as a regularizer or as a constraint for sparsification. Interestingly, simplex constraints naturally happen in many applications, such as Markowitz portfolio optimization and density learning, where we need to find a K-sparse solution in such a way that the coefficients also sum up to a constant. The empirical results are quite encouraging...

best,
Volkan
Thanks Volkan for reminding us of this technique and its empirical results. The paper is: Sparse projections onto the simplex by Anastasios Kyrillidis, Stephen Becker and Volkan Cevher. The abstract reads:
The past decade has seen the rise of $\ell_1$-relaxation methods to promote sparsity for better interpretability and generalization of learning results. However, there are several important learning applications, such as Markowitz portfolio selection and sparse mixture density estimation, that feature simplex constraints, which disallow the application of the standard $\ell_1$-penalty. In this setting, we show how to efficiently obtain sparse projections onto the positive and general simplex with sparsity constraints. We provide an exact sparse projector for the positive simplex constraints, and derive a novel approach with online optimality and approximation guarantees for sparse projections onto the general simplex constraints. Even for small sized problems, this new approach is three orders of magnitude faster than the alternative, state-of-the-art branch-and-bound based CPLEX solver with no sacrifice in solution quality. We also empirically demonstrate that our projectors provide substantial benefits in portfolio selection and density estimation.
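To give a feel for what a sparse projection onto the positive simplex looks like, here is a minimal sketch, not the authors' implementation: it keeps the K largest entries of the input and then Euclidean-projects those entries onto the simplex using the standard sort-and-threshold projection. The function names `project_simplex` and `sparse_simplex_projection` are my own illustrative choices.

```python
import numpy as np

def project_simplex(v, z=1.0):
    """Euclidean projection of v onto {x : x >= 0, sum(x) = z},
    via the classical sort-and-threshold scheme."""
    u = np.sort(v)[::-1]                       # sort entries in decreasing order
    css = np.cumsum(u)
    j = np.arange(1, len(v) + 1)
    rho = np.nonzero(u * j > css - z)[0][-1]   # last index satisfying the support condition
    theta = (css[rho] - z) / (rho + 1.0)       # shift that makes the kept entries sum to z
    return np.maximum(v - theta, 0.0)

def sparse_simplex_projection(v, k, z=1.0):
    """Sketch of a K-sparse projection onto the positive simplex:
    select the k largest coordinates, then project them onto the simplex."""
    v = np.asarray(v, dtype=float)
    idx = np.argsort(v)[::-1][:k]              # indices of the k largest entries
    x = np.zeros_like(v)
    x[idx] = project_simplex(v[idx], z)        # project only the selected entries
    return x

# e.g. sparse_simplex_projection([0.1, 0.9, 0.3, 0.2], k=2)
# yields a vector with at most 2 nonzeros summing to 1
```

The point of the two-stage structure is that the sum-to-constant constraint couples all coordinates, so a plain $\ell_1$ penalty cannot induce sparsity here; the selection step supplies the sparsity that the simplex projection alone would not.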