Leslie just sent me the following:
Dear Igor,

Since sparse representations are a related topic you discuss on Nuit Blanche, I would like to mention a recent joint paper with Michael Elad on improving dictionary learning that has recently been published in IEEE Signal Processing Letters. The paper is titled "Improving Dictionary Learning: Multiple Dictionary Updates and Coefficient Reuse" and can be downloaded at this link: http://www.cs.technion.ac.il/~elad/publications/journals/2012/IEEE-SPL-DL-2012.pdf

Perhaps even more interesting is that we have made available the software for the methods described in the paper; information on this software can be found at http://www.cs.technion.ac.il/~elad/software/

Thanks for all your efforts with the blog and related study groups (LinkedIn, Google+, etc.).

Best regards,
Leslie
Here is the paper: Improving Dictionary Learning: Multiple Dictionary Updates and Coefficient Reuse by Leslie N. Smith and Michael Elad. The abstract reads:
In this paper we propose two improvements of the MOD and K-SVD dictionary learning algorithms, by modifying the two main parts of these algorithms - the dictionary update and the sparse coding stages. Our first contribution is a different dictionary-update stage that aims at finding both the dictionary and the representations while keeping the supports intact. The second contribution suggests to leverage the known representations from the previous sparse-coding in the quest for the updated representations. We demonstrate these two ideas in practice and show how they lead to faster training and better quality outcome.
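For readers less familiar with the setup the paper improves upon, here is a minimal NumPy sketch of the baseline MOD loop: alternate a greedy sparse-coding stage (OMP) with a closed-form least-squares dictionary update. This is not the authors' released code, and all sizes and names below are illustrative assumptions; the paper's contributions modify the two stages shown here (updating the dictionary and the coefficient values jointly with supports held fixed, and warm-starting sparse coding from the previous coefficients).

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions, not from the paper):
# n-dim signals, m dictionary atoms, k-sparse codes, N training signals.
n, m, k, N = 16, 32, 3, 200

# Synthesize training signals from a random ground-truth dictionary.
D_true = rng.standard_normal((n, m))
D_true /= np.linalg.norm(D_true, axis=0)
A_true = np.zeros((m, N))
for j in range(N):
    idx = rng.choice(m, k, replace=False)
    A_true[idx, j] = rng.standard_normal(k)
X = D_true @ A_true

def omp(D, x, k):
    """Plain orthogonal matching pursuit: greedily select k atoms,
    re-fitting the coefficients on the current support at each step."""
    r, support = x.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(D.T @ r))))
        coef, *_ = np.linalg.lstsq(D[:, support], x, rcond=None)
        r = x - D[:, support] @ coef
    a = np.zeros(D.shape[1])
    a[support] = coef
    return a

# Baseline MOD loop: sparse-code all signals, then update the whole
# dictionary in closed form via the pseudo-inverse.
D = rng.standard_normal((n, m))
D /= np.linalg.norm(D, axis=0)
for _ in range(10):
    A = np.column_stack([omp(D, X[:, j], k) for j in range(N)])
    D = X @ np.linalg.pinv(A)          # MOD update: D = X A^+
    D /= np.linalg.norm(D, axis=0) + 1e-12

# Final coding pass with the learned dictionary; relative fit error.
A = np.column_stack([omp(D, X[:, j], k) for j in range(N)])
err = np.linalg.norm(X - D @ A) / np.linalg.norm(X)
print(f"relative representation error: {err:.3f}")
```

Against this baseline, the paper's first idea replaces the single `D = X @ pinv(A)` step with several updates that refine both `D` and the nonzero values of `A` while freezing each column's support; the second idea passes the previous `A` into the sparse-coding stage as a starting point instead of coding from scratch each iteration.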
The package is here and will be added to the Matrix Factorization Page and the Big Picture in Compressive Sensing shortly.
Liked this entry? Subscribe to Nuit Blanche's feed; there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization, and calibration issues on LinkedIn.