Tuesday, August 07, 2012

Applying alternating direction method of multipliers for constrained dictionary learning -implementation-


This paper revisits the problem of dictionary learning, which we address through a numerical scheme that alternately optimizes the dictionary elements and the coefficient matrix. Our first contribution is a simple proof of convergence of this scheme for a large class of constraints and regularizers on the dictionary atoms. We then investigate the use of a well-known optimization method, the alternating direction method of multipliers, for solving each alternate step of the dictionary learning problem. We show that such an algorithm brings several benefits. Indeed, it can be more efficient than competing algorithms such as the Iterative Shrinkage Thresholding approach, and it allows one to easily handle mixed constraints or regularizers over the dictionary atoms or approximation coefficients. For instance, we have induced joint sparsity, positivity and smoothness of dictionary atoms by means of total variation and sparsity-inducing regularizers. Our experimental results show that using these mixed constraints helps in learning better dictionaries, especially from noisy signals.
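To make the abstract's alternating scheme concrete, here is a minimal sketch of its coefficient-update half: solving a sparsity-regularized least-squares (lasso) subproblem with ADMM for a fixed dictionary. The function names, the choice of penalty parameter `rho`, and the fixed iteration count are illustrative assumptions, not the authors' implementation; the paper additionally handles mixed constraints such as positivity and total variation.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of the l1 norm: elementwise shrinkage toward zero.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def admm_lasso(D, y, lam=0.1, rho=1.0, n_iter=500):
    """Sketch: solve min_x 0.5*||D x - y||^2 + lam*||x||_1 with ADMM.

    This corresponds to one coefficient-update step of the alternating
    dictionary-learning scheme, with the dictionary D held fixed.
    """
    n_atoms = D.shape[1]
    x = np.zeros(n_atoms)
    z = np.zeros(n_atoms)          # split copy of x carrying the l1 term
    u = np.zeros(n_atoms)          # scaled dual variable
    # Quantities reused by every x-update.
    DtD = D.T @ D + rho * np.eye(n_atoms)
    Dty = D.T @ y
    for _ in range(n_iter):
        x = np.linalg.solve(DtD, Dty + rho * (z - u))   # quadratic step
        z = soft_threshold(x + u, lam / rho)            # proximal step
        u = u + x - z                                   # dual update
    return z
```

Compared with an Iterative Shrinkage Thresholding loop, the quadratic step here is solved exactly at each iteration (at the cost of a linear solve), which is one reason ADMM can be competitive when the Gram matrix factorization is cached across iterations.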
Image Credit: NASA/JPL-Caltech. This image was taken by Front Hazcam: Right A (FHAZ_RIGHT_A) onboard NASA's Mars rover Curiosity on Sol 0 (2012-08-06 05:20:36 UTC).

Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
