Dear Igor,
We have just finalized a paper which considers the sample complexity of dictionary learning and other matrix factorizations. The topic might be of interest for the Nuit Blanche readers and we would be delighted if you could advertise our preprint, which is available here.

Thanks a lot and best wishes,
Martin

Prof. Dr. Martin Kleinsteuber
Geometric Optimization & Machine Learning Group
Cluster CoTeSys www.cotesys.de
TU München www.gol.ei.tum.de
Thanks Martin! Here is the paper: Sample Complexity of Dictionary Learning and other Matrix Factorizations by Rémi Gribonval, Rodolphe Jenatton, Francis Bach, Martin Kleinsteuber, Matthias Seibert
Many modern tools in machine learning and signal processing, such as sparse dictionary learning, principal component analysis (PCA), non-negative matrix factorization (NMF), K-means clustering, etc., rely on the factorization of a matrix obtained by concatenating high-dimensional vectors from a training collection. While the idealized task would be to optimize the expected quality of the factors over the underlying distribution of training vectors, it is achieved in practice by minimizing an empirical average over the considered collection. The focus of this paper is to provide sample complexity estimates to uniformly control how much the empirical average deviates from the expected cost function. Standard arguments imply that the performance of the empirical predictor also exhibits such guarantees. The level of genericity of the approach encompasses several possible constraints on the factors (tensor product structure, shift-invariance, sparsity, ...), thus providing a unified perspective on the sample complexity of several widely used matrix factorization schemes.
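To make the setup in the abstract concrete, here is a small illustrative sketch (mine, not from the paper) using NumPy: for one fixed K-means-style factorization, i.e. a fixed set of cluster centers D, it compares the empirical average cost over n training vectors with a large-sample proxy for the expected cost, and shows the deviation shrinking as n grows. The paper's result is stronger than this toy experiment, since it controls the deviation uniformly over all admissible factorizations rather than at a single fixed one.

```python
# Illustrative sketch only: empirical vs. (proxy for) expected cost of a fixed
# K-means "dictionary" D, as the number of training vectors n grows.
import numpy as np

rng = np.random.default_rng(0)
d, K = 10, 5                       # ambient dimension, number of centers
D = rng.standard_normal((K, d))    # an arbitrary fixed set of cluster centers

def empirical_cost(X, D):
    """Average squared distance from each sample to its nearest center."""
    dists = ((X[:, None, :] - D[None, :, :]) ** 2).sum(axis=2)  # shape (n, K)
    return dists.min(axis=1).mean()

# Proxy for the expected cost: empirical average over a very large sample.
X_big = rng.standard_normal((100_000, d))
expected = empirical_cost(X_big, D)

for n in [100, 1_000, 10_000, 100_000]:
    X = rng.standard_normal((n, d))
    deviation = abs(empirical_cost(X, D) - expected)
    print(f"n = {n:>7d}   |empirical - expected| ~ {deviation:.4f}")
```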