
Friday, May 18, 2012

Nonnegative Matrix Factorization -implementations-



Here is a new way of performing NMF (with the attendant implementation): Sparse and Unique Nonnegative Matrix Factorization Through Data Preprocessing by Nicolas Gillis. The abstract reads:
Nonnegative matrix factorization (NMF) has become a very popular technique in machine learning because it automatically extracts meaningful features through a sparse and part-based representation. However, NMF has the drawback of being highly ill-posed, that is, there typically exist many different but equivalent factorizations. In this paper, we introduce a completely new way of obtaining more well-posed NMF problems whose solutions are sparser. Our technique is based on the preprocessing of the nonnegative input data matrix, and relies on the theory of M-matrices and the geometric interpretation of NMF. This approach provably leads to optimal and sparse solutions under the separability assumption of Donoho and Stodden (NIPS, 2003), and, for rank-three matrices, makes the number of exact factorizations finite. We illustrate the effectiveness of our technique on several image datasets.
The attendant code is here and will be featured on the Advanced Matrix Factorization Jungle page.
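For readers who want to see what the underlying factorization problem looks like before diving into the paper's preprocessing scheme, here is a minimal sketch of plain NMF using the standard Lee-Seung multiplicative updates. This is not Gillis's method (his contribution is the preprocessing applied to the data matrix before factorizing); the function name and parameters below are purely illustrative.

```python
# Minimal NMF baseline via multiplicative updates (Lee & Seung).
# This is NOT the preprocessing approach of the paper, only the
# standard factorization that the paper makes better-posed.
import numpy as np

def nmf_multiplicative(M, r, n_iter=500, eps=1e-9, seed=0):
    """Approximate a nonnegative matrix M (m x n) as W @ H with
    W (m x r) >= 0 and H (r x n) >= 0, minimizing ||M - W H||_F."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        # Multiplicative updates keep W and H nonnegative at every step.
        H *= (W.T @ M) / (W.T @ W @ H + eps)
        W *= (M @ H.T) / (W @ H @ H.T + eps)
    return W, H

if __name__ == "__main__":
    # Small synthetic example: an exactly rank-3 nonnegative matrix.
    rng = np.random.default_rng(1)
    M = rng.random((20, 3)) @ rng.random((3, 15))
    W, H = nmf_multiplicative(M, r=3)
    print("relative error:", np.linalg.norm(M - W @ H) / np.linalg.norm(M))
```

Note that because NMF is ill-posed, different random seeds will generally return different (W, H) pairs of comparable quality, which is exactly the non-uniqueness issue the paper's preprocessing is designed to mitigate.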

