Nicolas Gillis has been one of the people trying to make mathematical sense of why an empirical matrix factorization has been so successful for the past 14 years. His latest arxiv preprint (see below) provides some perspective on where we stand in that respect. It is no secret that every paper using nonnegative matrix factorization has produced its own version of the NMF decomposition in some way or another. To call the ensemble a zoo would be, at the very least, charitable. I personally decided to call it a jungle. This sentiment of being lost in the jungle has had a curious side effect: the cannibalization of the semantics used for matrix factorization. Indeed, in many people's minds, Matrix Factorization and NMF have become one and the same. This is a situation not unlike the one we faced in compressive sensing with the myriad of sparsity-seeking solvers. In turn, this issue of being easily lost led me to write this blog entry pointing to a solution: If there is revolution in rank minimization and nobody hears about it, is this a revolution? or this one. The remedy to this state of affairs in compressive sensing has been to use the "acid test" of the universal phase transitions. One can expect that at some point we'll surely get there with NMF as well. Without further ado, here is the paper: The Why and How of Nonnegative Matrix Factorization by Nicolas Gillis (Update: version 2 is here, also Matlab __code__ and attendant slides)

Nonnegative matrix factorization (NMF) has become a widely used tool for the analysis of high-dimensional data as it automatically extracts sparse and meaningful features from a set of nonnegative data vectors. We first illustrate this property of NMF on three applications, in image processing, text mining and hyperspectral imaging --this is the why. Then we address the problem of solving NMF, which is NP-hard in general. We review some standard NMF algorithms, and also present a recent subclass of NMF problems, referred to as near-separable NMF, that can be solved efficiently (that is, in polynomial time), even in the presence of noise --this is the how. Finally, we briefly describe some problems in mathematics and computer science closely related to NMF via the nonnegative rank.
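For readers who want to see what "a standard NMF algorithm" of the kind the survey reviews looks like in practice, here is a minimal sketch (my own, not taken from the paper) of the classic Lee-Seung multiplicative updates: given a nonnegative matrix X, it alternately rescales two nonnegative factors W and H so that W @ H approximates X.

```python
# Minimal NMF via multiplicative updates (Lee-Seung), a hedged sketch:
# factor a nonnegative X (m x n) as W (m x r) @ H (r x n), W, H >= 0.
import numpy as np

def nmf_multiplicative(X, r, n_iter=200, eps=1e-9, seed=0):
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        # Elementwise updates: they preserve nonnegativity and do not
        # increase the Frobenius error ||X - W @ H||_F at each step.
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

X = np.random.default_rng(1).random((20, 30))  # synthetic nonnegative data
W, H = nmf_multiplicative(X, r=5)
rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
print(rel_err)  # relative error; well below 1 for this synthetic example
```

Note that, as the abstract says, NMF is NP-hard in general: multiplicative updates only find a local minimum, and the result depends on the random initialization.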

also relevant: Algorithms for nonnegative matrix and tensor factorizations: a unified view based on block coordinate descent framework by Jingu Kim, Yunlong He, Haesun Park

Abstract: We review algorithms developed for nonnegative matrix factorization (NMF) and nonnegative tensor factorization (NTF) from a unified view based on the block coordinate descent (BCD) framework. NMF and NTF are low-rank approximation methods for matrices and tensors in which the low-rank factors are constrained to have only nonnegative elements. The nonnegativity constraints have been shown to enable natural interpretations and allow better solutions in numerous applications including text analysis, computer vision, and bioinformatics. However, the computation of NMF and NTF remains challenging and expensive due to the constraints. Numerous algorithmic approaches have been proposed to efficiently compute NMF and NTF. The BCD framework in constrained non-linear optimization readily explains the theoretical convergence properties of several efficient NMF and NTF algorithms, which are consistent with experimental observations reported in the literature. In addition, we discuss algorithms that do not fit in the BCD framework, contrasting them with those based on the BCD framework. With insights acquired from the unified perspective, we also propose efficient algorithms for updating NMF when there is a small change in the reduced dimension or in the data. The effectiveness of the proposed updating algorithms is validated experimentally with synthetic and real-world data sets.
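To make the BCD framework concrete, here is a rough sketch (my own, under the standard HALS formulation; not code from the paper) of block coordinate descent for NMF: the variables are split into 2r blocks, the columns of W and the rows of H, and each block is updated in turn by its closed-form nonnegative least-squares solution while the others are held fixed.

```python
# Block coordinate descent (HALS-style) sketch for NMF: cycle over the
# columns of W and rows of H, solving each one-block subproblem exactly.
import numpy as np

def nmf_hals(X, r, n_iter=100, eps=1e-9, seed=0):
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, r))
    H = rng.random((r, n))
    for _ in range(n_iter):
        # Update each column of W with H (and the other columns) fixed.
        XHt, HHt = X @ H.T, H @ H.T
        for k in range(r):
            W[:, k] = np.maximum(
                eps, W[:, k] + (XHt[:, k] - W @ HHt[:, k]) / (HHt[k, k] + eps)
            )
        # Symmetric closed-form updates for each row of H.
        WtX, WtW = W.T @ X, W.T @ W
        for k in range(r):
            H[k, :] = np.maximum(
                eps, H[k, :] + (WtX[k, :] - WtW[k, :] @ H) / (WtW[k, k] + eps)
            )
    return W, H

X = np.random.default_rng(2).random((15, 25))
W, H = nmf_hals(X, r=4)
rel_err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

Because every one-block subproblem has a unique closed-form solution, the convergence theory the abstract alludes to applies to this scheme; the clamping to a small eps is a common practical safeguard against zero columns.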


