I am slow, but I eventually took the hint from Francis Bach that many of the advanced methods used nowadays essentially boil down to some matrix factorization or decomposition. Everybody who has done some linear algebra has heard of the QR factorization and the famous LAPACK subroutines that power most science and engineering workflows. In recent years, however, we have seen a slew of factorizations that add constraints on the factors of these well-known decompositions.
I was pleasantly surprised to see recently that the Advanced Matrix Factorization Jungle had about 50 G+ "likes", which puts it between the Big Picture in Compressive Sensing (8 G+) and Nuit Blanche (118 G+). This got me thinking about how to extend the listing a little.
First, I have added compressive sensing as a subset of MMV.
Also, after watching Rene Vidal's presentation at the last First French-German Mathematical Image Analysis Conference, it struck me that I had not added his and others' subspace clustering work. I had also not included the recent Random Kitchen Sinks approach, so I added the following sections:
- Multiple Measurement Vector (MMV), Y = A X with X unknown and the rows of X sparse.
- Compressive Sensing, Y = A X with X unknown and sparse; this is the MMV setting with X reduced to a single column.
- Kernel Factorizations
- Subspace Clustering
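To make the MMV model above concrete, here is a minimal sketch (not from the original listing; the problem sizes, the sparsity level, and the regularization strength are illustrative assumptions) that generates a row-sparse X, forms Y = A X, and recovers X with a group-sparsity penalty on the rows, using scikit-learn's MultiTaskLasso:

```python
# Sketch of the MMV model Y = A X with row-sparse X, recovered via a
# group-sparsity (l1/l2-on-rows) penalty. All sizes are illustrative.
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
m, n, L = 40, 100, 5               # measurements, ambient dim, number of vectors
A = rng.standard_normal((m, n)) / np.sqrt(m)

X = np.zeros((n, L))
support = rng.choice(n, size=8, replace=False)
X[support] = rng.standard_normal((8, L))   # only 8 nonzero rows
Y = A @ X

# MultiTaskLasso penalizes the l1/l2 norm of the rows of the coefficient
# matrix, which promotes a shared (row-sparse) support across all columns.
Xhat = MultiTaskLasso(alpha=0.01, max_iter=5000).fit(A, Y).coef_.T

# Indices of the recovered nonzero rows.
recovered = np.where(np.linalg.norm(Xhat, axis=1) > 1e-3)[0]
print(sorted(recovered))
```

The same code with L = 1 is the compressive sensing case of the list above: a single column, so row-sparsity reduces to plain sparsity.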
Indeed, for the Kernel Factorizations we are talking about no less than the Random Kitchen Sinks, i.e. an implementation of the separation of variables in kernel learning, which also includes all the Fast Multipole Methods (FMM) used in physics for the past twenty years. Let us note that the FMM changed our world substantially; let's bet that the same concept of variable separation does the same in the Machine Learning area in the next few years. In particular, if you step back a little, you'll notice that there is currently no FMM or kernel factorization that puts an additional constraint on the factors of the decomposition. Think about it: does the FMM or the Random Kitchen Sinks change as a result of knowing that the distributions of unknowns are sparse, group-sparse and so on? And then there is the recent case that matrix factorization and its bounds might provide us with a clear insight on how to build nonlinear representations of the identity (neural networks) :-)
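The separation of variables at the heart of the Random Kitchen Sinks can be sketched in a few lines (my own illustration, not from the original listing; the Gaussian kernel and the feature count D are assumptions): a shift-invariant kernel K(x, x') is replaced by an explicit random feature map z so that K(x, x') ≈ z(x) · z(x'), turning one kernel evaluation into a product of two factors that each depend on a single variable.

```python
# Sketch of Random Kitchen Sinks / random Fourier features (Rahimi & Recht):
# approximate the Gaussian kernel k(x, x') = exp(-||x - x'||^2 / 2) by an
# explicit map z(x) so that k(x, x') ≈ z(x) . z(x').
import numpy as np

rng = np.random.default_rng(0)
d, D = 5, 2000                      # input dimension, number of random features

# Frequencies drawn from the kernel's Fourier transform (a Gaussian here),
# plus uniform phases -- this is the separation-of-variables step.
W = rng.standard_normal((D, d))
b = rng.uniform(0, 2 * np.pi, D)

def z(x):
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.standard_normal(d), rng.standard_normal(d)
exact = np.exp(-np.linalg.norm(x - y) ** 2 / 2)
approx = z(x) @ z(y)
print(exact, approx)                # the two values should be close
```

Once the features are computed, any linear method applied to z(x) behaves like a kernel method, at a cost linear in the number of data points; Fastfood speeds up the W @ x product itself.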
Here is what I added (subject to heavy modification),
Kernel Factorizations, Positive Kernel K(x,x') = G(f(x)*f'(x'))
Phase Transitions:
- None so far.
Implementations:
- Random Kitchen Sinks (FastFood, ...)

Subspace Clustering
Phase Transitions:
- None so far.
- y_i = Y c_i with c_i sparse and c_ii = 0, C unknown, Y known: Sparse Subspace Clustering: Algorithm, Theory, and Applications by Ehsan Elhamifar, Rene Vidal. Sparse Subspace Clustering Code. Sparse Subspace Clustering (SSC) is an algorithm based on sparse representation theory for the segmentation of data lying in a union of subspaces.
- Q_i c_i sparse, X_i c_i = 0, 1^T c_i = 1, C unknown, Y known: Sparse Manifold Clustering and Embedding by Ehsan Elhamifar, Rene Vidal. Sparse Manifold Clustering and Embedding (SMCE) is an algorithm based on sparse representation theory for clustering and dimensionality reduction of data lying in a union of nonlinear manifolds.
- Sparse Modeling Representative Selection Code, Y = Y C with C unknown, row sparse, and 1^T C = 1^T: See All by Looking at A Few: Sparse Modeling for Finding Representative Objects by Ehsan Elhamifar, Rene Vidal. Sparse Modeling Representative Selection (SMRS) is an algorithm based on sparse multiple-measurement-vector recovery theory for selecting a subset of data points as the representatives.
- K-Means++: see the links on its Wikipedia page
 Sparse Matrix Factorization: Simple rules for growing neural nets and Provable Bounds for Learning Some Deep Representations
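The self-expressiveness idea behind SSC, listed above as y_i = Y c_i with c_i sparse and c_ii = 0, can be sketched as follows (my own toy illustration, not the authors' code; the two-line dataset and the Lasso penalty alpha are assumptions):

```python
# Sketch of the self-expressiveness step of Sparse Subspace Clustering
# (Elhamifar & Vidal): each point y_i is written as Y c_i with c_ii = 0
# and c_i sparse, so points tend to pick neighbors from their own subspace.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Toy data: two 1-D subspaces (lines) in R^3, 10 points on each.
u1, u2 = rng.standard_normal(3), rng.standard_normal(3)
Y = np.hstack([np.outer(u1, rng.standard_normal(10)),
               np.outer(u2, rng.standard_normal(10))])   # shape (3, 20)
N = Y.shape[1]

C = np.zeros((N, N))
for i in range(N):
    mask = np.arange(N) != i                 # enforce c_ii = 0
    lasso = Lasso(alpha=0.01, fit_intercept=False, max_iter=10000)
    lasso.fit(Y[:, mask], Y[:, i])           # y_i ≈ Y_{-i} c_i, c_i sparse
    C[mask, i] = lasso.coef_

# In SSC the affinity |C| + |C|^T is then fed to spectral clustering;
# here the nonzero coefficients should concentrate within each subspace.
W_aff = np.abs(C) + np.abs(C).T
```

Because each point can be reconstructed exactly from points on its own line, the sparse codes pick same-subspace neighbors, which is what makes the subsequent spectral clustering step recover the segmentation.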
Also of relevance: The Advanced Matrix Factorization Group on LinkedIn
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.