So we are now converging toward a theory for using sparse random projections for a variety of problems.
Toward a unified theory of sparse dimensionality reduction in Euclidean space by Jean Bourgain, Sjoerd Dirksen, Jelani Nelson
Let Φ ∈ R^{m×n} be a sparse Johnson-Lindenstrauss transform [KN14] with s non-zeroes per column. For a subset T of the unit sphere and a given ε ∈ (0, 1/2), we study settings for m, s required to ensure E_Φ sup_{x∈T} | ‖Φx‖₂² − 1 | < ε, i.e. so that Φ preserves the norm of every x ∈ T simultaneously and multiplicatively up to 1+ε. We introduce a new complexity parameter, which depends on the geometry of T, and show that it suffices to choose s and m such that this parameter is small. Our result is a sparse analog of Gordon's theorem, which was concerned with a dense Φ having i.i.d. Gaussian entries. We qualitatively unify several results related to the Johnson-Lindenstrauss lemma, subspace embeddings, and Fourier-based restricted isometries. Our work also implies new results in using the sparse Johnson-Lindenstrauss transform in numerical linear algebra, classical and model-based compressed sensing, manifold learning, and constrained least squares problems such as the Lasso.
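To make the setup concrete, here is a minimal sketch of one standard way to build a sparse JL matrix with exactly s non-zeroes per column (entries ±1/√s in s uniformly chosen rows), in the spirit of the [KN14] construction. The parameters n, m, s below are illustrative, not the paper's prescribed settings, and the check is on a single random unit vector rather than a whole set T.

```python
import numpy as np

def sparse_jl(n, m, s, rng):
    """Build Phi in R^{m x n} with exactly s non-zero entries per
    column, each equal to +-1/sqrt(s), so every column has unit norm.
    (Illustrative sketch of a sparse JL construction, not the paper's
    exact distribution.)"""
    Phi = np.zeros((m, n))
    for j in range(n):
        rows = rng.choice(m, size=s, replace=False)  # s distinct rows
        signs = rng.choice([-1.0, 1.0], size=s)      # random signs
        Phi[rows, j] = signs / np.sqrt(s)
    return Phi

rng = np.random.default_rng(0)
n, m, s = 1000, 200, 8  # illustrative dimensions
Phi = sparse_jl(n, m, s, rng)

# Distortion | ||Phi x||_2^2 - 1 | for a random unit vector x:
x = rng.standard_normal(n)
x /= np.linalg.norm(x)
distortion = abs(np.linalg.norm(Phi @ x) ** 2 - 1.0)
print(f"distortion = {distortion:.4f}")
```

Note that applying Φ to a vector costs O(s · n) rather than O(m · n), which is the point of sparsity; the paper's question is how small s and m can be while the supremum of this distortion over all of T stays below ε.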