So we are now converging toward a theory for using sparse random projections for a variety of problems.

Toward a unified theory of sparse dimensionality reduction in Euclidean space by **Jean Bourgain, Sjoerd Dirksen, Jelani Nelson**

Let Φ ∈ ℝ^(m×n) be a sparse Johnson-Lindenstrauss transform [KN14] with s non-zeroes per column. For a subset T of the unit sphere and ε ∈ (0, 1/2) given, we study the settings of m and s required to ensure E_Φ sup_{x ∈ T} |‖Φx‖₂² − 1| < ε, i.e., so that Φ preserves the norm of every x ∈ T simultaneously and multiplicatively up to 1 + ε. We introduce a new complexity parameter, which depends on the geometry of T, and show that it suffices to choose s and m such that this parameter is small. Our result is a sparse analog of Gordon's theorem, which was concerned with a dense Φ having i.i.d. Gaussian entries. We qualitatively unify several results related to the Johnson-Lindenstrauss lemma, subspace embeddings, and Fourier-based restricted isometries. Our work also implies new results on using the sparse Johnson-Lindenstrauss transform in numerical linear algebra, classical and model-based compressed sensing, manifold learning, and constrained least squares problems such as the Lasso.
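To make the setup concrete, here is a minimal NumPy sketch of one standard sparse JL construction: each column of Φ gets s nonzero entries equal to ±1/√s in s rows chosen at random. This is a simplified variant for illustration, not the exact [KN14] construction, and the parameter values (n, m, s) below are arbitrary choices, not ones the paper prescribes.

```python
import numpy as np

def sparse_jl(n, m, s, rng):
    """Build an m x n matrix with s nonzero entries (+-1/sqrt(s)) per column."""
    Phi = np.zeros((m, n))
    for j in range(n):
        rows = rng.choice(m, size=s, replace=False)   # s distinct rows for column j
        signs = rng.choice([-1.0, 1.0], size=s)       # random signs
        Phi[rows, j] = signs / np.sqrt(s)
    return Phi

rng = np.random.default_rng(0)
n, m, s = 1000, 200, 8          # illustrative parameters only
Phi = sparse_jl(n, m, s, rng)

# Embed a random unit vector and measure the distortion |‖Φx‖₂² − 1|.
x = rng.standard_normal(n)
x /= np.linalg.norm(x)
distortion = abs(np.linalg.norm(Phi @ x) ** 2 - 1.0)
print(distortion)               # typically a small number for these parameters
```

Note that each column costs only s nonzeros, so computing Φx takes O(s·n) time instead of the O(m·n) of a dense Gaussian Φ; the paper's question is how small m and s can be while keeping the distortion below ε uniformly over all of T.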

**Join the CompressiveSensing subreddit or the Google+ Community or the Facebook page and post there!**

Liked this entry? Subscribe to Nuit Blanche's feed; there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
