In the series "The Unreasonable Effectiveness of ...", we've had several entries. Today, we have something that is a subset of random projections: The Unreasonable Effectiveness of Structured Random Orthogonal Embeddings by Krzysztof Choromanski, Mark Rowland, and Adrian Weller:
We present a general class of embeddings based on structured random matrices with orthogonal rows which can be applied in many machine learning applications including dimensionality reduction, kernel approximation and locality-sensitive hashing. We show that this class yields improvements over previous state-of-the-art methods either in computational efficiency (while providing similar accuracy) or in accuracy, or both. In particular, we propose the Orthogonal Johnson-Lindenstrauss Transform (OJLT) which is as fast as earlier methods yet provably outperforms them in terms of accuracy, leading to a 'free lunch' improvement over previous dimensionality reduction mechanisms. We introduce matrices with complex entries that further improve accuracy. Other applications include estimators for certain pointwise nonlinear Gaussian kernels, and speed improvements for approximate nearest-neighbor search in massive datasets with high-dimensional feature vectors.
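To give a flavor of the core idea, here is a minimal sketch comparing a plain i.i.d. Gaussian Johnson-Lindenstrauss projection against one whose rows are exactly orthogonal. This is only the unstructured version of the construction: the paper's actual OJLT additionally uses structured Hadamard-Rademacher blocks for fast matrix-vector products, which is not shown here. The function names, the dense QR-based construction, and the chosen dimensions are illustrative assumptions, not the authors' code.

import numpy as np

rng = np.random.default_rng(0)

def gaussian_jlt(d, m, rng):
    """Unstructured JLT: an m x d matrix with i.i.d. N(0, 1/m) entries."""
    return rng.normal(scale=1.0 / np.sqrt(m), size=(m, d))

def orthogonal_jlt(d, m, rng):
    """Random matrix with exactly orthogonal rows (m <= d): QR-decompose
    a Gaussian matrix, keep m rows of the orthogonal factor, and rescale
    so that squared norms are preserved in expectation.
    (Illustrative; the paper's fast OJLT uses structured HD blocks.)"""
    g = rng.normal(size=(d, d))
    q, _ = np.linalg.qr(g)        # q is (approximately Haar) orthogonal
    return np.sqrt(d / m) * q[:m]  # m orthonormal rows, rescaled

d, m, trials = 128, 16, 500
x = rng.normal(size=d)
true_sq_norm = np.sum(x ** 2)

def mse(make_proj):
    """Mean squared error of the projected squared-norm estimate."""
    errs = [(np.sum((make_proj(d, m, rng) @ x) ** 2) - true_sq_norm) ** 2
            for _ in range(trials)]
    return np.mean(errs)

print("i.i.d. Gaussian JLT, squared-norm MSE:", mse(gaussian_jlt))
print("orthogonal JLT,      squared-norm MSE:", mse(orthogonal_jlt))

Running this, the orthogonal variant should show a visibly smaller MSE at the same embedding dimension, which is the 'free lunch' the abstract refers to: enforcing orthogonality among the rows couples the coordinate estimates and reduces the variance of the norm estimate at no cost in expectation.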