Coreset Construction via Randomized Matrix Multiplication by Jiasen Yang, Agniva Chowdhury, Petros Drineas

Coresets are small sets of points that approximate the properties of a larger point set. For example, given a compact set $\mathcal{S} \subseteq \mathbb{R}^d$, a coreset could be defined as a (weighted) subset of $\mathcal{S}$ that approximates the sum of squared distances from $\mathcal{S}$ to every linear subspace of $\mathbb{R}^d$. As such, coresets can serve as a proxy for the full dataset and provide an important technique for speeding up algorithms that solve problems such as principal component analysis and latent semantic indexing. In this paper, we provide a structural result that connects the construction of such coresets to approximating matrix products. This structural result implies a simple, randomized algorithm that constructs coresets whose sizes are independent of both the number and the dimensionality of the input points. The expected size of the resulting coresets improves upon the state-of-the-art deterministic approach. Finally, we evaluate the proposed randomized algorithm on synthetic and real data, and demonstrate its effective performance relative to its deterministic counterpart.
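To give a flavor of the connection between coresets and randomized matrix multiplication, here is a minimal sketch (not the paper's exact algorithm) of the classic norm-squared row-sampling scheme: rows of a data matrix $X$ are drawn with probability proportional to their squared Euclidean norms and rescaled so that the small sketch $C$ satisfies $\mathbb{E}[C^\top C] = X^\top X$. The function name and parameters below are illustrative, not from the paper.

```python
import numpy as np

def sample_coreset(X, s, seed=None):
    """Sample s rows of X with probability proportional to squared row
    norms, rescaling each sampled row by 1/sqrt(s * p_i) so that
    C.T @ C is an unbiased estimator of X.T @ X (randomized
    approximate matrix multiplication)."""
    rng = np.random.default_rng(seed)
    probs = (X ** 2).sum(axis=1)
    probs = probs / probs.sum()
    idx = rng.choice(X.shape[0], size=s, p=probs)
    # Rescaling makes each sampled term x_i x_i^T / (s * p_i) unbiased.
    C = X[idx] / np.sqrt(s * probs[idx, None])
    return C, idx

# Usage: compare X^T X against the sketch C^T C in relative Frobenius norm.
rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 10))
C, _ = sample_coreset(X, s=500, seed=1)
rel_err = (np.linalg.norm(X.T @ X - C.T @ C, "fro")
           / np.linalg.norm(X.T @ X, "fro"))
```

The approximation error decays roughly like $1/\sqrt{s}$, so a sketch whose size depends only on the target accuracy (not on the number of input points) suffices; this is the sense in which coreset sizes can be independent of the input size.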

