Friday, October 12, 2012

Invariance of Principal Components under Low-Dimensional Random Projection of the Data

I have featured their work recently in From Compressive Sensing to Machine Learning; here is another set of interesting papers: Invariance of Principal Components under Low-Dimensional Random Projection of the Data by Hanchao Qi and Shannon M. Hughes. The abstract reads:
Algorithms that can efficiently recover principal components of high-dimensional data from compressive sensing measurements (e.g., low-dimensional random projections) of it have been an important topic of recent interest in the literature. In this paper, we show that, under certain conditions, normal principal component analysis (PCA) on such low-dimensional random projections of data actually returns the same result as PCA on the original data set would. In particular, as the number of data samples increases, the center of the randomly projected data converges to the true center of the original data (up to a known scaling factor) and the principal components converge to the true principal components of the original data as well, even if the dimension of each random subspace used is very low. Indeed, experimental results verify that this approach does estimate the original center and principal components very well for both synthetic and real-world datasets, including hyperspectral data. Its performance is even superior to that of other algorithms recently developed in the literature for this purpose.
I wonder how this is going to fit into the broader RandNLA (randomized numerical linear algebra) movement.
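
The scaling factor mentioned in the abstract is easy to see: for a uniformly random rank-$m$ orthogonal projector $P$ in $\mathbb{R}^p$, $E[P] = (m/p)I$, and $E[PAP] = \alpha A + \beta \,\mathrm{tr}(A) I$ for constants $\alpha, \beta > 0$ depending only on $m$ and $p$, so second-moment matrices keep their eigenvectors under projection. Here is a minimal NumPy sketch (mine, not the authors' code) checking both claims on synthetic data; the dimensions and data model are arbitrary choices, and I compare the uncentered second-moment version of PCA for simplicity, whereas the paper treats the centered case:

```python
import numpy as np

rng = np.random.default_rng(0)
p, m, n = 20, 4, 50000   # ambient dim, random subspace dim, sample count

# Synthetic data: nonzero center + 3 planted directions + small noise.
center = 2.0 * rng.normal(size=p) / np.sqrt(p)
U = np.linalg.qr(rng.normal(size=(p, 3)))[0]          # planted directions
Y = (center + (rng.normal(size=(n, 3)) * [5.0, 3.0, 1.0]) @ U.T
     + 0.1 * rng.normal(size=(n, p)))

# Project each sample onto its own uniformly random m-dim subspace of R^p,
# keeping ambient coordinates: z_i = Q_i Q_i^T y_i.
Z = np.empty_like(Y)
for i in range(n):
    Q, _ = np.linalg.qr(rng.normal(size=(p, m)))
    Z[i] = Q @ (Q.T @ Y[i])

# Center recovery: E[P] = (m/p) I, so undo the known m/p shrinkage.
center_hat = (p / m) * Z.mean(axis=0)
print("relative center error:",
      np.linalg.norm(center_hat - center) / np.linalg.norm(center))

# Component recovery: E[PAP] = alpha*A + beta*tr(A)*I, so the second-moment
# matrices of Y and Z share eigenvectors (up to sampling noise).
def top_components(X, k):
    vals, vecs = np.linalg.eigh(X.T @ X / len(X))
    return vecs[:, np.argsort(vals)[::-1][:k]]

V_y, V_z = top_components(Y, 3), top_components(Z, 3)
print("|cos| between matched components:", np.abs((V_y * V_z).sum(axis=0)))
```

On this toy setup the rescaled projected mean lands close to the true center and the matched cosines come out near 1, even though each sample only survives in a 4-dimensional slice of the 20-dimensional space.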

and Technical report: Two observations on probability distribution symmetries for randomly-projected data by Hanchao Qi and Shannon M. Hughes. The abstract reads:

In this technical report, we will make two observations concerning symmetries of the probability distribution resulting from projection of a piece of $p$-dimensional data onto a random $m$-dimensional subspace of $\mathbb{R}^p$, where $m < p$. In particular, we shall observe that such distributions are unchanged by reflection across the original data vector and by rotation about the original data vector.
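
These symmetries are easy to probe by simulation. Below is a quick Monte Carlo sketch (not from the report; the test directions are arbitrary choices of mine): it projects a fixed vector onto many random subspaces, splits each projection into the component along the vector and the orthogonal residual, and checks that the residual has zero mean (a consequence of reflection symmetry across the vector) and direction-independent statistics (rotation symmetry about it):

```python
import numpy as np

rng = np.random.default_rng(1)
p, m, trials = 20, 4, 40000   # ambient dim, subspace dim, Monte Carlo draws

x = rng.normal(size=p)                 # a fixed "original data vector"
x_hat = x / np.linalg.norm(x)

# Repeatedly project x onto a uniformly random m-dim subspace of R^p.
Z = np.empty((trials, p))
for t in range(trials):
    Q, _ = np.linalg.qr(rng.normal(size=(p, m)))
    Z[t] = Q @ (Q.T @ x)

# Split each projection into its coordinate along x and the residual
# orthogonal to x.
par = Z @ x_hat
perp = Z - np.outer(par, x_hat)

# Reflection across x maps perp -> -perp; invariance implies E[perp] = 0,
# so the empirical mean of the residual should be near zero.
print("max |mean residual coordinate|:", np.abs(perp.mean(axis=0)).max())

# Rotation about x means the residual's direction is isotropic in the
# complement of x, so its projections onto any two fixed orthonormal test
# directions (hypothetical choices here) should be identically distributed.
B = np.linalg.qr(np.column_stack([x_hat, rng.normal(size=(p, 2))]))[0]
t1, t2 = perp @ B[:, 1], perp @ B[:, 2]
print("matched spreads:", t1.std(), t2.std())
print("matched tails:  ", np.quantile(np.abs(t1), 0.95),
      np.quantile(np.abs(t2), 0.95))
```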


Join our Reddit Experiment: join the CompressiveSensing subreddit and post there!

Liked this entry? Subscribe to Nuit Blanche's feed; there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
