## Thursday, November 08, 2012

### Eigenvalues of a matrix in the streaming model

Following up on this morning's entry on a better embedding for least squares and SVD, here is a result on eigenvalues. As it happens, all these entries are listed under the RandNLA tag. I can see a whole bunch of applications for something like this. Here it is: Eigenvalues of a matrix in the streaming model by Alexandr Andoni and Huy L. Nguyen. The abstract reads:
We study the question of estimating the eigenvalues of a matrix in the streaming model, addressing a question posed in [Mut05]. We show that the eigenvalue "heavy hitters" of a matrix can be computed in a single pass. In particular, we show that the ε-heavy hitters (in the ℓ1 or ℓ2 norms) can be estimated in space proportional to 1/ε². Such a dependence on ε is optimal. We also show how the same techniques may give an estimate of the residual error tail of a rank-k approximation of the matrix (in the Frobenius norm), in space proportional to k². All our algorithms are linear and hence can support arbitrary updates to the matrix in the stream. In fact, what we show can be seen as a form of bi-linear dimensionality reduction: if we multiply an input matrix with projection matrices on both sides, the resulting matrix preserves the top eigenvalues and the residual Frobenius norm.
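To get a feel for the bilinear dimensionality reduction idea mentioned at the end of the abstract, here is a toy NumPy sketch of my own (not the paper's exact construction or guarantees): a symmetric matrix with a few heavy eigenvalues is compressed on both sides by a Gaussian projection, and the extreme eigenvalues of the small sketch are compared with the true ones. The matrix sizes, scalings, and noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 2000, 200  # ambient dimension and (much smaller) sketch dimension

# Symmetric test matrix: a few "heavy" eigenvalues plus a small noisy bulk.
heavy = np.array([50.0, 30.0, -25.0])
Q, _ = np.linalg.qr(rng.standard_normal((n, heavy.size)))
noise = rng.standard_normal((n, n))
A = Q @ np.diag(heavy) @ Q.T + 0.002 * (noise + noise.T) / 2

# Bilinear sketch: multiply by a Gaussian projection on both sides.
# Rows of G are scaled so that ||G v|| ~ ||v|| for a fixed vector v.
G = rng.standard_normal((m, n)) / np.sqrt(m)
S = G @ A @ G.T  # m x m -- this is all a streaming algorithm would store

est = np.sort(np.linalg.eigvalsh(S))
true = np.sort(np.linalg.eigvalsh(A))
print("true extreme eigenvalues:  ", true[0], true[-2:])
print("sketch extreme eigenvalues:", est[0], est[-2:])
```

Because the sketch is a linear function of A, an update A ← A + uvᵀ in the stream translates into a small rank-one update of S, which is what makes this kind of map compatible with arbitrary streaming updates.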

Image Credit: NASA/JPL/Space Science Institute. W00076689.jpg was taken on November 06, 2012 and received on Earth on November 06, 2012. The camera was pointing toward SATURN-RINGS at approximately 1,292,973 miles (2,080,838 kilometers) away, and the image was taken using the CL1 and CL2 filters.

Join our Reddit Experiment: join the CompressiveSensing subreddit and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle, and join the conversations on compressive sensing, advanced matrix factorization, and calibration issues on LinkedIn.