Laurent Jacques tweeted about these two sketching preprints recently. The second one is the new version of a preprint we mentioned earlier:
Sketching for Simultaneously Sparse and Low-Rank Covariance Matrices
Sohail Bahmani, Justin Romberg
We introduce a technique for estimating a structured covariance matrix from observations of a random vector which have been sketched. Each observed random vector x_t is reduced to a single number by taking its inner product against one of a number of pre-selected vectors a_ℓ. These observations are used to form estimates of linear observations of the covariance matrix Σ, which is assumed to be simultaneously sparse and low-rank. We show that if the sketching vectors a_ℓ have a special structure, then we can use a straightforward two-stage algorithm that exploits this structure. We show that the estimate is accurate when the number of sketches is proportional to the maximum of the rank times the number of significant rows/columns of Σ. Moreover, our algorithm takes direct advantage of the low-rank structure of Σ by only manipulating matrices that are far smaller than the original covariance matrix.
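For intuition, here is a minimal sketch, not the authors' algorithm, of how such rank-one quadratic sketches of a covariance matrix arise: averaging (a_ℓᵀ x_t)² over zero-mean samples x_t estimates the linear observation a_ℓᵀ Σ a_ℓ. Gaussian sketching vectors, the synthetic Σ, and the reuse of every sample against every a_ℓ are simplifying assumptions; the paper pairs each x_t with a single a_ℓ and uses structured vectors with a two-stage recovery.

```python
import numpy as np

# Illustrative sketch (assumptions noted above, not the paper's method):
# estimate linear observations a_l^T Sigma a_l of a sparse, low-rank
# covariance matrix from inner-product sketches of its samples.

rng = np.random.default_rng(0)
n, rank, T, L = 100, 2, 5000, 40

# Synthetic covariance Sigma = U U^T with few significant rows/columns.
U = np.zeros((n, rank))
support = rng.choice(n, size=10, replace=False)
U[support] = rng.standard_normal((10, rank))
Sigma = U @ U.T

# Samples x_t with covariance exactly Sigma, and Gaussian sketching vectors.
X = rng.standard_normal((T, rank)) @ U.T    # T x n
A = rng.standard_normal((L, n))             # L x n

# Each sample reduces to a single number <a_l, x_t>; averaging the squares
# over t estimates a_l^T Sigma a_l. (For simplicity every sample is sketched
# against every a_l here; the paper assigns one a_l per sample.)
y_hat = ((X @ A.T) ** 2).mean(axis=0)                    # length L
y_true = np.einsum('ln,nm,lm->l', A, Sigma, A)           # a_l^T Sigma a_l

print("max relative error:", np.max(np.abs(y_hat - y_true) / y_true))
```

As T grows, the empirical averages concentrate around the true linear observations, which is the input the paper's two-stage estimator works from.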
Isometric sketching of any set via the Restricted Isometry Property
Samet Oymak, Benjamin Recht, Mahdi Soltanolkotabi
In this paper we show that, for the purposes of dimensionality reduction, a certain class of structured random matrices behaves similarly to random Gaussian matrices. This class includes several matrices for which the matrix-vector multiply can be computed in log-linear time, providing efficient dimensionality reduction of general sets. In particular, we show that using such matrices any set in high dimensions can be embedded into lower dimensions with near-optimal distortion. We obtain our results by connecting dimensionality reduction of any set to dimensionality reduction of sparse vectors via a chaining argument.
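As a concrete (assumed) instance of such a structured matrix, here is a small sketch of a subsampled randomized Hadamard transform, a classic fast RIP-style embedding, applied to a set of sparse points to check that pairwise distances are roughly preserved. The dimensions and the dense Hadamard matrix are illustrative choices; in practice one would use a fast Walsh-Hadamard transform for the O(n log n) multiply.

```python
import numpy as np
from scipy.linalg import hadamard

# Illustrative example: a subsampled randomized Hadamard transform (SRHT),
# one of the structured random matrices with a fast matrix-vector multiply.
# Dense H is used here for brevity; a fast transform gives O(n log n).

rng = np.random.default_rng(1)
n, m, num_points = 1024, 128, 50             # n must be a power of two

H = hadamard(n) / np.sqrt(n)                 # orthonormal Hadamard matrix
signs = rng.choice([-1.0, 1.0], size=n)      # random diagonal sign flips
rows = rng.choice(n, size=m, replace=False)  # random row subsampling

def srht(x):
    """Apply the m x n SRHT; scaling keeps E||srht(x)||^2 = ||x||^2."""
    return np.sqrt(n / m) * (H @ (signs * x))[rows]

# A small set of sparse points to embed.
points = np.zeros((num_points, n))
for p in points:
    idx = rng.choice(n, size=10, replace=False)
    p[idx] = rng.standard_normal(10)

embedded = np.array([srht(p) for p in points])

# Pairwise distances before and after embedding should nearly match.
for i in range(3):
    for j in range(i + 1, 4):
        d0 = np.linalg.norm(points[i] - points[j])
        d1 = np.linalg.norm(embedded[i] - embedded[j])
        print(f"pair ({i},{j}): distortion {d1 / d0:.3f}")
```

The printed distortion ratios hover near 1, which is the Gaussian-like behavior the paper establishes for general sets via its chaining argument.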
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.