
Monday, November 24, 2014

Randomized Interpolative Decomposition of Separated Representations

Random projections with tensors.

We introduce tensor Interpolative Decomposition (tensor ID) for the reduction of the separation rank of Canonical Tensor Decompositions (CTDs). Tensor ID selects, for a user-defined accuracy ε, a near-optimal subset of terms of a CTD to represent the remaining terms via a linear combination of the selected terms. Tensor ID can be used as an alternative to or a step of the Alternating Least Squares (ALS) algorithm. In addition, we briefly discuss Q-factorization to reduce the size of components within an ALS iteration. Combined, tensor ID and Q-factorization lead to a new paradigm for the reduction of the separation rank of CTDs. In this context, we also discuss the spectral norm as a computational alternative to the Frobenius norm.
We reduce the problem of finding tensor IDs to that of constructing Interpolative Decompositions of certain matrices. These matrices are generated via either randomized projection or randomized sampling of the given tensor. We provide cost estimates and several examples of the new approach to the reduction of separation rank.
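To make the idea concrete, here is a minimal sketch of the mechanism as I read the abstract, not the authors' code: a redundant CTD in Kruskal form is projected onto a few random rank-one tensors, a matrix interpolative decomposition (via scipy.linalg.interpolative.interp_decomp) of the resulting small matrix selects a skeleton of terms, and the weights of the dropped terms are folded into the retained ones. The sizes n, r, ell, the tolerance eps, and the use of rank-one Gaussian projections are all illustrative assumptions, not values or constructions taken from the paper.

```python
# A sketch (not the authors' algorithm) of reducing the separation rank of a
# 3-way CTD via a matrix ID of a randomized projection. Assumes numpy and scipy.
import numpy as np
import scipy.linalg.interpolative as sli

rng = np.random.default_rng(0)

# A redundant CTD in Kruskal form, T = sum_l s[l] * a_l (x) b_l (x) c_l,
# built here by repeating r_true distinct rank-one terms r times.
n, r_true, r = 50, 5, 30
A0, B0, C0 = (rng.standard_normal((n, r_true)) for _ in range(3))
pick = rng.integers(0, r_true, size=r)
A, B, C = A0[:, pick], B0[:, pick], C0[:, pick]
s = rng.standard_normal(r)

# Randomized projection: M[j, l] = <g_j (x) h_j (x) w_j, a_l (x) b_l (x) c_l>
#                                = (g_j . a_l)(h_j . b_l)(w_j . c_l),
# computed for ell random rank-one tensors without ever forming T explicitly.
ell = 15                                    # a few more rows than the expected rank
G, H, W = (rng.standard_normal((ell, n)) for _ in range(3))
M = (G @ A) * (H @ B) * (W @ C)             # shape (ell, r)

# Matrix ID of M to accuracy eps: columns idx[:k] form the skeleton and
# M[:, idx[k:]] ~= M[:, idx[:k]] @ proj.
eps = 1e-10
k, idx, proj = sli.interp_decomp(M, eps)
skel, rest = idx[:k], idx[k:]

# Keep the skeleton terms and fold the weights of the dropped terms into them.
A_new, B_new, C_new = A[:, skel], B[:, skel], C[:, skel]
s_new = s[skel] + proj @ s[rest]

# Sanity check on the dense tensors (only feasible for this small example).
T_full = np.einsum('il,jl,kl,l->ijk', A, B, C, s)
T_red = np.einsum('il,jl,kl,l->ijk', A_new, B_new, C_new, s_new)
print(k, np.linalg.norm(T_full - T_red) / np.linalg.norm(T_full))
```

Per the abstract, the small matrix can also be generated by randomized sampling of the tensor rather than projection, and the accuracy can be controlled in the spectral norm instead of the Frobenius norm; the sketch above only illustrates the projection variant.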

