I have a recent paper on tensor completion in which I show that a specific class of positive tensors admits polynomial-time completion algorithms achieving improved statistical convergence rates over existing approaches, and I thought it might be of interest to your readers. A preprint and source code are available at http://ieor.berkeley.edu/~aaswani/plrt/
Assistant Professor, IEOR, UC Berkeley
Thanks, Anil! This is outstanding.
Positive Low-Rank Tensor Completion by Anil Aswani
Motivated by combinatorial regression problems (which we interpret as low-rank tensor completion), we study noisy completion of positive tensors. Existing approaches convert the problem into matrix completion, but this conversion cannot achieve the best possible statistical rates. Here, we show that a specific class of low-rank tensors (namely those parametrized as continuous extensions of hierarchical log-linear models) is amenable to efficient computation (with an appropriate choice of risk function) and leads to consistent estimation procedures in which hard-thresholding is used to estimate the low-rank structure of the tensor. Also, recent research has shown that approaches using different convex regularizers to exploit multiple sparse structures cannot simultaneously exploit all of them; we show that combining hard- and soft-thresholding provides one computationally tractable solution to this problem in the case of low-rank and sparse tensor completion. Numerical examples with synthetic data and data from a bioengineered metabolic network show that our estimation procedures are competitive with existing approaches to tensor completion.
The implementation is at: http://ieor.berkeley.edu/~aaswani/plrt/
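The abstract's key computational ingredients are the hard- and soft-thresholding operators. As a rough illustration of those two elementary operators only (not of the paper's estimation procedure, which applies them within a specific risk-minimization framework), here is a minimal NumPy sketch:

```python
import numpy as np

def soft_threshold(x, lam):
    # Soft-thresholding (prox of the l1 norm): shrink every entry
    # toward zero by lam, zeroing entries with magnitude below lam.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def hard_threshold(x, lam):
    # Hard-thresholding: zero out entries with magnitude at most lam,
    # leaving the surviving entries unchanged (no shrinkage).
    return np.where(np.abs(x) > lam, x, 0.0)

x = np.array([-2.0, -0.5, 0.1, 1.5])
# soft_threshold(x, 1.0) shrinks: [-1.0, 0.0, 0.0, 0.5]
# hard_threshold(x, 1.0) keeps large entries intact: [-2.0, 0.0, 0.0, 1.5]
```

Soft-thresholding biases large entries (it shrinks everything by lam), while hard-thresholding is unbiased on the entries it keeps; combining the two, as the abstract suggests, lets an estimator exploit sparse and low-rank structure at the same time.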