Thanks Ivan for the heads-up! Here are the preprints:

Dear Igor,
the following tensor/low-rank preprints may be of interest to the audience of Nuit Blanche:
Tensor Methods and Recommender Systems
Evgeny Frolov, Ivan Oseledets
- a review of tensor methods in the field of recommender systems
Convergence analysis of projected fixed-point iteration on a low-rank matrix manifold
Denis Kolesnikov, Ivan Oseledets
- we prove the first curvature-independent convergence estimates over low-rank manifolds
"Compress and eliminate" solver for symmetric positive definite sparse matrices
Daria A. Sushnikova, Ivan V. Oseledets
- a new low-rank-based solver for sparse matrices :)
With best wishes,
Ivan
Tensor Methods and Recommender Systems by Evgeny Frolov, Ivan Oseledets
Substantial progress in the development of new and efficient tensor factorization techniques has led to extensive research on their applicability in the recommender systems field. Tensor-based recommender models push the boundaries of traditional collaborative filtering techniques by taking into account the multifaceted nature of real environments, which allows them to produce more accurate, situational (e.g. context-aware, criteria-driven) recommendations. Despite the promising results, tensor-based methods are poorly covered in existing recommender systems surveys. This survey aims to complement previous works and provide a comprehensive overview of the subject. To the best of our knowledge, this is the first attempt to consolidate studies from various application domains in an easily readable, digestible format, which helps to convey the current state of the field. We also provide a high-level discussion of future perspectives and directions for further improvement of tensor-based recommendation systems.

Convergence analysis of projected fixed-point iteration on a low-rank matrix manifold by Denis Kolesnikov, Ivan Oseledets
In this paper we analyse the convergence of projected fixed-point iteration on the Riemannian manifold of matrices with fixed rank. As a retraction method we use the 'projector splitting' scheme. We prove that the projector splitting scheme converges at least as fast as the standard fixed-point iteration without rank constraints. We also provide a counter-example for the case when the conditions of the theorem do not hold. Finally, we support our theoretical results with numerical experiments.
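To get a feel for the iteration being analysed, here is a minimal NumPy sketch of a projected fixed-point iteration for a contractive affine map. Everything in it (the map `phi`, the sizes, the rank) is an illustrative assumption, and the retraction used below is a plain truncated SVD rather than the projector-splitting scheme the paper actually studies:

```python
import numpy as np

rng = np.random.default_rng(0)

def retract(X, r):
    # Best rank-r approximation via truncated SVD.
    # NOTE: a simple stand-in retraction; the paper analyses the
    # projector-splitting scheme, which avoids a full SVD.
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def projected_fixed_point(phi, X0, r, n_iter=60):
    # X_{k+1} = R_r(phi(X_k)): one fixed-point step, then retract
    # back onto the manifold of rank-r matrices.
    X = retract(X0, r)
    for _ in range(n_iter):
        X = retract(phi(X), r)
    return X

n = 60
C = rng.standard_normal((n, 5)) @ rng.standard_normal((5, n))  # rank-5 term
A = rng.standard_normal((n, n)); A /= 2 * np.linalg.norm(A, 2)
B = rng.standard_normal((n, n)); B /= 2 * np.linalg.norm(B, 2)
phi = lambda X: C + A @ X @ B  # a contraction: ||A||_2 * ||B||_2 <= 1/4

X = projected_fixed_point(phi, np.zeros((n, n)), r=20)
rel_res = np.linalg.norm(X - phi(X)) / np.linalg.norm(X)
print(rel_res)  # small: the fixed point is well approximated at rank 20
```

Because `phi` is a contraction, the unconstrained iteration converges geometrically; the theorem in the paper says the rank-constrained version is not slower, and the residual left over is essentially the rank-truncation error of the true fixed point.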
"Compress and eliminate" solver for symmetric positive definite sparse matrices by Daria A. Sushnikova, Ivan V. Oseledets
We propose a new approximate factorization for solving linear systems with symmetric positive definite sparse matrices. In a nutshell, the algorithm applies block Gaussian elimination hierarchically and additionally compresses the fill-in. The systems that admit efficient compression of the fill-in mostly arise from discretizations of partial differential equations. We show that the resulting factorization can be used as an efficient preconditioner, and we compare the proposed approach with state-of-the-art direct and iterative solvers.
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.