The dynamical low-rank approximation of time-dependent matrices is a low-rank factorization updating technique. It leads to differential equations for factors of the matrices, which need to be solved numerically. We propose and analyze a fully explicit, computationally inexpensive integrator that is based on splitting the orthogonal projector onto the tangent space of the low-rank manifold. As is shown by theory and illustrated by numerical experiments, the integrator enjoys robustness properties that are not shared by any standard numerical integrator. This robustness can be exploited to change the rank adaptively. Another application is in optimization algorithms for low-rank matrices where truncation back to the given low rank can be done efficiently by applying a step of the integrator proposed here.
Ivan let me know that
I have implemented the matrix low-rank integrator in Python; it is very simple. Thanks, Ivan!
https://gist.github.com/oseledets/4996949
Now we are working on the high-dimensional generalization of the stuff.
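For readers who would rather see the idea than follow the link, here is a minimal sketch of one first-order projector-splitting step, in the spirit of the linked gist (the function name and variable names are mine, not from the gist). Given a rank-r factorization Y = U S Vᵀ and the increment dA = A(t1) − A(t0) of the matrix being tracked, the step updates the three factors in sequence, with the middle S-substep run "backwards":

```python
import numpy as np

def ksl_step(U, S, V, dA):
    """One first-order projector-splitting step for dynamical
    low-rank approximation (a sketch, not the authors' code).

    U: n x r, S: r x r, V: m x r, so Y = U @ S @ V.T has rank r.
    dA: n x m increment of the full matrix over the time step.
    """
    # K-substep: update the left factor.
    K = U @ S + dA @ V
    U1, S_hat = np.linalg.qr(K)
    # S-substep: the characteristic backward (minus) step.
    S_tilde = S_hat - U1.T @ dA @ V
    # L-substep: update the right factor and the final S.
    L = V @ S_tilde.T + dA.T @ U1
    V1, S1T = np.linalg.qr(L)
    return U1, S1T.T, V1
```

A nice sanity check is the exactness property shown in the paper: if the matrix has rank r at both endpoints, the step reproduces the endpoint exactly (for generic data), since each substep only ever uses the increment dA.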