Wednesday, September 18, 2013

HTOpt: Optimization on the Hierarchical Tucker manifold - applications to tensor completion - implementation -

Curt Da Silva just sent me the following:

Hey Igor,

My name is Curt Da Silva and I am a PhD student in Math at the University of British Columbia under the supervision of Felix Herrmann. I am writing as a follow-up to my email a couple of months ago about Hierarchical Tucker tensor optimization/interpolation. I'm happy to say that we've finished our full preprint, which is available at https://www.slim.eos.ubc.ca/Publications/Public/TechReport/2013/dasilva2013htuck/dasilva2013htuck.pdf, and a Matlab implementation of our software, called HTOpt, is available for download (free registration is required to download the software) at https://www.slim.eos.ubc.ca/SoftwareLicensed/ . Thank you for your consideration.

Sincerely,
Curt.
Thanks, Curt!


I wonder if this Hierarchical Tucker manifold decomposition is stable (one of the reasons behind the TT-Toolbox). Here is the paper: Optimization on the Hierarchical Tucker manifold - applications to tensor completion by Curt Da Silva and Felix J. Herrmann. The abstract reads:
In this work, we develop an optimization framework for problems whose solutions are well-approximated by Hierarchical Tucker (HT) tensors, an efficient structured tensor format based on recursive subspace factorizations. By exploiting the smooth manifold structure of these tensors, we construct standard optimization algorithms such as Steepest Descent and Conjugate Gradient for completing tensors from missing entries. Our algorithmic framework is fast and scalable to large problem sizes as we do not require SVDs on the ambient tensor space, as required by other methods. Moreover, we exploit the structure of the Gramian matrices associated with the HT format, which reduces overfitting for high subsampling ratios. We also find that the organization of the tensor can have a major impact on completion from realistic seismic acquisition geometries. These samplings are far from idealized randomized samplings that are usually considered. Using these algorithms, we get good performance on large-scale problems.
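To see the basic idea behind this kind of completion, here is a minimal sketch in Python/NumPy: recover a low-rank object from a subset of its entries by gradient descent on the factors themselves, without ever taking SVDs on the ambient space. This toy uses a rank-r matrix factorization as a stand-in for the Hierarchical Tucker parametrization; the dimensions, step size, and iteration count are illustrative assumptions, not taken from HTOpt or the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, r = 60, 50, 3

# Ground-truth rank-r matrix and a random sampling mask (~70% observed).
X_true = rng.standard_normal((n, r)) @ rng.standard_normal((r, m))
mask = rng.random((n, m)) < 0.7

# Small random factors to optimize; only the factors are ever updated.
L = 0.1 * rng.standard_normal((n, r))
R = 0.1 * rng.standard_normal((m, r))

step = 5e-3
for it in range(3000):
    resid = mask * (L @ R.T - X_true)   # misfit on observed entries only
    # Gradients of 0.5 * ||mask * (L R^T - X_true)||_F^2 w.r.t. each factor.
    gL = resid @ R
    gR = resid.T @ L
    L -= step * gL
    R -= step * gR

rel_err = np.linalg.norm(L @ R.T - X_true) / np.linalg.norm(X_true)
print(f"relative error on the full matrix: {rel_err:.3e}")
```

The point of the sketch is the scaling argument from the abstract: each iteration costs only factor-sized matrix products, so the per-step work grows with the factor sizes rather than with the full ambient tensor. The actual HT framework does this on a tree of recursive subspace factorizations with Riemannian steepest descent / conjugate gradient, which this flat two-factor toy does not capture.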
 

Join the CompressiveSensing subreddit or the Google+ Community and post there !
Liked this entry ? subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on Linkedin.
