Wednesday, March 26, 2014

Tensor time! Era of Big Data Processing: A New Approach via Tensor Networks and Tensor Decompositions, A Non-Local Structure Tensor Based Approach for Multicomponent Image Recovery Problems

From the first paper:

This all shows that, compared to matrix decompositions, the Canonical Polyadic Decomposition (CPD) is unique under more natural and relaxed conditions, which only require the components to be “sufficiently different” and their number to not be unreasonably large. These conditions have no matrix counterpart and are at the heart of tensor-based signal separation.
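To make the CPD idea concrete, here is a minimal NumPy sketch (dimensions and factor matrices are made up for illustration): a rank-R tensor is a sum of R rank-1 terms, each the outer product of one column from each factor matrix.

```python
import numpy as np

# Hypothetical factor matrices A, B, C for a rank-2, third-order tensor.
rng = np.random.default_rng(0)
I, J, K, R = 4, 5, 6, 2
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))

# CPD model: T[i,j,k] = sum_r A[i,r] * B[j,r] * C[k,r]
T = np.einsum('ir,jr,kr->ijk', A, B, C)
print(T.shape)  # (4, 5, 6)
```

The uniqueness claim in the quote is about recovering A, B, and C (up to scaling and permutation) from T alone, which generally has no analogue for matrix factorizations.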

How does this approach resolve current heuristics and attendant phase transitions (see the Advanced Matrix Factorization Jungle page)? We don't know.

The following figure is also from the first paper and shows that, at least in compressive sensing, we have been using a tensor representation that generally mystifies newcomers (yes, two-dimensional images are in fact vectors thanks to MATLAB's reshape command).
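For readers puzzled by the image-as-vector convention, here is a toy sketch of the flattening step (a made-up 3x4 "image"; NumPy's reshape plays the role of MATLAB's reshape):

```python
import numpy as np

img = np.arange(12).reshape(3, 4)  # toy 3x4 "image"
x = img.reshape(-1)                # the same image as a length-12 vector
assert x.shape == (12,)

# The flattening is lossless: reshaping back recovers the image exactly.
img_back = x.reshape(3, 4)
assert np.array_equal(img, img_back)
```

Most compressive sensing formulations apply the sensing matrix to this vectorized form, which is exactly the flat view the tensor papers argue against.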

The last paper is about a solver using tensors for multicomponent image recovery. Enjoy!


The widespread use of multi-sensor technology and the emergence of big datasets has highlighted the limitations of standard flat-view matrix models and the necessity to move towards more versatile data analysis tools. We show that higher-order tensors (i.e., multiway arrays) enable such a fundamental paradigm shift towards models that are essentially polynomial and whose uniqueness, unlike the matrix methods, is guaranteed under very mild and natural conditions. Benefiting from the power of multilinear algebra as their mathematical backbone, data analysis techniques using tensor decompositions are shown to have great flexibility in the choice of constraints that match data properties, and to find more general latent components in the data than matrix-based methods. A comprehensive introduction to tensor decompositions is provided from a signal processing perspective, starting from the algebraic foundations, via basic Canonical Polyadic and Tucker models, through to advanced cause-effect and multi-view data analysis schemes. We show that tensor decompositions enable natural generalizations of some commonly used signal processing paradigms, such as canonical correlation and subspace techniques, signal separation, linear regression, feature extraction and classification. We also cover computational aspects, and point out how ideas from compressed sensing and scientific computing may be used for addressing the otherwise unmanageable storage and manipulation problems associated with big datasets. The concepts are supported by illustrative real-world case studies illuminating the benefits of the tensor framework, as efficient and promising tools for modern signal processing, data analysis and machine learning applications; these benefits also extend to vector/matrix data through tensorization. Keywords: ICA, NMF, CPD, Tucker decomposition, HOSVD, tensor networks, Tensor Train.


Modern applications such as computational neuroscience, neuroinformatics and pattern/image recognition generate massive amounts of multidimensional data with multiple aspects and high dimensionality. Big data require novel technologies to efficiently process massive datasets within tolerable elapsed times. Such a new emerging technology for multidimensional big data is a multiway analysis via tensor networks (TNs) and tensor decompositions (TDs), which decompose tensors into sets of factor (component) matrices and low-order (core) tensors. Tensors (i.e., multiway arrays) provide a natural and compact representation for such massive multidimensional data via suitable low-rank approximations. Dynamic tensor analysis allows us to discover meaningful hidden structures of complex data and perform generalizations by capturing multi-linear and multi-aspect relationships. We will discuss some emerging TN models, their mathematical and graphical descriptions and associated learning algorithms for large-scale TDs and TNs, with many potential applications including: anomaly detection, feature extraction, classification, cluster analysis, data fusion and integration, pattern recognition, predictive modeling, regression, time series analysis and multiway component analysis.
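Since both abstracts lean on the Tucker model and HOSVD, a minimal HOSVD sketch may help (dimensions are made up; this is the plain full-rank variant, not the papers' large-scale algorithms): each factor matrix comes from the SVD of the corresponding mode-n unfolding, and the core tensor is the original tensor contracted with those factors.

```python
import numpy as np

def unfold(T, n):
    """Mode-n unfolding: mode n becomes the rows, all other modes the columns."""
    return np.moveaxis(T, n, 0).reshape(T.shape[n], -1)

rng = np.random.default_rng(1)
T = rng.standard_normal((3, 4, 5))

# Factor matrices: left singular vectors of each mode-n unfolding.
U = [np.linalg.svd(unfold(T, n), full_matrices=False)[0] for n in range(3)]

# Core tensor: T contracted with U_n^T along each mode.
core = np.einsum('ijk,ia,jb,kc->abc', T, U[0], U[1], U[2])

# With full-rank factors the HOSVD reconstructs T exactly.
T_hat = np.einsum('abc,ia,jb,kc->ijk', core, U[0], U[1], U[2])
print(np.allclose(T, T_hat))  # True
```

Truncating the columns of each U_n gives the low-rank Tucker approximations that make the "compact representation" claim above concrete.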
Key words: tensor decompositions, tensor networks, tensor train, HOSVD, Tucker/CPD.


Non-Local Total Variation (NLTV) has emerged as a useful tool in variational methods for image recovery problems. In this paper, we extend the NLTV-based regularization to multicomponent images by taking advantage of the Structure Tensor (ST) resulting from the gradient of a multicomponent image. The proposed approach allows us to penalize the non-local variations, jointly for the different components, through various $\ell_{1,p}$ matrix norms with $p \ge 1$. To facilitate the choice of the hyper-parameters, we adopt a constrained convex optimization approach in which we minimize the data fidelity term subject to a constraint involving the ST-NLTV regularization. The resulting convex optimization problem is solved with a novel epigraphical projection method. This formulation can be efficiently implemented thanks to the flexibility offered by recent primal-dual proximal algorithms. Experiments are carried out for multispectral and hyperspectral images. The results demonstrate the benefit of introducing a non-local structure tensor regularization and show that the proposed approach leads to significant improvements in terms of convergence speed over current state-of-the-art methods.
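The $\ell_{1,p}$ mixed norm in the abstract is easy to state in code: an inner $\ell_p$ norm taken over each row (think of a row as the stacked variation components at one pixel), then an outer $\ell_1$ sum over rows. The matrix below is made up for illustration, not taken from the paper:

```python
import numpy as np

def l1p_norm(G, p=2):
    """l_{1,p} mixed norm: sum over rows of the l_p norm of each row."""
    return np.sum(np.linalg.norm(G, ord=p, axis=1))

# Toy 2x2 matrix of per-pixel variation components.
G = np.array([[3.0, 4.0],
              [0.0, 5.0]])
print(l1p_norm(G, p=2))  # norm([3,4]) + norm([0,5]) = 5.0 + 5.0 = 10.0
```

Coupling the components inside the inner $\ell_p$ norm is what makes the penalty joint across channels, rather than a per-channel total variation.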


Join the CompressiveSensing subreddit or the Google+ Community and post there!
Liked this entry? Subscribe to Nuit Blanche's feed, there's more where that came from. You can also subscribe to Nuit Blanche by Email, explore the Big Picture in Compressive Sensing or the Matrix Factorization Jungle and join the conversations on compressive sensing, advanced matrix factorization and calibration issues on LinkedIn.
