
Project

Constrained decompositions of explicitly or implicitly given tensors

Gargantuan amounts of data are created every day in this information age. These data are used in domains such as signal processing, data analysis and machine learning to discover patterns or underlying sources, to perform prediction, and so on. Matrix decompositions have been a key tool for these tasks, but one increasingly relies on tensors instead: their decompositions, and therefore the underlying components, are easier to find thanks to milder uniqueness conditions. These tensors can be imagined as multiway arrays of numbers.
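As a minimal illustration of the idea (not part of the project itself), the following NumPy sketch builds a third-order tensor as a sum of rank-1 terms, the form recovered by a canonical polyadic decomposition; all sizes and names are chosen for the example.

```python
import numpy as np

# A third-order tensor is a 3-way array of numbers.  In a (canonical)
# polyadic decomposition it is written as a sum of R rank-1 terms, each
# the outer product of one column from each factor matrix A, B, C.
rng = np.random.default_rng(0)
R = 2                                # number of rank-1 components
A = rng.standard_normal((4, R))      # factor matrix for mode 1
B = rng.standard_normal((5, R))      # factor matrix for mode 2
C = rng.standard_normal((3, R))      # factor matrix for mode 3

# T[i, j, k] = sum_r A[i, r] * B[j, r] * C[k, r]
T = np.einsum('ir,jr,kr->ijk', A, B, C)
print(T.shape)  # (4, 5, 3)
```

Finding the factor matrices back from `T` alone is the decomposition problem; the mild conditions under which they are unique (up to scaling and permutation) are what makes tensor methods attractive.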
To facilitate the discovery of meaningful components, many applications require that constraints be imposed to incorporate prior knowledge. We will provide novel algorithms that allow more relaxed (in)equality constraints, which can be imposed not only on the individual components but also on the global decomposition, and we will develop new initialization strategies to improve recovery. Via orthogonality or condition constraints, a cheap alternative to techniques from scientific computing can be obtained.
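To make the notion of a constrained component concrete, here is a hedged sketch of one common approach (not the project's own algorithm): a single alternating-least-squares update of one factor matrix, with a nonnegativity constraint imposed by projecting the unconstrained solution onto the nonnegative orthant. All names and sizes are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
I, J, K, R = 4, 5, 3, 2
# Nonnegative ground-truth factors, so an exact nonnegative fit exists.
A = rng.random((I, R))
B = rng.random((J, R))
C = rng.random((K, R))
T = np.einsum('ir,jr,kr->ijk', A, B, C)

# One ALS-style update of A under a nonnegativity constraint:
# solve the unconstrained least-squares problem for the mode-1
# unfolding of T, then project (clip) onto the nonnegative orthant.
KR = np.einsum('jr,kr->jkr', B, C).reshape(J * K, R)   # Khatri-Rao product
A_ls, *_ = np.linalg.lstsq(KR, T.reshape(I, -1).T, rcond=None)
A_new = np.clip(A_ls.T, 0.0, None)                     # projection step
```

A full algorithm would cycle such updates over all modes until convergence; more sophisticated constrained solvers replace the crude clip with a proper projected or interior-point step.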
Next, we study techniques in which the tensor is given implicitly, which may be the case when the construction of an explicit array is too expensive to compute or to store. Bypassing the explicit construction, we will develop "implicit" equivalents of a range of important algorithms, opening up new applications in data analysis, machine learning, multidimensional harmonic retrieval, and multidimensional system modeling and identification.
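The gain from working implicitly can be sketched in a few lines (an illustrative toy example, not the project's methods): when a tensor is known only through its factors, a mode-3 contraction with a vector can be computed directly from those factors, so the full array is never materialized.

```python
import numpy as np

rng = np.random.default_rng(2)
I, J, K, R = 40, 50, 30, 3
A = rng.standard_normal((I, R))
B = rng.standard_normal((J, R))
C = rng.standard_normal((K, R))
v = rng.standard_normal(K)

# Explicit route: build the full I x J x K array, then contract mode 3.
T = np.einsum('ir,jr,kr->ijk', A, B, C)
M_explicit = T @ v                       # shape (I, J)

# Implicit route: since T[i,j,k] = sum_r A[i,r] B[j,r] C[k,r],
#   (T x_3 v)[i,j] = sum_r A[i,r] B[j,r] (C[:,r] . v),
# so we contract with the factors directly; the full tensor is never
# formed, and cost/memory stay linear in I + J + K instead of I*J*K.
w = C.T @ v                              # shape (R,)
M_implicit = (A * w) @ B.T               # shape (I, J)
```

Both routes give the same matrix; the implicit one is what scales when the explicit array would be too large to build.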

Date: 1 Oct 2019 → 30 Sep 2022
Keywords: multilinear algebra, tensor, numerical optimization, signal processing, data analysis, large-scale, machine learning
Disciplines: Data mining, Information technologies, Analogue and digital signal processing