
Project

Are performance-enhancing techniques for tensor rank decomposition effective?

Approximating a tensor by a low-rank canonical polyadic decomposition (CPD) is a challenging nonlinear, non-convex optimization problem that suffers from a number of difficulties. In previous research, the applicants performed a first-order sensitivity analysis of the tensor rank decomposition, thereby characterizing its condition number. We showed that several of the computational difficulties faced by optimization methods for this problem are intimately related to a high condition number of the CPD. Prior to our study of conditioning, several performance-enhancing techniques were proposed to mitigate these computational difficulties: (1) Tucker compression prior to CPD approximation, (2) objective function regularization, and (3) restricting the optimization domain, for example through nonnegativity and coherence constraints. Since these techniques attempt to resolve difficulties that are intimately connected to the ill-conditioning of the CPD, we ask whether they are theoretically founded. The goal of this project is to establish a number of surprising findings: Tucker compression is numerically unstable, objective function regularization increases the odds that some optimization methods converge to severely ill-conditioned CPDs, and nonnegativity constraints do not, on average, improve conditioning.
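To make the first technique concrete, the sketch below illustrates Tucker compression prior to CPD approximation: the tensor is first compressed to a small Tucker core, a CPD is computed on the core, and the CPD factors are then expanded back with the Tucker factor matrices. This is a minimal illustration, not the project's own code; the use of the TensorLy library, as well as the tensor sizes, rank, noise level, and iteration settings, are assumptions made for the example.

import numpy as np
import tensorly as tl
from tensorly.decomposition import parafac, tucker

rng = np.random.default_rng(0)
rank = 5

# Assumed test problem: a random rank-5 tensor with small additive noise.
factors = [rng.standard_normal((n, rank)) for n in (30, 40, 50)]
T = tl.cp_to_tensor((np.ones(rank), factors))
T = T + 1e-6 * tl.tensor(rng.standard_normal(T.shape))

# Direct CPD approximation of the full tensor.
cp_direct = parafac(T, rank=rank, n_iter_max=500, tol=1e-12)
err_direct = tl.norm(T - tl.cp_to_tensor(cp_direct)) / tl.norm(T)

# Tucker compression first: CPD of the small core, then expand the
# core's CPD factors with the Tucker factor matrices.
core, tucker_factors = tucker(T, rank=[rank, rank, rank])
weights, core_factors = parafac(core, rank=rank, n_iter_max=500, tol=1e-12)
expanded = [U @ F for U, F in zip(tucker_factors, core_factors)]
err_compressed = tl.norm(T - tl.cp_to_tensor((weights, expanded))) / tl.norm(T)

print(f"relative error, direct CPD:      {err_direct:.2e}")
print(f"relative error, Tucker-then-CPD: {err_compressed:.2e}")

The appeal of the compressed approach is that the CPD is computed on a rank x rank x rank core rather than on the full tensor; the project's claim is that this convenience can come at the cost of numerical stability.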

Date: 25 Aug 2020 → Today
Keywords: numerical analysis, differential geometry, sensitivity analysis, tensor rank decomposition
Disciplines: Numerical computation, Mathematical software, Differential geometry, Numerical analysis
Project type: PhD project