
Project

Taming Nonconvexity in Structured Low-Rank Optimization

Recent advances have made it possible to acquire, store, and process very large amounts of data, strongly impacting many branches of science and engineering. One way to interpret these data and explore their features is to represent them as low-dimensional objects that admit an intuitive interpretation by domain-specific experts. Such techniques typically factorize the data into two or more structured objects – e.g., orthogonal matrices, sparse tensors – of lower rank than the original data. These factorizations can usually be formulated as solutions to large-scale nonconvex optimization problems; it is therefore of interest to develop fast algorithms to solve them and, in particular, algorithms that provably converge to useful solutions. This project aims (i) to introduce and study a general formulation for nonsmooth, structured low-rank optimization, (ii) to establish conditions under which this formulation is tractable (even though nonconvex), (iii) to design provably convergent algorithms for it, and (iv) to apply and test the new model and algorithms in applications from different domains of science and engineering.
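As a minimal sketch of the kind of problem the project targets (not the project's own method), the snippet below fits a rank-r factorization X ≈ U Vᵀ by alternating least squares: the joint problem in (U, V) is nonconvex, but each subproblem with one factor fixed is a convex least-squares solve. All names and parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def als_low_rank(X, r, iters=50):
    """Approximate X by U @ V.T (rank r) via alternating least squares.

    The objective ||X - U V^T||_F^2 is nonconvex in (U, V) jointly,
    but convex in each factor separately, which each step exploits.
    """
    m, n = X.shape
    U = rng.standard_normal((m, r))
    V = rng.standard_normal((n, r))
    for _ in range(iters):
        # Fix V, update U: solve V @ U.T = X.T in least squares.
        U = np.linalg.lstsq(V, X.T, rcond=None)[0].T
        # Fix U, update V: solve U @ V.T = X in least squares.
        V = np.linalg.lstsq(U, X, rcond=None)[0].T
    return U, V

# Synthetic noiseless rank-3 data: ALS should recover it closely.
Xtrue = rng.standard_normal((40, 3)) @ rng.standard_normal((3, 30))
U, V = als_low_rank(Xtrue, r=3)
err = np.linalg.norm(Xtrue - U @ V.T) / np.linalg.norm(Xtrue)
print(f"relative error: {err:.2e}")
```

Imposing additional structure on the factors (e.g., orthogonality or sparsity, as the project description mentions) turns each subproblem into a constrained, possibly nonsmooth one, which is where provably convergent algorithms become nontrivial.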

Date: 1 Oct 2022 → Today
Keywords: Scientific Computing, Numerical Optimization, Systems and Control, Machine Learning
Disciplines: Numerical analysis, Data mining
Project type: PhD project