
Publication

Projection Methods for Parametrized and Multiparameter Eigenvalue Problems

Book - Dissertation

Matrix eigenvalue problems are common in computational science and engineering. In this thesis we focus on two generalizations of the eigenvalue problem: parametrized and multiparameter eigenvalue problems. The matrices involved are large, so computing eigenvalues directly is expensive. We therefore develop techniques that exploit the structure and properties of the underlying problem. Each of the proposed methods iteratively builds a subspace and, instead of solving the large eigenvalue problem, repeatedly solves eigenvalue problems projected onto a small subspace. Because these projected problems are small, they are computationally tractable. The question is then how to build this subspace as efficiently as possible; the answer differs for each of the three problems considered in this thesis.

The first two algorithms deal with parametrized eigenvalue problems. Here the matrix pencil depends on parameters, and we want a global approximation of an extreme eigenvalue (largest or smallest in absolute value, or with largest real part) over the parameter space. We discretize the parameter space into a finite set of samples and approximate the wanted eigenvalue at all points of this set. The first algorithm focuses on matrix pencils whose matrices are symmetric, with at least one of the two positive definite. Every vector added to the subspace is computed from information at a single point in the parameter space: the sample at which the current eigenvalue approximation is worst according to its residual. In that sense, only local information is used to build the subspace. This changes in the second algorithm. It also solves parametrized eigenvalue problems, but now only standard eigenvalue problems, and the matrix need not be symmetric. We extend an existing Krylov algorithm so that it performs iterations for many parameter points simultaneously; the subspace thus draws on information from all parameter points. Both strategies are sketched below.
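As a rough illustration of the first, residual-driven strategy, the following Python/NumPy/SciPy sketch greedily enriches one subspace from the worst-approximated sample point. The affine pencil A(p) = A0 + p*A1, the random data, and all names are hypothetical; this is a minimal sketch of the idea, not the thesis implementation.

import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
n = 200
S0, S1 = rng.standard_normal((n, n)), rng.standard_normal((n, n))
A0, A1 = S0 + S0.T, S1 + S1.T                  # symmetric matrices
M = rng.standard_normal((n, n))
B = M @ M.T + n * np.eye(n)                    # positive definite

A = lambda p: A0 + p * A1                      # hypothetical affine pencil
params = np.linspace(-1.0, 1.0, 51)            # discretized parameter set

# Start the subspace with the smallest eigenvector at one sample point.
_, X = eigh(A(params[0]), B, subset_by_index=[0, 0])
V = X / np.linalg.norm(X)

for _ in range(10):
    worst_p, worst_res = params[0], -1.0
    for p in params:
        # Small projected pencil: cheap to solve at every sample point.
        th, Y = eigh(V.T @ A(p) @ V, V.T @ B @ V, subset_by_index=[0, 0])
        x = V @ Y[:, 0]
        res = np.linalg.norm(A(p) @ x - th[0] * (B @ x))
        if res > worst_res:
            worst_p, worst_res = p, res
    # Enrich the basis with the exact eigenvector at the worst sample:
    # each new direction uses information from a single parameter point.
    _, X = eigh(A(worst_p), B, subset_by_index=[0, 0])
    v = X[:, 0] - V @ (V.T @ X[:, 0])
    V = np.column_stack([V, v / np.linalg.norm(v)])

The projected pencils have dimension equal to the number of iterations taken so far, so sweeping all sample points stays cheap even when n is large.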
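The second strategy can only be caricatured here: the thesis extends a specific Krylov algorithm, whereas this sketch shows just the generic mechanism of one shared basis that is enriched by a matrix-vector product at every sample point and then used for Rayleigh-Ritz extraction at each point. Data and names are again invented.

import numpy as np

rng = np.random.default_rng(1)
n = 300
A0, A1 = rng.standard_normal((n, n)), rng.standard_normal((n, n))
A = lambda p: A0 + p * A1          # standard, nonsymmetric eigenproblem
params = np.linspace(0.0, 1.0, 5)

v = rng.standard_normal(n)
V = (v / np.linalg.norm(v)).reshape(n, 1)
for _ in range(8):
    for p in params:               # one iteration per parameter point
        w = A(p) @ V[:, -1]
        w -= V @ (V.T @ w)         # orthogonalize against the shared basis
        if np.linalg.norm(w) > 1e-12:
            V = np.column_stack([V, w / np.linalg.norm(w)])

for p in params:                   # Rayleigh-Ritz at every sample point
    H = V.T @ A(p) @ V
    ev = np.linalg.eigvals(H)
    print(f"p = {p:.2f}, |largest Ritz value| = {abs(ev[np.argmax(np.abs(ev))]):.4f}")

The point of the illustration is only that the basis V mixes directions coming from all parameter points, in contrast to the purely local enrichment of the first sketch.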
The last problem we deal with concerns multiparameter eigenvalue problems. These are equivalent to a potentially very large generalized eigenvalue problem. The proposed projection method is based on the Tensor-Train representation of the problem, and the resulting subspace is itself represented in Tensor-Train format. In one iteration we update one core of the tensor representing the desired eigenvectors, while leaving the other cores fixed.
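To make the size issue concrete, the sketch below sets up a two-parameter eigenvalue problem with random data and forms the equivalent generalized eigenvalue problem through the classical operator determinants. The Kronecker products have dimension n1*n2, which is exactly the growth that a Tensor-Train representation is meant to sidestep; the data and sign convention here are illustrative, not taken from the thesis.

import numpy as np
from scipy.linalg import eig

# Two-parameter eigenvalue problem with made-up dense data:
#   (A1 - lam*B1 - mu*C1) x1 = 0,   (A2 - lam*B2 - mu*C2) x2 = 0.
rng = np.random.default_rng(2)
n1, n2 = 8, 9
A1, B1, C1 = (rng.standard_normal((n1, n1)) for _ in range(3))
A2, B2, C2 = (rng.standard_normal((n2, n2)) for _ in range(3))

# Classical operator determinants couple the two equations into
# generalized eigenvalue problems of dimension n1*n2:
#   D1 z = lam * D0 z,   D2 z = mu * D0 z,   with z = kron(x1, x2).
D0 = np.kron(B1, C2) - np.kron(C1, B2)
D1 = np.kron(A1, C2) - np.kron(C1, A2)
D2 = np.kron(B1, A2) - np.kron(A1, B2)

lam, Z = eig(D1, D0)   # already 72x72 here; n1 = n2 = 1000 would give 10^6
print(lam[:3])

Because the eigenvector z = kron(x1, x2) has exact low-rank tensor structure, representing the search space in Tensor-Train format and updating one core at a time, as in the thesis, can avoid ever forming D0, D1, D2 explicitly.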
Publication year: 2021
Accessibility: Open