
Publication

Regularized Semipaired Kernel CCA for domain adaptation

Journal Contribution - Journal Article

Domain adaptation is one of the fundamental research topics in pattern recognition and machine learning. This paper introduces a regularized semipaired kernel canonical correlation analysis formulation for learning a latent space for the domain adaptation problem. The optimization problem is formulated in the primal-dual least squares support vector machine setting, where side information can be readily incorporated through regularization terms. The proposed model learns a joint representation of the data across different domains by solving a generalized eigenvalue problem or a linear system of equations in the dual. The approach is naturally equipped with the out-of-sample extension property, which plays an important role in model selection. Furthermore, the Nyström approximation technique is used to keep the eigendecomposition computationally tractable despite the large size of the matrices involved. The learned latent representation of the source domain is fed to a multiclass semisupervised kernel spectral clustering model that learns from both labeled and unlabeled source-domain points in order to classify the instances of the target domain. Experimental results on synthetic and real-life data sets illustrate the effectiveness of the proposed approaches.
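The abstract describes learning the latent space by solving a generalized eigenvalue problem in the dual, with an out-of-sample extension obtained through kernel evaluations on training points. The following is a minimal sketch of that general idea using plain (fully paired) kernel CCA; it does not reproduce the paper's semipaired LS-SVM formulation, its specific regularization terms, or the Nyström step, and the names (rbf_kernel, gamma, reg) are illustrative choices rather than the authors' notation.

```python
# Sketch: dual kernel CCA as a generalized eigenvalue problem, with an
# out-of-sample projection. Generic paired kernel CCA for illustration only;
# not the semipaired LS-SVM model of the paper.
import numpy as np
from scipy.linalg import eigh
from scipy.spatial.distance import cdist


def rbf_kernel(A, B, gamma=1.0):
    """RBF (Gaussian) kernel matrix between rows of A and rows of B."""
    return np.exp(-gamma * cdist(A, B, "sqeuclidean"))


def center_kernel(K):
    """Center a square kernel matrix in feature space."""
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H


def kernel_cca(X, Y, n_components=2, gamma=1.0, reg=1e-3):
    """Fit dual kernel CCA; returns dual coefficients for both views."""
    n = X.shape[0]
    Kx = center_kernel(rbf_kernel(X, X, gamma))
    Ky = center_kernel(rbf_kernel(Y, Y, gamma))

    # Block generalized eigenvalue problem  A v = lambda B v,
    # with v = [alpha; beta] the dual directions of the two views.
    Z = np.zeros((n, n))
    A = np.block([[Z, Kx @ Ky], [Ky @ Kx, Z]])
    B = np.block([[Kx @ Kx + reg * np.eye(n), Z],
                  [Z, Ky @ Ky + reg * np.eye(n)]])
    vals, vecs = eigh(A, B)                          # symmetric-definite pencil
    order = np.argsort(vals)[::-1][:n_components]    # largest correlations
    return vecs[:n, order], vecs[n:, order]


def project_x(alpha, X_train, X_new, gamma=1.0):
    """Out-of-sample extension for the X view via kernel evaluations
    against the training points (test-side centering omitted for brevity)."""
    return rbf_kernel(X_new, X_train, gamma) @ alpha


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 2))               # shared latent signal
    X = latent @ rng.normal(size=(2, 5)) + 0.1 * rng.normal(size=(200, 5))
    Y = latent @ rng.normal(size=(2, 4)) + 0.1 * rng.normal(size=(200, 4))
    alpha, beta = kernel_cca(X, Y, n_components=2)
    Z_new = project_x(alpha, X, X[:5])               # embed five points
    print(Z_new.shape)                               # (5, 2)
```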
Journal: IEEE Transactions on Neural Networks and Learning Systems
ISSN: 2162-237X
Issue: 7
Volume: 29
Pages: 3199 - 3213
Publication year: 2018
BOF-keylabel: yes
IOF-keylabel: yes
BOF-publication weight: 10
CSS-citation score: 2
Authors from: Higher Education
Accessibility: Open