Efficient multiple scale kernel classifiers
Conference Contribution - Book Chapter
© 2016 IEEE. While kernel methods using a single Gaussian kernel have proven very successful for nonlinear classification, for learning problems with a more complex underlying structure it is often desirable to use a linear combination of kernels with different widths. To address this issue, this paper presents a classification algorithm based on a jointly convex constrained optimization formulation. The primal problem is defined as jointly learning a combination of kernel classification models formulated in different feature spaces, which account for various representations or scales. The solution can be found either by solving a system of linear equations, when the combination weights are equal, or by means of a block coordinate descent scheme. The dual model is a classifier that uses multiple kernels in its decision function. Furthermore, time and space complexity are reduced by adopting a divide-and-conquer strategy and by using the Nyström approximation of the kernel eigenfunctions. Several experiments show the effectiveness of the proposed algorithms on datasets containing up to millions of instances.
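As a rough illustration of two ingredients mentioned in the abstract — a sum of Gaussian kernels with different widths and the Nyström low-rank approximation — the sketch below builds approximate feature maps from a set of landmark points and stacks them across several kernel widths. This is a generic Nyström construction under assumed bandwidths, not the paper's specific algorithm or weighting scheme.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma):
    # Pairwise Gaussian (RBF) kernel with width sigma
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def nystrom_features(X, landmarks, sigma, eps=1e-10):
    # Nystrom approximation: rank-m feature map Phi with
    # Phi @ Phi.T ~= K(X, X), built from m landmark points
    W = gaussian_kernel(landmarks, landmarks, sigma)   # m x m
    C = gaussian_kernel(X, landmarks, sigma)           # n x m
    evals, evecs = np.linalg.eigh(W)
    evals = np.maximum(evals, eps)                     # guard tiny eigenvalues
    return C @ evecs / np.sqrt(evals)                  # approximate eigenfunctions

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 5))                      # toy data (assumption)
landmarks = X[rng.choice(200, 20, replace=False)]      # m = 20 landmarks

# Multiple scales: concatenating the per-width feature maps corresponds to
# an (equal-weight) sum of Gaussian kernels with different widths
sigmas = [0.5, 1.0, 2.0]                               # assumed bandwidths
Phi = np.hstack([nystrom_features(X, landmarks, s) for s in sigmas])

K_exact = sum(gaussian_kernel(X, X, s) for s in sigmas)
K_approx = Phi @ Phi.T                                 # low-rank approximation
```

A linear classifier trained on `Phi` then plays the role of a multi-scale kernel classifier at a fraction of the O(n^2) kernel-matrix cost, which is the point of combining Nyström with a divide-and-conquer scheme for very large datasets.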
Book: Proc. of the International Conference on Big Data
Pages: 128 - 133
Authors from: Higher Education