Publications
Unsupervised learning of disentangled representations in deep restricted kernel machines with orthogonality constraints KU Leuven
We introduce Constr-DRKM, a deep kernel method for the unsupervised learning of disentangled data representations. We propose augmenting the original deep restricted kernel machine formulation for kernel PCA with orthogonality constraints on the latent variables to promote disentanglement and to make it possible to carry out optimization without first defining a stabilized objective. After discussing a number of algorithms for end-to-end training, ...
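In the dual view of kernel PCA, the latent variables are eigenvectors of the centered Gram matrix and therefore already satisfy an orthogonality constraint of the form H^T H = I. A minimal numpy sketch on toy data (RBF kernel; this illustrates the orthogonality property only, not the Constr-DRKM training procedure itself):

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian (RBF) Gram matrix k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T
    return np.exp(-d2 / (2 * sigma**2))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))          # toy data: 50 points in 3 dimensions

K = rbf_kernel(X)
n = K.shape[0]
C = np.eye(n) - np.ones((n, n)) / n   # centering matrix
Kc = C @ K @ C                        # centered Gram matrix

# Dual kernel PCA: latent variables are the top eigenvectors of Kc.
eigvals, eigvecs = np.linalg.eigh(Kc)
H = eigvecs[:, ::-1][:, :5]           # top-5 components, columns orthonormal

print(np.allclose(H.T @ H, np.eye(5)))  # the orthogonality constraint holds
```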
Random Features for Kernel Approximation: A Survey on Algorithms, Theory, and Beyond KU Leuven
The class of random features is one of the most popular techniques to speed up kernel methods in large-scale problems. Related work has been recognized by the NeurIPS Test-of-Time Award in 2017 and as an ICML Best Paper Finalist in 2019. The body of work on random features has grown rapidly, and hence it is desirable to have a comprehensive overview of this topic, explaining the connections among various algorithms and theoretical results. In ...
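The classic instance of this idea is random Fourier features, which replace an implicit kernel by an explicit low-dimensional feature map whose inner products approximate the kernel. A minimal numpy sketch for the Gaussian kernel (toy data; the dimensions and bandwidth are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma, D = 1.0, 2000                  # kernel bandwidth, number of random features
X = rng.normal(size=(40, 5))          # toy data: 40 points in 5 dimensions

# Random Fourier features for k(x, y) = exp(-||x - y||^2 / (2 sigma^2)):
# sample frequencies from the kernel's spectral density and random phases.
W = rng.normal(scale=1.0 / sigma, size=(5, D))
b = rng.uniform(0, 2 * np.pi, size=D)
Z = np.sqrt(2.0 / D) * np.cos(X @ W + b)   # explicit feature map z(x)

K_approx = Z @ Z.T                    # inner products approximate the kernel

sq = np.sum(X**2, axis=1)
K_exact = np.exp(-(sq[:, None] + sq[None, :] - 2 * X @ X.T) / (2 * sigma**2))
print(np.abs(K_approx - K_exact).max())    # error shrinks as O(1/sqrt(D))
```

Downstream, a linear model trained on `Z` stands in for a kernel machine trained with the exact Gram matrix, at a fraction of the cost for large sample sizes.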
Tensor-based Restricted Kernel Machines for Multi-View Classification KU Leuven Vrije Universiteit Brussel
Generative Restricted Kernel Machines: A Framework for Multi-view Generation and Disentangled Feature Learning KU Leuven
This paper introduces a novel framework for generative models based on Restricted Kernel Machines (RKMs) with joint multi-view generation and uncorrelated feature learning, called Gen-RKM. To enable joint multi-view generation, the framework uses a shared representation of data from various views. Furthermore, the model has a primal and dual formulation to incorporate both kernel-based and (deep convolutional) neural-network-based models within ...
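The shared-representation idea can be illustrated with a toy linear sketch: features from two views are concatenated, a common latent subspace is extracted, and each view is then regenerated from that shared latent code. This is only an illustration of the principle under linear feature-map assumptions, not the Gen-RKM training objective:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100
h_true = rng.normal(size=(n, 2))           # ground-truth shared latent code
view1 = h_true @ rng.normal(size=(2, 6))   # view 1 features (linear map, assumption)
view2 = h_true @ rng.normal(size=(2, 4))   # view 2 features (linear map, assumption)

# Shared representation: top singular subspace of the concatenated views.
F = np.hstack([view1, view2])
U, s, Vt = np.linalg.svd(F, full_matrices=False)
H = U[:, :2] * s[:2]                       # shared latent variables

# "Generate" each view from the shared code by least-squares back-projection.
B1, *_ = np.linalg.lstsq(H, view1, rcond=None)
recon1 = H @ B1
print(np.allclose(recon1, view1, atol=1e-6))  # view 1 recovered from shared code
```

Because both views are driven by the same two-dimensional latent code, the concatenated feature matrix has rank 2 and each view is exactly recoverable from the shared subspace.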