
Universal Kernels on Non-Standard Input Spaces

Andreas Christmann, Ingo Steinwart

Advances in Neural Information Processing Systems (NeurIPS), pp. 406–414, 2010.


Abstract

During the last years support vector machines (SVMs) have been successfully applied even in situations where the input space X is not necessarily a subset of ℝd. Examples include SVMs using probability measures to analyse e.g. histograms or coloured images, SVMs for text classification and web mining, and SVMs for applications from computational biology using, e.g., kernels for trees and graphs. Moreover, SVMs are known to be consistent to the Bayes risk, if either the input space is a complete separable metric space and the reproducing kernel Hilbert space (RKHS) H ⊂ Lp(PX) is dense, or if the SVM is based on a universal kernel k. So far, however, there are no RKHSs of practical interest known that satisfy these assumptions if X ⊄ ℝd. We close this gap by providing a general technique based on Taylor-type kernels to explicitly construct universal kernels on compact metric spaces which are not subsets of ℝd. We apply this technique to the following special cases: universal kernels on the set of probability measures, universal kernels based on Fourier transforms, and universal kernels for signal processing.
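To give a feel for the kind of kernel the abstract describes, the following sketch builds a kernel on probability measures by composing the exponential function (whose Taylor series has strictly positive coefficients, i.e. a Taylor-type kernel) with the inner product of kernel mean embeddings, estimated from samples. This is an illustrative sketch under assumptions, not the paper's exact construction; the Gaussian base kernel, the bandwidth `sigma`, and the function names are choices made here for illustration.

```python
import numpy as np

def gauss(X, Y, sigma=1.0):
    # Gaussian base kernel matrix on R^d between sample sets X and Y.
    d = X[:, None, :] - Y[None, :, :]
    return np.exp(-np.sum(d**2, axis=-1) / (2.0 * sigma**2))

def mean_embedding_inner(X, Y, sigma=1.0):
    # Empirical estimate of <mu_P, mu_Q>_H, the inner product of the
    # kernel mean embeddings of P and Q, from samples X ~ P and Y ~ Q.
    return gauss(X, Y, sigma).mean()

def taylor_kernel_on_measures(X, Y, sigma=1.0):
    # Sketch of a Taylor-type kernel on probability measures:
    # K(P, Q) = exp(<mu_P, mu_Q>_H). The exponential's power series
    # has strictly positive coefficients, the property the paper's
    # Taylor-type construction exploits.
    return np.exp(mean_embedding_inner(X, Y, sigma))

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))           # samples from P
Y = rng.normal(loc=1.0, size=(60, 2))  # samples from Q
print(taylor_kernel_on_measures(X, Y))
```

In practice one would work with the squared MMD distance or a properly centred Gram matrix; the point here is only the composition "positive-coefficient power series ∘ mean-embedding inner product".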

BibTeX

@inproceedings{christmann10_neurips,
  title     = {Universal Kernels on Non-Standard Input Spaces},
  author    = {Christmann, Andreas and Steinwart, Ingo},
  year      = {2010},
  booktitle = {Advances in Neural Information Processing Systems (NeurIPS)},
  volume    = {23},
  pages     = {406--414},
  url       = {https://papers.nips.cc/paper/2010/hash/4e0cb6fb5fb446d1c92ede2ed8780188-Abstract.html}
}