On Robustness Properties of Convex Risk Minimization Methods for Pattern Recognition
Andreas Christmann, Ingo Steinwart
Journal of Machine Learning Research (JMLR), 5, pp. 1007–1034, 2004.
Abstract
The paper brings together methods from two disciplines: machine learning theory and robust statistics. We argue that robustness is an important aspect, and we show that many existing machine learning methods based on the convex risk minimization principle have, besides other good properties, also the advantage of being robust. Robustness properties of machine learning methods based on convex risk minimization are investigated for the problem of pattern recognition. Assumptions are given for the existence of the influence function of the classifiers and for bounds on the influence function. Kernel logistic regression, support vector machines, least squares and the AdaBoost loss function are treated as special cases. Some results on the robustness of such methods are also obtained for the sensitivity curve and the maxbias, which are two other robustness criteria. A sensitivity analysis of the support vector machine is given.
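As an illustrative sketch (not code from the paper, and using a deliberately simplified one-dimensional estimator), the sensitivity curve mentioned in the abstract can be computed empirically as SC_n(z) = (n + 1) * (T(x_1, ..., x_n, z) - T(x_1, ..., x_n)): the rescaled change in an estimator T after one contaminating point z is added. Here T is taken to be the ridge-regularized least squares slope, least squares being one of the loss functions the paper treats as a special case; the data and regularization parameter are invented for the example.

```python
# Hedged toy example: empirical sensitivity curve of a regularized
# least squares fit in one dimension (all data values are made up).

def ls_slope(xs, ys, lam=1.0):
    """Closed-form minimizer w of sum_i (y_i - w * x_i)^2 + lam * w^2."""
    return sum(x * y for x, y in zip(xs, ys)) / (sum(x * x for x in xs) + lam)

def sensitivity_curve(xs, ys, zx, zy, lam=1.0):
    """Rescaled change in the fitted slope after adding the point (zx, zy)."""
    n = len(xs)
    base = ls_slope(xs, ys, lam)
    pert = ls_slope(xs + [zx], ys + [zy], lam)
    return (n + 1) * (pert - base)

xs = [-2.0, -1.0, 1.0, 2.0]
ys = [-1.0, -1.0, 1.0, 1.0]  # labels in {-1, +1}, as in pattern recognition

print(sensitivity_curve(xs, ys, zx=1.0, zy=1.0))    # point consistent with the data
print(sensitivity_curve(xs, ys, zx=10.0, zy=-1.0))  # mislabelled far-out point
```

A point that agrees with the bulk of the data barely moves the fit, while a mislabelled point far from the data shifts it much more; the paper's results concern when such effects remain bounded for the actual kernel-based classifiers.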
BibTeX
@article{christmann04_jmlr,
  title   = {On Robustness Properties of Convex Risk Minimization Methods for Pattern Recognition},
  author  = {Christmann, Andreas and Steinwart, Ingo},
  year    = {2004},
  journal = {Journal of Machine Learning Research (JMLR)},
  volume  = {5},
  pages   = {1007--1034},
  doi     = {10.5555/1005332.1016792},
  url     = {http://www.jmlr.org/papers/volume5/christmann04a/christmann04a.pdf}
}