PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Consistency and convergence rates of one-class SVM and related algorithms
Régis Vert and Jean-Philippe Vert
In: NIPS 2005, Dec 5-8, 2005, Vancouver, Canada.

Abstract

We determine the asymptotic limit of the function computed by support vector machines (SVM) and related algorithms that minimize a regularized empirical convex loss function in the reproducing kernel Hilbert space of the Gaussian RBF kernel, in the situation where the number of examples tends to infinity, the bandwidth of the Gaussian kernel tends to zero, and the regularization parameter is held fixed. Non-asymptotic convergence bounds to this limit in the L2 sense are provided, together with upper bounds on the classification error, which is shown to converge to the Bayes risk, thereby proving the Bayes consistency of a variety of methods even though the regularization term does not vanish. These results are particularly relevant to the one-class SVM, for which the regularization cannot vanish by construction, and which is shown for the first time to be a consistent density level set estimator.
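To illustrate the density level set estimation setting discussed in the abstract, here is a minimal sketch using scikit-learn's OneClassSVM (the nu-parameterized one-class SVM of Scholkopf et al.) with a Gaussian RBF kernel as a stand-in for the estimator analyzed in the paper. The toy data, the bandwidth sigma, and the parameter nu are illustrative assumptions, not values from the paper; note also that the paper's asymptotics shrink the bandwidth with the sample size, whereas this sketch fixes it for a single sample.

# A minimal sketch of density level set estimation with a one-class SVM.
# Assumes scikit-learn's OneClassSVM as a stand-in for the paper's estimator;
# all parameter values below are illustrative, not taken from the paper.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)
# Sample from a 2-D Gaussian mixture whose density level sets we estimate.
X = np.vstack([
    rng.normal(loc=-2.0, scale=0.5, size=(200, 2)),
    rng.normal(loc=+2.0, scale=0.5, size=(200, 2)),
])

# Gaussian RBF kernel; gamma = 1 / (2 * sigma^2), so sigma plays the role of
# the kernel bandwidth that shrinks with n in the paper's asymptotic analysis.
sigma = 0.5
clf = OneClassSVM(kernel="rbf", gamma=1.0 / (2.0 * sigma**2), nu=0.1)
clf.fit(X)

# The estimated level set is {x : f(x) >= 0}: predict() returns +1 for points
# inside the set and -1 outside; decision_function() returns f itself.
grid = np.array([[-2.0, -2.0], [0.0, 0.0], [2.0, 2.0]])
print(clf.predict(grid))            # expected roughly [ 1 -1  1]
print(clf.decision_function(grid))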

EPrint Type: Conference or Workshop Item (Oral)
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
          Learning/Statistics & Optimisation
          Theory & Algorithms
ID Code: 1397
Deposited By: Jean-Philippe Vert
Deposited On: 28 November 2005