Consistency of One-Class SVM and Related Algorithms
Régis Vert and Jean-Philippe Vert
In: NIPS 2005, 5-10 Dec 2005, Vancouver, Canada.
We determine the asymptotic limit of the function computed by support
vector machines (SVM) and related algorithms that minimize a regularized
empirical convex loss function in the reproducing kernel Hilbert
space of the Gaussian RBF kernel, in the situation where the number of
examples tends to infinity, the bandwidth of the Gaussian kernel tends to
0, and the regularization parameter is held fixed. Non-asymptotic convergence
bounds to this limit in the L2 sense are provided, together with
upper bounds on the classification error, which is shown to converge to the
Bayes risk, thereby proving the Bayes-consistency of a variety of methods
even though the regularization term does not vanish. These results are
particularly relevant to the one-class SVM, for which the regularization
cannot vanish by construction, and which is shown for the first time to
be a consistent density level set estimator.
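As a concrete illustration of the setting described in the abstract, the following minimal sketch (not from the paper) uses scikit-learn's OneClassSVM with a Gaussian RBF kernel to estimate a density level set on a toy one-dimensional mixture. The bandwidth sigma, the parameter nu, and the toy density are illustrative assumptions; gamma = 1 / (2 * sigma**2) translates the bandwidth into scikit-learn's parametrization.

```python
# Illustrative sketch: one-class SVM as a density level set estimator.
# All parameter choices below (sigma, nu, the toy mixture) are assumptions
# for demonstration, not values taken from the paper.
import numpy as np
from sklearn.svm import OneClassSVM

rng = np.random.default_rng(0)

# Sample from a simple 1-D Gaussian mixture whose density we threshold.
n = 2000
X = np.concatenate([rng.normal(-2.0, 0.5, n // 2),
                    rng.normal(2.0, 0.5, n // 2)]).reshape(-1, 1)

# Gaussian RBF kernel with a small bandwidth sigma; the paper studies the
# regime where the bandwidth shrinks as n grows while the regularization
# is held fixed.
sigma = 0.3
clf = OneClassSVM(kernel="rbf", gamma=1.0 / (2.0 * sigma**2), nu=0.2)
clf.fit(X)

# The estimated level set is the region where the decision function is
# non-negative; for this bimodal mixture it should cover the two modes.
grid = np.linspace(-5.0, 5.0, 401).reshape(-1, 1)
inside = clf.decision_function(grid).ravel() >= 0

# Report where the estimated level set boundary crossings occur.
edges = np.flatnonzero(np.diff(inside.astype(int)))
print("level-set boundary crossings near:", grid.ravel()[edges])
```

With these settings the printed crossings bracket the two mixture components, matching the intuition that the one-class SVM recovers the region where the density exceeds a threshold tied to the (fixed) regularization.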