PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

A Quadratic Loss Multi-Class SVM for which a Radius-Margin Bound Applies
Yann Guermeur and E. Monfrini
INFORMATICA Volume 22, Number 1, pp. 1-25, 2011.

Abstract

Using a support vector machine (SVM) requires setting the values of two types of hyperparameters: the soft margin parameter C and the parameters of the kernel. To perform this model selection task, the method of choice is cross-validation. Its leave-one-out variant is known to produce an estimator of the generalization error that is almost unbiased. Its major drawback is its computational cost. To overcome this difficulty, several upper bounds on the leave-one-out error of the pattern recognition SVM have been derived. Among those bounds, the most popular one is probably the radius-margin bound. It applies to the hard margin machine and, by extension, to the 2-norm SVM. In this article, we introduce a variant of the multi-class SVM of Lee, Lin and Wahba: the M-SVM^2. This quadratic loss machine can be seen as a direct extension of the 2-norm SVM to the multi-class case. For this machine, a generalized radius-margin bound is then established.
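To give a concrete sense of the binary bound that the paper generalizes, the sketch below evaluates the classical radius-margin bound LOO_error <= R^2 / (gamma^2 * m) on a hypothetical hand-built separable dataset. The dataset, the separator w = (1, 0), and the centroid-based estimate of R are assumptions made for illustration, not taken from the paper; the centroid ball only upper-bounds the smallest enclosing ball, so the resulting bound is slightly loose.

```python
import numpy as np

# Hypothetical toy dataset: two linearly separable classes in the plane,
# symmetric about the vertical axis (an assumption for this sketch).
X = np.array([[-2.0, 0.0], [-2.0, 1.0], [-3.0, -1.0],
              [ 2.0, 0.0], [ 2.0, -1.0], [ 3.0, 1.0]])
y = np.array([-1, -1, -1, 1, 1, 1])
m = len(X)

# By symmetry, the maximum-margin separator here is x1 = 0, i.e.
# w = (1, 0), b = 0. The geometric margin gamma is the smallest
# distance from a training point to that hyperplane.
w = np.array([1.0, 0.0])
gamma = np.min(y * (X @ w)) / np.linalg.norm(w)

# Upper bound on the radius R of the smallest ball enclosing the data:
# the ball centred at the centroid covers every point.
R = np.max(np.linalg.norm(X - X.mean(axis=0), axis=1))

# Radius-margin bound on the leave-one-out error (hard margin case).
loo_bound = R ** 2 / (gamma ** 2 * m)
print(f"gamma={gamma}  R={R:.3f}  LOO bound={loo_bound:.3f}")
```

On this configuration gamma = 2 and R = sqrt(10), so the bound evaluates to 10/24, i.e. roughly 0.42; the paper's contribution is an analogous bound for the multi-class M-SVM^2.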

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Learning/Statistics & Optimisation
ID Code: 6621
Deposited By: Yann Guermeur
Deposited On: 08 March 2010