PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Empirical risk minimization for support vector classifiers
Fernando Pérez-Cruz, Ángel Navia-Vázquez, Aníbal R. Figueiras-Vidal and Antonio Artés-Rodríguez
IEEE Transactions on Neural Networks Volume 14, Number 2, pp. 296-303, 2003. ISSN 1045-9227

Abstract

In this paper, we propose a general technique for solving support vector classifiers (SVCs) with an arbitrary loss function, based on an iterative reweighted least squares (IRWLS) procedure. We further show that three properties of the SVC solution can be written as conditions on the loss function. This technique allows the empirical risk minimization (ERM) inductive principle to be implemented on large-margin classifiers while, at the same time, obtaining very compact solutions (in terms of the number of support vectors). The improvements obtained by changing the SVC loss function are illustrated with synthetic and real data examples.
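The IRWLS idea behind the paper can be sketched for the special case of a linear SVC with the standard hinge loss (the paper's formulation is more general and covers arbitrary losses and kernels): at each iteration the current slacks determine per-sample weights, and a weighted regularized least-squares problem is solved. The function name `irwls_svc` and all parameter choices below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def irwls_svc(X, y, C=1.0, n_iter=30, eps=1e-8):
    """Illustrative IRWLS solver for a linear SVC with hinge loss.

    Approximately minimizes 0.5*||w||^2 + C * sum_i max(0, 1 - y_i*(x_i.w + b))
    by repeatedly solving a weighted, regularized least-squares problem
    whose weights a_i = C / e_i are derived from the current slacks e_i.
    """
    n, d = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])        # absorb the bias b as a last column
    w = np.zeros(d + 1)
    for _ in range(n_iter):
        e = 1.0 - y * (Xb @ w)                  # slack e_i = 1 - y_i f(x_i)
        # IRWLS weights: active (e_i > 0) samples get a_i = C / e_i, rest get 0
        a = np.where(e > 0, C / np.maximum(e, eps), 0.0)
        # regularize w; a tiny jitter on the bias entry keeps the system nonsingular
        R = np.eye(d + 1)
        R[-1, -1] = 1e-8
        A = R + Xb.T @ (a[:, None] * Xb)        # normal equations of the weighted LS
        rhs = Xb.T @ (a * y)
        w = np.linalg.solve(A, rhs)
    return w[:-1], w[-1]
```

Each solve is the stationarity condition of the quadratic surrogate `0.5*w'Rw + 0.5*sum_i a_i e_i^2`, so the fixed point of the iteration satisfies the optimality conditions of the original hinge-loss problem; only the samples with positive slack (the support vectors) carry nonzero weight, which is the source of the compactness the abstract mentions.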

EPrint Type: Article
Subjects: Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 522
Deposited By: Fernando Pérez-Cruz
Deposited On: 24 December 2004