PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Prior Support Vector Machines: minimum-bound vs. maximum-margin classifiers
Amiran Ambroladze, Emilio Parrado-Hernandez and John Shawe-Taylor
(2005) Technical Report. Unpublished, Southampton, UK.

Abstract

In this paper we introduce a new algorithm to train Support Vector Machines that aims at minimising the PAC-Bayes bound on the error, instead of the traditional maximisation of the margin. The training of the classifier proceeds in two stages. First, some data are used to estimate a prior distribution over classifiers. Then, an optimisation procedure based on quadratic programming determines the classifier as the centre of the posterior distribution that minimises the PAC-Bayes bound according to the previously obtained prior. The computational burden of the new algorithm is comparable to that of a standard SVM training including an N-fold cross-validation based model selection. In this sense, the PAC-Bayes bound itself can be used to perform the model selection. The experimental work shows that this new algorithm achieves classifiers with a tighter PAC-Bayes bound than the original SVM, and sometimes with better generalisation capabilities.
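To make the quantity being minimised concrete, the sketch below evaluates a standard PAC-Bayes bound of the kind the abstract refers to: with probability at least 1 - delta, the true error of the stochastic classifier satisfies kl(q_hat || p) <= (KL(Q||P) + ln((m+1)/delta)) / m, where q_hat is the empirical stochastic error, KL(Q||P) is the divergence between posterior and prior, and m is the sample size. The bound is recovered by numerically inverting the binary KL divergence. This is an illustrative implementation under stated assumptions, not the authors' code; the function names and the bisection tolerance are our own choices.

```python
import math


def binary_kl(q, p):
    """KL divergence between Bernoulli(q) and Bernoulli(p), clipped for stability."""
    eps = 1e-12
    q = min(max(q, eps), 1.0 - eps)
    p = min(max(p, eps), 1.0 - eps)
    return q * math.log(q / p) + (1.0 - q) * math.log((1.0 - q) / (1.0 - p))


def kl_inverse(q_hat, rhs):
    """Largest p >= q_hat with binary_kl(q_hat, p) <= rhs, found by bisection."""
    lo, hi = q_hat, 1.0 - 1e-12
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if binary_kl(q_hat, mid) <= rhs:
            lo = mid
        else:
            hi = mid
    return lo


def pac_bayes_bound(q_hat, kl_qp, m, delta=0.05):
    """Upper bound on the true error of the stochastic classifier.

    q_hat  -- empirical error of the stochastic (Gibbs) classifier
    kl_qp  -- KL(Q||P) between posterior and prior over classifiers;
              for unit-variance Gaussians centred at w_Q and w_P this
              is ||w_Q - w_P||^2 / 2, which is what a data-dependent
              prior makes small
    m      -- number of training examples the bound is computed on
    delta  -- confidence parameter
    """
    rhs = (kl_qp + math.log((m + 1) / delta)) / m
    return kl_inverse(q_hat, rhs)
```

A prior estimated from held-out data shrinks ||w_Q - w_P|| relative to the usual origin-centred prior, reducing kl_qp and hence the bound; minimising this bound (rather than maximising the margin) is the optimisation the paper's training procedure performs, and comparing bound values across hyperparameter settings is how the bound doubles as a model-selection criterion.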

EPrint Type: Monograph (Technical Report)
Project Keyword: UNSPECIFIED
Subjects: Learning/Statistics & Optimisation
          Theory & Algorithms
ID Code: 1761
Deposited By: Emilio Parrado-Hernandez
Deposited On: 24 March 2009