PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Prior Support Vector Machines: minimum-bound vs. maximum-margin classifiers
Amiran Ambroladze, Emilio Parrado-Hernandez and John Shawe-Taylor
(2005) Technical Report. Unpublished, Southampton, UK.

Abstract

In this paper we introduce a new algorithm for training Support Vector Machines that minimises the PAC-Bayes bound on the error instead of performing the traditional maximisation of the margin. The training of the classifier proceeds in two stages. First, some of the data are used to estimate a prior distribution over classifiers. Then, an optimisation procedure based on quadratic programming determines the classifier as the centre of the posterior distribution that minimises the PAC-Bayes bound with respect to the previously obtained prior. The computational burden of the new algorithm is comparable to that of a standard SVM training including an N-fold cross-validation based model selection. In this sense, the PAC-Bayes bound itself can be used to perform the model selection. The experimental work shows that this new algorithm achieves classifiers with a tighter PAC-Bayes bound than the original SVM, and sometimes with better generalisation capabilities.
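The two-stage idea can be illustrated with a minimal sketch. This is not the paper's QP-based bound-minimisation algorithm: it uses a simple hinge-loss subgradient trainer as a stand-in for an SVM solver, toy synthetic data, and only shows why a data-estimated prior can shrink the KL term that the PAC-Bayes bound penalises (for spherical unit-variance Gaussians, the KL divergence between posterior and prior is half the squared distance between their centres). All names and data here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data (illustrative only, not the paper's experiments).
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = np.sign(X @ w_true)


def train_linear(X, y, epochs=100, lr=0.1):
    """Hinge-loss subgradient training; a crude stand-in for an SVM solver."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (w @ xi) < 1:  # point violates the margin
                w += lr * yi * xi
    return w / np.linalg.norm(w)  # unit-norm direction


# Stage 1: a subset of the data estimates the centre of the prior.
w_prior = train_linear(X[:100], y[:100])

# Stage 2: the remaining data determines the centre of the posterior.
w_post = train_linear(X[100:], y[100:])

# KL term between unit-variance spherical Gaussians centred at the two
# weight vectors -- the complexity term in the PAC-Bayes bound.
kl_learned_prior = 0.5 * np.linalg.norm(w_post - w_prior) ** 2
kl_origin_prior = 0.5 * np.linalg.norm(w_post) ** 2  # classical prior at 0

print(kl_learned_prior, kl_origin_prior)
```

Because both stages tend to recover similar separating directions, the learned prior sits close to the posterior and the KL term is typically much smaller than with the default prior at the origin, which is what tightens the bound.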

EPrint Type: Monograph (Technical Report)
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
Learning/Statistics & Optimisation
Theory & Algorithms
ID Code: 1782
Deposited By: Emilio Parrado-Hernandez
Deposited On: 28 November 2005