Prior Support Vector Machines: minimum-bound
vs. maximum-margin classifiers
In this paper we introduce a new algorithm to train Support Vector Machines that minimises the PAC-Bayes bound on the error rather than maximising the margin, as traditional SVM training does. The training of the classifier proceeds in two stages. First, part of the data is used to estimate a prior distribution over classifiers. Then, an optimisation procedure based on quadratic programming determines the classifier as the centre of the posterior distribution that minimises the PAC-Bayes bound with respect to the previously obtained prior. The computational burden of the new algorithm is comparable to that of training a standard SVM with model selection by N-fold cross-validation. In this sense, the PAC-Bayes bound itself can be used to perform model selection. The experimental work shows that the new algorithm produces classifiers with a tighter PAC-Bayes bound than the original SVM and, in some cases, with better generalisation capabilities.
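For reference, the PAC-Bayes bound in question is usually stated in the following standard form (the notation here is assumed for illustration, not taken verbatim from this paper): for any prior P over classifiers fixed before seeing the sample S of size m, with probability at least 1 - \delta over the draw of S, every posterior Q satisfies

\[
\mathrm{KL}\bigl(\hat{e}(Q)\,\|\,e(Q)\bigr) \;\le\; \frac{\mathrm{KL}(Q\,\|\,P) + \ln\frac{m+1}{\delta}}{m},
\]

where \hat{e}(Q) and e(Q) denote the empirical and true error rates of the Gibbs classifier drawn from Q, and \mathrm{KL} is the Kullback-Leibler divergence. Under this form, estimating P from held-out data shrinks the \mathrm{KL}(Q\,\|\,P) term, which is what allows the bound, and hence the trained classifier, to be tightened.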