PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Combining PAC-Bayesian and generic chaining bounds
Jean-Yves Audibert and Olivier Bousquet
Journal of Machine Learning Research, 8:863-889, 2007. ISSN 1533-7928.

Abstract

There exist many different generalization error bounds in statistical learning theory. Each of these bounds contains an improvement over the others for certain situations or algorithms. Our goal is, first, to underline the links between these bounds, and second, to combine the different improvements into a single bound. In particular we combine the PAC-Bayes approach introduced by McAllester (1998), which is interesting for randomized predictions, with the optimal union bound provided by the generic chaining technique developed by Fernique and Talagrand (see Talagrand, 1996), in a way that also takes into account the variance of the combined functions. We also show how this connects to Rademacher based bounds.
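For orientation, the PAC-Bayes approach mentioned above can be sketched in its commonly cited form (the notation here, with prior pi, posterior rho, sample size n, and confidence delta, is illustrative and not taken from this paper):

```latex
% Standard PAC-Bayes bound in the style of McAllester (1998):
% with probability at least 1 - \delta over an i.i.d. sample of size n,
% simultaneously for all posterior distributions \rho over predictors,
\[
  R(\rho) \;\le\; \hat{r}(\rho)
  \;+\; \sqrt{\frac{\mathrm{KL}(\rho \,\|\, \pi) + \ln\frac{2\sqrt{n}}{\delta}}{2n}},
\]
% where R(\rho) is the expected risk of the randomized predictor drawn
% from \rho, \hat{r}(\rho) its empirical risk, and \pi a prior chosen
% before seeing the data. The KL term plays the role of an optimal,
% data-dependent union bound over the class of posteriors.
```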

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
          Learning/Statistics & Optimisation
          Theory & Algorithms
ID Code: 3150
Deposited By: Jean-Yves Audibert
Deposited On: 29 December 2007