PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Relating the Rademacher and VC Bounds
Matti Kääriäinen
(2004) Technical Report. Department of Computer Science, University of Helsinki, Helsinki, Finland.

Abstract

In this technical report we investigate the relationship between generalization error bounds based on VC dimension and Rademacher penalties. We show that a version of the standard VC bound can be recovered from the Rademacher bound, thus providing a direct proof that Rademacher bounds are always at least as good as VC bounds (modulo a small constant factor). The proof highlights in a transparent way the properties of the learning sample that the Rademacher bound takes advantage of but the VC bound overlooks. This clarifies why and when Rademacher penalization yields better results than VC dimension bounds do. As a byproduct we get a new simple proof of the fact that the conditional expectation of Rademacher penalty can be upper bounded by a function of empirical shatter coefficients. Our empirical experiments show that the Rademacher bound can beat VC bounds even when the distribution generating the learning sample is as bad as can be.
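As a rough illustration of the central quantity, the empirical Rademacher penalty of a (finite) hypothesis class can be estimated by Monte Carlo over random sign vectors. The sketch below is our own illustration, not code from the report; the function name, the loss encoding, and the toy threshold class in the usage example are all assumptions made for demonstration.

```python
import random


def rademacher_penalty(hypotheses, sample, n_trials=200, seed=0):
    """Monte Carlo estimate of the empirical Rademacher penalty.

    hypotheses: a finite list of classifiers h(x) -> {0, 1}
                (illustrative; the theory applies to infinite classes too)
    sample:     list of (x, y) pairs with labels y in {0, 1}

    For each random sign vector sigma in {-1, +1}^n, computes
        sup_h (1/n) * sum_i sigma_i * 1[h(x_i) != y_i]
    and averages this supremum over n_trials draws of sigma.
    """
    rng = random.Random(seed)
    n = len(sample)
    total = 0.0
    for _ in range(n_trials):
        # Draw one Rademacher sign vector.
        sigma = [rng.choice((-1, 1)) for _ in range(n)]
        # Supremum of the signed empirical error over the class.
        best = max(
            sum(s * (h(x) != y) for s, (x, y) in zip(sigma, sample)) / n
            for h in hypotheses
        )
        total += best
    return total / n_trials


# Toy usage: threshold classifiers on [0, 1] (a hypothetical example class).
hyps = [lambda x, t=t: int(x >= t) for t in (0.0, 0.25, 0.5, 0.75, 1.0)]
data = [(i / 10, int(i / 10 >= 0.5)) for i in range(10)]
penalty = rademacher_penalty(hyps, data, n_trials=100)
```

The estimate depends only on the sample actually drawn, which is exactly the property the report exploits: unlike the distribution-free VC bound, the penalty can be small when the data happen to be easy for the class.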

EPrint Type: Monograph (Technical Report)
Subjects: Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 260
Deposited By: Matti Kääriäinen
Deposited On: 23 November 2004