PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Risk bounds for statistical learning
Pascal Massart and Élodie Nédélec
The Annals of Statistics, 2003.

Abstract

We propose a general theorem providing upper bounds for the risk of an empirical risk minimizer (ERM). We focus essentially on the binary classification framework. We extend Tsybakov's analysis of the risk of an ERM under margin-type conditions by using concentration inequalities for conveniently weighted empirical processes. This allows us to deal with ways of measuring the size of a class of classifiers other than the entropy with bracketing used in Tsybakov's work. In particular, we derive new risk bounds for the ERM when the classification rules belong to some VC-class under margin conditions, and we discuss the optimality of those bounds in a minimax sense.
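For readers skimming the listing, a minimal sketch of the objects involved may be useful; the notation below is standard for this setting but is assumed here rather than quoted from the paper. Given i.i.d. labelled observations $(X_1,Y_1),\dots,(X_n,Y_n)$ with $Y_i \in \{0,1\}$ and a class $\mathcal{S}$ of candidate classifiers $t : \mathcal{X} \to \{0,1\}$, the empirical risk minimizer is
\[
  \widehat{s} \in \operatorname*{arg\,min}_{t \in \mathcal{S}}
  \; \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{\, t(X_i) \neq Y_i \,\},
\]
and the bounds in question control the excess risk
\[
  \ell(s^{*}, \widehat{s})
  \;=\; \mathbb{P}\bigl(\widehat{s}(X) \neq Y\bigr)
  \;-\; \mathbb{P}\bigl(s^{*}(X) \neq Y\bigr),
\]
where $s^{*}$ denotes the Bayes classifier. Margin-type conditions relate this excess risk to a distance between $\widehat{s}$ and $s^{*}$, which is what allows rates faster than $n^{-1/2}$ for the ERM.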

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Learning/Statistics & Optimisation
ID Code: 629
Deposited By: Michele Sebag
Deposited On: 29 December 2004