PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Risk bounds for statistical learning
Pascal Massart and Elodie Nédélec
Annals of Statistics, Volume 34, Number 5, 2006. ISSN 0090-5364


We propose a general theorem providing upper bounds for the risk of an empirical risk minimizer (ERM). We essentially focus on the binary classification framework. We extend Tsybakov's analysis of the risk of an ERM under margin-type conditions by using concentration inequalities for conveniently weighted empirical processes. This allows us to deal with ways of measuring the "size" of a class of classifiers other than entropy with bracketing, as in Tsybakov's work. In particular, we derive new risk bounds for the ERM when the classification rules belong to some VC-class under margin conditions, and we discuss the optimality of those bounds in a minimax sense.
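As a rough sketch of the objects named in the abstract (the notation below is assumed for illustration and is not taken verbatim from the paper), the ERM over a class of classifiers and a margin condition of the type studied here can be written as:

```latex
% Binary classification: data (X_1, Y_1), ..., (X_n, Y_n) i.i.d., Y_i in {0, 1}.
% Empirical risk minimizer over a class C of classifiers:
\hat{f} \in \operatorname*{arg\,min}_{f \in \mathcal{C}}
  \frac{1}{n} \sum_{i=1}^{n} \mathbf{1}\{ f(X_i) \neq Y_i \}

% A margin condition in the spirit of Tsybakov, with
% \eta(x) = \mathbb{P}(Y = 1 \mid X = x) the regression function:
% for some h > 0,
| 2\eta(x) - 1 | \geq h \quad \text{for almost all } x.
```

Under a condition of this kind, the excess risk of the ERM can decay faster than the classical n^{-1/2} rate; the paper quantifies this for VC-classes and other complexity measures.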

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
Learning/Statistics & Optimisation
ID Code: 1744
Deposited By: Pascal Massart
Deposited On: 22 November 2006