PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Model selection by bootstrap penalization for classification
Magalie Fromont
Lecture Notes in Artificial Intelligence, 17th Annual Conference on Learning Theory (COLT 2004), Proceedings Volume 3120, pp. 285-299, 2004. ISSN 0302-9743.

Abstract

We consider the binary classification problem. Given an i.i.d. sample drawn from the distribution of an X × {0,1}-valued random pair, we propose to estimate the so-called Bayes classifier by minimizing the sum of the empirical classification error and a penalty term based on Efron's or i.i.d. weighted bootstrap samples of the data. We obtain exponential inequalities for such bootstrap-type penalties, which allow us to derive non-asymptotic properties for the corresponding estimators. In particular, we prove that these estimators achieve the global minimax risk over sets of functions built from Vapnik-Chervonenkis classes. The obtained results generalize those of Koltchinskii (2001) and of Bartlett, Boucheron, and Lugosi (2002) for Rademacher penalties, which can thus be seen as special examples of bootstrap-type penalties.
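The penalization scheme described in the abstract can be illustrated on a toy problem. The sketch below (my own illustration, not the paper's construction) uses nested histogram models F_D, whose empirical risk minimizer is the majority vote per cell, and a Monte Carlo estimate of an Efron-bootstrap penalty E_W sup_{f in F_D} (1/n) Σ_i (W_i − 1) loss_i(f) with multinomial bootstrap weights W; the model minimizing empirical risk plus penalty is selected. All names and the choice of model collection are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: X uniform on [0, 1], Bayes classifier 1{x > 0.5}, 15% label noise.
n = 500
X = rng.uniform(size=n)
Y = ((X > 0.5).astype(int) ^ (rng.uniform(size=n) < 0.15)).astype(int)

def fit_and_criterion(D, n_boot=200):
    """Histogram model with D cells: F_D = all {0,1}-labelings of the cells.
    Returns (minimal empirical risk over F_D, Efron-bootstrap penalty)."""
    cells = np.minimum((X * D).astype(int), D - 1)
    ones = np.bincount(cells, weights=Y, minlength=D)
    counts = np.bincount(cells, minlength=D)
    # Empirical risk minimizer over F_D = majority vote in each cell.
    emp_risk = np.minimum(ones, counts - ones).sum() / n
    # Penalty: E_W sup_{f in F_D} (1/n) sum_i (W_i - 1) loss_i(f),
    # W ~ Multinomial(n, (1/n, ..., 1/n)); the sup decomposes cell by cell.
    sups = np.empty(n_boot)
    for b in range(n_boot):
        W = rng.multinomial(n, np.full(n, 1.0 / n)) - 1.0
        A = np.bincount(cells, weights=W * (1 - Y), minlength=D)  # cell labeled 1
        B = np.bincount(cells, weights=W * Y, minlength=D)        # cell labeled 0
        sups[b] = np.maximum(A, B).sum() / n
    return emp_risk, sups.mean()

# Select the model minimizing empirical risk + bootstrap penalty.
models = [1, 2, 4, 8, 16, 32]
crit = {D: sum(fit_and_criterion(D)) for D in models}
best = min(crit, key=crit.get)
```

Replacing the multinomial weights by i.i.d. random weights (e.g. Rademacher signs in place of W_i − 1) yields the i.i.d. weighted, respectively Rademacher, variants mentioned in the abstract.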

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
ID Code: 777
Deposited By: Magalie Fromont
Deposited On: 30 December 2004