PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Empirical Bernstein Stopping
Vladimir Mnih, Csaba Szepesvari and Jean-Yves Audibert
In: Proceedings of the 25th International Conference on Machine Learning (ICML 2008), 5-9 July 2008, Helsinki, Finland.

Abstract

Sampling is a popular way of scaling up machine learning algorithms to large datasets. The question is often how many samples are needed. Adaptive stopping algorithms monitor performance online and can stop early, saving valuable resources. We consider problems where probabilistic guarantees are desired and demonstrate how recently introduced empirical Bernstein bounds can be used to design efficient stopping rules. We provide upper bounds on the sample complexity of the new rules, as well as empirical results on model selection and boosting in the filtering setting.
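To make the abstract's core idea concrete, the sketch below shows, in Python, how sampling can continue until an empirical Bernstein confidence interval certifies an estimate of the mean to within relative error eps with probability at least 1 - delta. It is a minimal illustration rather than the authors' exact algorithm: the per-step confidence schedule delta/(t(t+1)), the (1+eps)*LB >= (1-eps)*UB stopping test, the function names, and the bounded-range assumption are all assumptions made for this sketch.

import math
import random


def eb_stop(sample, eps, delta, value_range, max_samples=10**6):
    """Sample sequentially until an empirical Bernstein confidence interval
    certifies a relative-error-eps estimate of the mean (sketch only)."""
    total, total_sq = 0.0, 0.0
    for t in range(1, max_samples + 1):
        x = sample()
        total += x
        total_sq += x * x
        if t < 2:
            continue  # need at least two samples for an empirical variance
        mean = total / t
        var = max(total_sq / t - mean * mean, 0.0)  # empirical variance
        # Assumed union-bound schedule: spending delta/(t*(t+1)) at step t
        # keeps the total failure probability at most delta over all steps.
        log_term = math.log(3.0 * t * (t + 1) / delta)
        # Empirical Bernstein deviation bound for samples with range value_range.
        c_t = math.sqrt(2.0 * var * log_term / t) + 3.0 * value_range * log_term / t
        lb = max(abs(mean) - c_t, 0.0)
        ub = abs(mean) + c_t
        # Stop once the interval is tight enough for a relative-error guarantee.
        if (1.0 + eps) * lb >= (1.0 - eps) * ub:
            sign = 1.0 if mean >= 0 else -1.0
            estimate = sign * 0.5 * ((1.0 + eps) * lb + (1.0 - eps) * ub)
            return estimate, t
    return total / max_samples, max_samples


if __name__ == "__main__":
    random.seed(0)
    # Example: estimate the mean of a low-variance variable on [0, 1].
    est, n = eb_stop(lambda: random.betavariate(20, 5), eps=0.05, delta=0.05,
                     value_range=1.0)
    print(f"estimate = {est:.4f} after {n} samples (true mean = 0.8)")

Because the confidence radius shrinks with the empirical variance, low-variance problems stop after far fewer samples than a worst-case (Hoeffding-style) rule would require, which is the efficiency gain the abstract refers to.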

EPrint Type: Conference or Workshop Item (Paper)
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 4933
Deposited By: Csaba Szepesvari
Deposited On: 24 March 2009