PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Fast learning rates for plug-in classifiers under the margin condition
Jean-Yves Audibert
In: Empirical Processes and Asymptotic Statistics 2007, Jun 2007, Rennes, France.


It has recently been shown that, under the margin (or low noise) assumption, there exist classifiers attaining fast rates of convergence of the excess Bayes risk, i.e., rates faster than n^{-1/2}. Work on this subject has suggested two conjectures: (i) the best achievable fast rate is of order n^{-1}, and (ii) plug-in classifiers generally converge more slowly than classifiers based on empirical risk minimization. We show that neither conjecture is correct. In particular, we construct plug-in classifiers that achieve not only fast rates but also super-fast rates, i.e., rates faster than n^{-1}. We establish minimax lower bounds showing that the obtained rates cannot be improved.
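A plug-in classifier first estimates the regression function eta(x) = P(Y = 1 | X = x) and then predicts label 1 wherever the estimate is at least 1/2. The following is a minimal sketch of that rule; the Nadaraya-Watson kernel estimator, the Gaussian kernel, the bandwidth, and the synthetic data are illustrative assumptions, not the specific construction analyzed in the paper.

```python
import numpy as np

def plug_in_classifier(X_train, y_train, X_test, bandwidth=0.3):
    """Plug-in rule: estimate eta(x) = P(Y=1|X=x) with a
    Nadaraya-Watson kernel estimator, then threshold at 1/2.
    Estimator and bandwidth are illustrative choices."""
    # Squared distances between every test and training point
    d2 = ((X_test[:, None, :] - X_train[None, :, :]) ** 2).sum(axis=2)
    # Gaussian kernel weights
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    # Weighted average of labels estimates the regression function
    eta_hat = (w * y_train).sum(axis=1) / np.maximum(w.sum(axis=1), 1e-12)
    # Plug the estimate into the Bayes rule: predict 1 iff eta_hat >= 1/2
    return (eta_hat >= 0.5).astype(int)

# Synthetic one-dimensional example with a smooth regression function
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 1))
eta_true = 1.0 / (1.0 + np.exp(-4.0 * X[:, 0]))  # true eta(x)
y = (rng.uniform(size=500) < eta_true).astype(int)

pred = plug_in_classifier(X[:400], y[:400], X[400:])
acc = float((pred == y[400:]).mean())
```

Here the Bayes classifier predicts the sign of x, and the plug-in rule approximates it; its excess risk shrinks as the estimate of eta improves, which is the quantity whose convergence rate the paper studies.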

EPrint Type: Conference or Workshop Item (Invited Talk)
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
Learning/Statistics & Optimisation
Theory & Algorithms
ID Code: 3158
Deposited By: Jean-Yves Audibert
Deposited On: 30 December 2007