On the Complexity of Good Samples for Learning
Joel Ratsaby
Proc. Tenth International Computing and Combinatorics Conference (COCOON 2004), Jeju Island, Korea, Vol. 3106 of Lecture Notes in Computer Science, Springer-Verlag, pp. 198-209, 2004.

## Abstract

In machine learning, maximizing the sample margin can reduce the generalization error of learning. Hence samples on which the target function has a large margin $\gamma$ convey more information, so we expect there to be fewer such samples. In this paper, we estimate the complexity of a class of sets of large-margin samples for a general learning problem over a finite domain. We obtain an explicit dependence of this complexity on $\gamma$ and the sample size.
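To make the notion of margin concrete, here is a minimal sketch (not from the paper) of how one might compute per-sample margins under a linear hypothesis and pick out the large-margin points. The names `sample_margin` and `gamma_good` and the linear setting are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def sample_margin(w, X, y):
    """Margin of each labeled point (x_i, y_i) under a linear
    hypothesis w: y_i * <w, x_i> / ||w||.  A large positive value
    means the point is classified correctly with room to spare."""
    return y * (X @ w) / np.linalg.norm(w)

def gamma_good(w, X, y, gamma):
    """Indices of points whose margin is at least gamma
    (an illustrative 'large-margin sample' in the sense above)."""
    return np.where(sample_margin(w, X, y) >= gamma)[0]

# Toy data: w has unit norm, so margins are just y * <w, x>.
w = np.array([1.0, 0.0])
X = np.array([[2.0, 0.0], [0.5, 1.0], [-1.0, 0.0]])
y = np.array([1, 1, -1])
print(sample_margin(w, X, y))    # → [2.  0.5 1. ]
print(gamma_good(w, X, y, 1.0))  # → [0 2]
```

In this toy example only two of the three points have margin at least $\gamma = 1$, illustrating the intuition that large-margin samples are the scarcer, more informative ones.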

EPrint Type: Article
Future updated versions of this work are available at http://www.bgu.ac.il/~ratsaby/Publications.htm
Joel Ratsaby, 30 December 2004