PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

An improved VC dimension bound for sparse polynomials
Michael Schmitt
In: 17th Annual Conference on Learning Theory (COLT '04), 01-04 July 2004, Banff, Canada.

There is a more recent version of this eprint available.


We show that the function class consisting of $k$-sparse polynomials in $n$ variables has Vapnik-Chervonenkis (VC) dimension at least $nk+1$. This result supersedes the previously known lower bound obtained via $k$-term monotone disjunctive normal form (DNF) formulas by Littlestone (1988). Moreover, it implies that the VC dimension for $k$-sparse polynomials is strictly larger than the VC dimension for $k$-term monotone DNF. The new bound is achieved by introducing an exponential approach that employs Gaussian radial basis function (RBF) neural networks to obtain classifications of points in terms of sparse polynomials.
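To make the statement concrete: a VC dimension of at least $nk+1$ means there exists a set of $nk+1$ points that is shattered, i.e., every $\pm$-labelling of the set is realised by the sign of some $k$-sparse polynomial. The following is a minimal brute-force sketch (not from the paper; the point set and the search grid over coefficients and degrees are illustrative assumptions) that verifies the smallest case $n = k = 1$, where the bound promises a shattered set of size $nk + 1 = 2$:

```python
from itertools import product

def sign_pattern(coeff, degree, points):
    """Classify each point by the sign of the monomial coeff * x**degree."""
    return tuple(coeff * (x ** degree) > 0 for x in points)

def is_shattered(points, coeffs, degrees):
    """True if every +/- labelling of `points` is realised by some monomial
    (a 1-sparse polynomial) drawn from the given coefficient/degree grid."""
    patterns = {sign_pattern(c, d, points) for c, d in product(coeffs, degrees)}
    return len(patterns) == 2 ** len(points)

# The set {-1, +1} is shattered by 1-sparse polynomials in one variable:
# c*x^0 with c = +/-1 gives the two constant labellings, and c*x^1 with
# c = +/-1 gives the two alternating ones.
points = [-1.0, 1.0]
print(is_shattered(points, coeffs=[-1.0, 1.0], degrees=[0, 1]))  # True
```

For larger $n$ and $k$ such an exhaustive check over a finite grid can only confirm shattering, never refute it; the paper's contribution is a general construction rather than a case-by-case search.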

EPrint Type: Conference or Workshop Item (Talk)
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 93
Deposited By: Michael Schmitt
Deposited On: 18 May 2004
