PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

On the capabilities of higher-order neurons: a radial basis function approach
Michael Schmitt
Neural Computation 2004. ISSN 0899-7667



Higher-order neurons with $k$ monomials in $n$ variables are shown to have Vapnik-Chervonenkis (VC) dimension at least $nk+1$. This result supersedes the previously known lower bound obtained via $k$-term monotone disjunctive normal form (DNF) formulas. Moreover, it implies that the VC dimension of higher-order neurons with $k$ monomials is strictly larger than the VC dimension of $k$-term monotone DNF. The result is achieved by introducing an exponential approach that employs Gaussian radial basis function (RBF) neural networks for obtaining classifications of points in terms of higher-order neurons.
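To make the model concrete: a higher-order neuron with $k$ monomials in $n$ variables computes a weighted sum of $k$ products of input variables and thresholds the result. The following sketch is purely illustrative (the function names, the specific monomials, and the zero threshold are assumptions for the example, not taken from the paper):

```python
def higher_order_neuron(weights, monomials, threshold):
    """Return a classifier computed by a higher-order neuron.

    The neuron outputs 1 iff  sum_j w_j * prod_{i in M_j} x_i >= threshold,
    where each M_j is a tuple of variable indices defining one monomial.
    (Illustrative sketch; names and parameters are not from the paper.)
    """
    def classify(x):
        s = 0.0
        for w, indices in zip(weights, monomials):
            prod = 1.0
            for i in indices:
                prod *= x[i]
            s += w * prod
        return 1 if s >= threshold else 0
    return classify

# Example: n = 3 variables, k = 2 monomials (x0*x1 and x2)
f = higher_order_neuron(weights=[1.0, -2.0],
                        monomials=[(0, 1), (2,)],
                        threshold=0.0)
print(f([1.0, 1.0, 0.0]))  # 1*(1*1) - 2*0 = 1  >= 0  -> 1
print(f([1.0, 1.0, 1.0]))  # 1*(1*1) - 2*1 = -1 <  0  -> 0
```

With $k$ monomials there are $k$ real weights plus the choice of monomials, which is the parameter count behind the $nk+1$ lower bound on the VC dimension stated above.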

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
Learning/Statistics & Optimisation
Theory & Algorithms
ID Code: 245
Deposited By: Michael Schmitt
Deposited On: 23 November 2004
