PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

On learning a function of perceptrons
Martin Anthony
In: IJCNN 2004, 26-29 July, Budapest, Hungary.

Abstract

This paper concerns the generalization accuracy obtained when training a classifier that is a fixed Boolean function of the outputs of a number of perceptrons. The analysis involves the 'margins' achieved by the constituent perceptrons on the training data. A special case is that in which the fixed Boolean function is the majority function (so that we have a 'committee of perceptrons'). Recent work of Auer et al. studied the computational properties of such networks (where they were called 'parallel perceptrons') and proposed an incremental learning algorithm for them. The results given here provide further motivation for the use of this learning rule.
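For concreteness, the following is a minimal Python sketch (illustrative only, not taken from the paper or from Auer et al.) of the architecture the abstract describes: a fixed Boolean function, here the majority function, applied to the outputs of several perceptrons, together with the margins achieved by the constituent perceptrons on an input. All variable names and parameter values are hypothetical.

    import numpy as np

    def perceptron_outputs(W, b, x):
        # +/-1 output of each constituent perceptron on input x
        return np.sign(W @ x + b)

    def committee_predict(W, b, x):
        # 'Committee of perceptrons': the fixed Boolean function is the
        # majority function applied to the perceptron outputs.
        votes = perceptron_outputs(W, b, x)
        return 1 if votes.sum() > 0 else -1

    def margins(W, b, x):
        # Margin of each perceptron on x: the distance |w.x + b| / ||w||
        # from x to that perceptron's separating hyperplane.
        activations = W @ x + b
        return np.abs(activations) / np.linalg.norm(W, axis=1)

    # Example: 3 perceptrons in 2 dimensions (an odd number avoids ties).
    W = np.array([[1.0, 2.0], [-1.0, 0.5], [0.3, -1.0]])
    b = np.array([0.1, -0.2, 0.0])
    x = np.array([0.5, 1.5])
    print(committee_predict(W, b, x), margins(W, b, x))

The sketch only shows prediction; the incremental training rule studied by Auer et al. is not reproduced here.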

EPrint Type: Conference or Workshop Item (Poster)
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics; Theory & Algorithms
ID Code: 113
Deposited By: Martin Anthony
Deposited On: 22 May 2004