PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

On the classification capability of sign-constrained perceptrons.
Robert Legenstein and Wolfgang Maass
Neural Computation, 2007.

Abstract

The perceptron (also referred to as a McCulloch-Pitts neuron, or linear threshold gate) is commonly used as a simplified model for the discrimination and learning capability of a biological neuron. Criteria that tell us how many dichotomies a perceptron can implement (or learn to implement) over a given set of input patterns are well known, but only for the idealized case in which one assumes that the sign of a synaptic weight can be switched during learning. We present in this article an analysis of the learning capability of the biologically more realistic model of a sign-constrained perceptron, where the signs of the synaptic weights remain fixed during learning (i.e., they obey Dale's law). In particular, the VC-dimension of sign-constrained perceptrons is determined, and a necessary and sufficient criterion is provided that tells us when all 2^m dichotomies over a given set of m patterns can be learned by a sign-constrained perceptron. We also exhibit cases where the sign constraint of a perceptron drastically reduces its classification capability.
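To make the model concrete, here is a minimal sketch (not the authors' code, and not part of the paper) of perceptron learning under a sign constraint. Each weight is assigned a fixed sign in advance, as under Dale's law, and after every standard perceptron update any weight whose sign disagrees with its prescribed sign is projected back to zero. Function and variable names are illustrative.

```python
import numpy as np

def train_sign_constrained(X, y, signs, epochs=100):
    """Perceptron learning where weight i must satisfy sign(w_i) == signs[i]
    (or w_i == 0).  X: (m, n) patterns, y: (m,) labels in {-1, +1},
    signs: (n,) prescribed signs in {-1, +1}."""
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) <= 0:   # misclassified (or on the boundary)
                w += yi * xi              # standard perceptron update
                # enforce the sign constraint: zero out any weight whose
                # current sign disagrees with its prescribed sign
                w[w * signs < 0] = 0.0
                errors += 1
        if errors == 0:                   # all patterns classified correctly
            break
    return w

# Toy example: a dichotomy over 4 patterns that is separable with
# nonnegative input weights and a negative bias (the bias is modeled
# as an extra constant input with its own prescribed sign).
X = np.array([[1.0, 0.1], [0.1, 1.0], [0.2, 0.2], [0.1, 0.1]])
y = np.array([1, 1, -1, -1])
Xb = np.hstack([X, np.ones((4, 1))])
signs = np.array([1, 1, -1])  # two "excitatory" weights, negative bias
w = train_sign_constrained(Xb, y, signs)
print(np.sign(Xb @ w))        # [ 1.  1. -1. -1.]
```

Note that this simple projection step is only a heuristic illustration of learning under fixed weight signs; whether a given dichotomy is learnable at all under the constraint is exactly the question the paper's criterion answers.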

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
Learning/Statistics & Optimisation
Theory & Algorithms
ID Code: 2600
Deposited By: Wolfgang Maass
Deposited On: 22 November 2006