On the classification capability of sign-constrained perceptrons.
The perceptron (also referred to as McCulloch-Pitts neuron, or linear threshold gate) is commonly used as a simplified model for the discrimination and learning capability of a biological neuron. Criteria that tell us how many dichotomies a perceptron can implement (or learn to implement) over a given set of input patterns are well known, but only for the idealized case where one assumes that the sign of a synaptic weight can be switched during learning. In this article we analyze the learning capability of the biologically more realistic model of a sign-constrained perceptron, where the signs of the synaptic weights remain fixed during learning (i.e., they obey Dale's law). In particular, the VC-dimension of sign-constrained perceptrons is determined, and a necessary and sufficient criterion is provided that tells us when all 2^m dichotomies over a given set of m patterns can be learned by a sign-constrained perceptron. We also exhibit cases where the sign constraint of a perceptron drastically reduces its classification capability.
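To make the model concrete, the following is a minimal sketch (not the paper's construction) of a sign-constrained perceptron trained with an illustrative projection variant of the perceptron learning rule: after each standard update, any weight whose sign would flip is clipped to zero, so the prescribed excitatory/inhibitory signs stay fixed while the threshold (bias) remains unconstrained. The function name and the projection strategy are assumptions for illustration.

```python
import numpy as np

def train_sign_constrained(X, y, signs, epochs=100, lr=0.1):
    """Perceptron learning with fixed weight signs (Dale's law).

    signs[i] is +1 (excitatory) or -1 (inhibitory). After each
    perceptron update, weights whose sign would flip are clipped
    to zero; the bias is left unconstrained. This projection
    variant is an illustrative assumption, not the paper's method.
    """
    w = signs * 0.01          # start with tiny weights of the prescribed sign
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            pred = 1 if w @ xi + b > 0 else -1
            if pred != yi:
                w += lr * yi * xi
                b += lr * yi
                # project back onto the sign-constrained cone
                w[np.sign(w) == -signs] = 0.0
                errors += 1
        if errors == 0:       # converged: all patterns classified correctly
            break
    return w, b

# A dichotomy realizable with purely excitatory weights: logical AND
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([-1, -1, -1, 1])
signs = np.array([1., 1.])    # both synapses excitatory
w, b = train_sign_constrained(X, y, signs)
```

Note that a dichotomy that is linearly separable without constraints need not be learnable once the signs are fixed; with both synapses constrained to be excitatory, for instance, the labeling that is positive only on (0, 0) cannot be realized.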