PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

A spiking neuron as information bottleneck
Lars Büsing and Wolfgang Maass
Neural Computation 2009.

Abstract

Neurons receive thousands of presynaptic input spike trains while emitting a single output spike train. This drastic dimensionality reduction suggests considering a neuron as a bottleneck for information transmission. Extending recent results, we propose a simple learning rule for the weights of spiking neurons, derived from the Information Bottleneck (IB) framework, that minimizes the loss of relevant information transmitted in the output spike train. In the IB framework, relevance of information is defined with respect to contextual information, the latter entering the proposed learning rule as a “third” factor besides the pre- and postsynaptic activities. This renders the theoretically motivated learning rule a plausible model for experimentally observed synaptic plasticity phenomena involving three factors. Furthermore, we show that the proposed IB learning rule allows spiking neurons to learn a “predictive code”, i.e., to extract those parts of their input that are predictive of future input.
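To make the IB trade-off mentioned in the abstract concrete, here is a minimal sketch for discrete variables (not from the paper, which treats stochastic spiking neurons): an encoder p(y|x) is scored by the objective L = I(Y;R) − γ·I(Y;X), where X is the input, R the contextual (relevance) variable, and γ trades compression against relevant-information preservation. All function names and the toy distributions are hypothetical.

```python
import numpy as np

def mutual_info(p_joint):
    """I(A;B) in nats for a joint distribution given as a 2-D array."""
    p_a = p_joint.sum(axis=1, keepdims=True)   # marginal over rows
    p_b = p_joint.sum(axis=0, keepdims=True)   # marginal over columns
    nz = p_joint > 0                            # avoid log(0) terms
    return float(np.sum(p_joint[nz] * np.log(p_joint[nz] / (p_a @ p_b)[nz])))

def ib_objective(p_xr, p_y_given_x, gamma):
    """IB trade-off L = I(Y;R) - gamma * I(Y;X).

    p_xr:        joint over input X and relevance variable R, shape (|X|, |R|)
    p_y_given_x: stochastic encoder, rows sum to 1, shape (|X|, |Y|)
    gamma:       trade-off between compression and relevance
    """
    p_x = p_xr.sum(axis=1)                # marginal p(x)
    p_xy = p_y_given_x * p_x[:, None]     # joint p(x, y)
    # Markov chain R -- X -- Y: p(y, r) = sum_x p(y|x) p(x, r)
    p_yr = p_y_given_x.T @ p_xr
    return mutual_info(p_yr) - gamma * mutual_info(p_xy)
```

For example, when X is perfectly predictive of R, an identity encoder scores (1 − γ)·log 2 while a constant (fully compressing) encoder scores 0, so for γ < 1 the objective favors transmitting the relevant bits. The learning rule in the paper performs gradient ascent on such an objective with respect to the synaptic weights, with the contextual signal supplying the third factor.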

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 6085
Deposited By: Michael Pfeiffer
Deposited On: 08 March 2010