
A spiking neural network implementation of MCMC sampling
Lars Büsing, Johannes Bill, Bernhard Nessler and Wolfgang Maass
In: Cosyne 2011, 24 Feb - 01 Mar 2011, Salt Lake City, Utah.

Abstract

Probabilistic models have proven to be accurate descriptions of many aspects of human cognition, such as decision making, visual perception and concept learning, as well as of motor control. The fact that human performance in many cognitive tasks is approximately Bayes-optimal raises the intriguing question of how probabilistic computations, such as inference and learning, could be implemented or approximated in cortical networks of spiking neurons. Here we present a possible neural implementation of probabilistic inference and learning that is based on sampling. We propose a neural network model, operating in continuous time, which is shown to sample exactly from an arbitrary member of a certain family of probability distributions, namely Boltzmann machines (BMs). More precisely, the spiking activity of the network (convolved with the EPSP kernel) at every point in time is a sample from the underlying BM. The network consists of spiking neuron models that capture non-trivial dynamical phenomena, e.g., a refractory mechanism. In simulations we show that the proposed neural networks exhibit realistic temporal activity patterns such as sparse and irregular firing. These firing patterns match experimental observations considerably better than those produced by alternative sampling algorithms for BMs, such as Gibbs or Metropolis-Hastings sampling. From a machine learning perspective, a time-discretized version of the network dynamics exactly implements an MCMC sampling procedure for BMs. In the continuous-time limit, we use the differential Chapman-Kolmogorov equation to show that the network dynamics have the desired BM as their stationary distribution. Furthermore, the model benefits from the properties of learning algorithms for BMs, which, in the context of neural networks, can be interpreted as local Hebbian rules for synaptic plasticity. These results show that it is possible to sample from, and learn the parameters of, complex probability distributions with biologically plausible spiking network models operating in continuous time.
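To make the discrete-time view concrete, the following is a minimal NumPy sketch of Gibbs sampling in a Boltzmann machine, together with the standard BM learning rule (pairwise correlations under the data distribution minus those under the model distribution), i.e., the generic MCMC and Hebbian-learning machinery the abstract refers to. It is an illustration under simple assumptions (symmetric weights, zero diagonal, binary units, single-sample correlation estimates), not the authors' spiking-network implementation; all function and variable names are illustrative.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def gibbs_sweep(z, W, b):
        # One sweep of Gibbs sampling over the binary units z in {0,1}^n.
        # Stationary distribution: p(z) proportional to exp(b.z + z.W.z / 2).
        for k in rng.permutation(len(z)):
            u_k = b[k] + W[k] @ z              # local "membrane potential"
            z[k] = float(rng.random() < sigmoid(u_k))
        return z

    def hebbian_update(W, b, z_data, z_model, eta=0.01):
        # BM learning rule: difference of pairwise correlations under the
        # data and the model distribution -- a local Hebbian rule. In
        # practice both terms are averaged over many samples.
        W += eta * (np.outer(z_data, z_data) - np.outer(z_model, z_model))
        np.fill_diagonal(W, 0.0)               # no self-connections
        b += eta * (z_data - z_model)
        return W, b

    # Example: draw samples from a small random BM.
    n = 10
    W = rng.normal(scale=0.1, size=(n, n))
    W = (W + W.T) / 2.0                        # symmetric weights
    np.fill_diagonal(W, 0.0)
    b = rng.normal(scale=0.1, size=n)
    z = rng.integers(0, 2, size=n).astype(float)
    for _ in range(1000):                      # after burn-in, z ~ p(z)
        z = gibbs_sweep(z, W, b)

In the continuous-time network described in the abstract, the role of the binary state z is played by the spike trains convolved with the EPSP kernel; the sketch above shows only the time-discretized MCMC counterpart that the network dynamics are related to.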

EPrint Type: Conference or Workshop Item (Spotlight)
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics; Theory & Algorithms
ID Code: 7561
Deposited By: Bernhard Nessler
Deposited On: 17 March 2011