## Abstract

Probabilistic models have proven to be accurate descriptions of many aspects of human cognition, such as decision making, visual perception, concept learning, and motor control. The fact that human performance in many cognitive tasks is approximately Bayes optimal raises the intriguing question of how probabilistic computations, such as inference and learning, could be implemented or approximated in cortical networks of spiking neurons. Here we present a possible neural implementation of probabilistic inference and learning based on sampling. We propose a neural network model, operating in continuous time, that is shown to sample exactly from an arbitrary member of a certain family of probability distributions, namely Boltzmann machines (BMs). More precisely, the spiking activity of the network (convolved with the EPSP kernel) at every point in time is a sample from the underlying BM. The network consists of spiking neuron models that capture non-trivial dynamical phenomena, e.g., a refractory mechanism. In simulations we show that the proposed neural networks exhibit realistic temporal activity patterns such as sparse and irregular firing. These firing patterns match experimental observations considerably better than those produced by alternative sampling algorithms for BMs, such as Gibbs or Metropolis-Hastings sampling. From a machine learning perspective, a time-discretized version of the network dynamics exactly implements a Markov chain Monte Carlo (MCMC) sampling procedure in BMs. In the continuous-time limit, we show using the differential Chapman-Kolmogorov equation that the neural network dynamics have the desired BM as their stationary distribution. Furthermore, the model benefits from the properties of learning algorithms for BMs which, in the context of neural networks, can be interpreted as local Hebbian rules for synaptic plasticity.
These results show that it is possible to sample from and learn parameters of complex probability distributions with biologically plausible spiking network models operating in continuous time.
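The abstract contrasts the proposed spiking network with standard MCMC samplers for BMs, such as Gibbs sampling. As a point of reference, the sketch below shows what that baseline looks like: a plain Gibbs sampler for a small BM, where each binary unit is resampled from the logistic function of its local field. This is illustrative only; the network size, weights, and sweep count are assumptions for the example, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny, illustrative Boltzmann machine (NOT the paper's spiking network):
# symmetric weights W with zero diagonal, and biases b.
K = 5
W = rng.normal(scale=0.5, size=(K, K))
W = (W + W.T) / 2.0
np.fill_diagonal(W, 0.0)
b = rng.normal(scale=0.5, size=K)

def gibbs_sweep(z, W, b, rng):
    """One Gibbs sweep: each binary unit z_k is resampled from its
    conditional p(z_k = 1 | rest) = sigma(b_k + sum_j W_kj z_j)."""
    for k in range(len(z)):
        u = b[k] + W[k] @ z               # local field ("membrane potential")
        p = 1.0 / (1.0 + np.exp(-u))      # logistic activation
        z[k] = 1.0 if rng.random() < p else 0.0
    return z

# Run the chain to draw (correlated) samples from the Boltzmann
# distribution p(z) proportional to exp(b.z + z.W.z / 2).
n_sweeps = 20000
z = rng.integers(0, 2, size=K).astype(float)
samples = np.empty((n_sweeps, K))
for t in range(n_sweeps):
    z = gibbs_sweep(z, W, b, rng)
    samples[t] = z

# K is small enough to enumerate all 2^K states, so we can check the
# empirical marginals against the exact ones.
states = np.array([[(i >> k) & 1 for k in range(K)]
                   for i in range(2 ** K)], dtype=float)
logp = states @ b + 0.5 * np.einsum('si,ij,sj->s', states, W, states)
p_exact = np.exp(logp - logp.max())
p_exact /= p_exact.sum()
marg_exact = p_exact @ states
marg_mcmc = samples.mean(axis=0)
print(np.abs(marg_mcmc - marg_exact).max())
```

Because the units are updated sequentially in discrete steps, this baseline produces the regular, dense update pattern that the abstract argues matches neural data poorly; the paper's contribution is a continuous-time spiking dynamics with the same stationary distribution.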