
Introduction to the special issue on echo state networks and liquid state machines.
Herbert Jäger, Wolfgang Maass and Jose Principe
Neural Networks Volume 20, Number 3, pp. 287-289, 2007.

Abstract

Seeking plausible models for brain computation has been a continuing effort in neuroscience, computer science, biophysics and machine learning. Generally speaking, there are two routes toward understanding brains. In a bottom-up way, one can attempt to re-create artificial brain structures from empirical observations, whose emergent dynamics are then studied by simulation, searching for dynamical patterns that can be understood in terms of information processing. Conversely, one can proceed top-down, starting from computational metaphors taken from computer science or from signal processing and control engineering, and try to synthesize artificial brain modules from these principles while relating them to what is known about biological brains. Whatever route one takes, at some point one has to introduce a computational mechanism, some mathematical information processing principle. A large number of such principles have been considered – we mention only logical calculi, Turing computation, "cybernetic" regulation mechanisms, energy-minimizing and particle dynamics motivated by information theory and statistical physics, field theories and pattern-forming nonlinear PDEs, or chaotic attractor dynamics. Originating from outside neuroscience, these information processing mechanisms are often difficult to connect to known neural processing mechanisms. There is, however, a subset of such principles of very elementary nature whose main learning and activation phenomena can readily be mapped to biology, and which have indeed been partially motivated by natural neural systems. We would count in this category the perceptron (trained with the Widrow-Hoff rule), Hopfield networks, and self-organizing maps. The elementary nature of these models makes them amenable to mathematical analysis and invites mapping them to biological brains in many ways. We view echo state networks and liquid state machines, the heroes of this special issue, as a further member of this family of versatile basic computational metaphors with a clear biological footing.
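
To illustrate the elementary nature of the reservoir computing idea referred to above, the following is a minimal echo state network sketch in Python. Only the linear readout is trained; the recurrent reservoir is fixed and random. The reservoir size, spectral radius, ridge parameter and the sine-prediction toy task are illustrative assumptions, not taken from the special issue.

import numpy as np

rng = np.random.default_rng(0)

# Toy task (assumed): one-step-ahead prediction of a sine wave.
T = 1000
u = np.sin(0.2 * np.arange(T + 1))
inputs, targets = u[:-1], u[1:]

# Fixed random reservoir; only the readout weights W_out are learned.
N = 100                                      # reservoir size (assumed)
W_in = 0.5 * (rng.random((N, 1)) - 0.5)      # input weights
W = rng.random((N, N)) - 0.5                 # recurrent weights
W *= 0.9 / max(abs(np.linalg.eigvals(W)))    # rescale spectral radius to 0.9

# Drive the reservoir with the input and collect its states.
states = np.zeros((T, N))
x = np.zeros(N)
for t in range(T):
    x = np.tanh(W_in[:, 0] * inputs[t] + W @ x)
    states[t] = x

# Discard an initial washout, then fit the readout by ridge regression.
washout = 100
X, Y = states[washout:], targets[washout:]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ Y)

print("training MSE:", np.mean((X @ W_out - Y) ** 2))

The key design point, in keeping with the "elementary principle" theme of the abstract, is that training reduces to a single linear regression on the collected reservoir states, while the recurrent dynamics themselves are never adapted.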

EPrint Type: Article
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 3456
Deposited By: Wolfgang Maass
Deposited On: 11 February 2008