Analysis of Neural Data and Models of Neural Networks related to Working Memory
The ability to store information and to retrieve it at a later point in time is arguably one of the most essential functions of one of the most fascinating systems evolution has developed: the brain. Working memory is used to maintain information over relatively brief time intervals, typically on the time scale of up to several seconds. This thesis investigates several aspects of working memory and provides new insights into the formation of memory traces in the brain. Two approaches to this topic are covered by the present work. The first studies the phenomenon of working memory formation from the perspective of electrophysiological data analysis and signal processing, while in the second, working memory traces and memory-dependent computations are generated through reward-modulated learning in a computational model of a recurrent neural network.

The first part, comprising two chapters, reveals local interaction patterns between neural populations within extrastriate visual area V4, as well as long-range interactions between two distant cortical areas, V4 and the lateral prefrontal cortex (lPF), of macaque monkeys performing a visual short-term memory task. The analysis is based on simultaneous recordings of local field potentials (LFP) and spiking activity of single units obtained extracellularly from awake, behaving monkeys, provided by my collaborators at the MPI for Biological Cybernetics in Tübingen. Within V4, analysis with multivariate autoregressive models yields new insights into the patterns of directed information flow between neural populations at the level of the local field potential, which are most prominent in the theta frequency band, and moreover shows that these interaction patterns are a rather local phenomenon.
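The idea behind estimating directed information flow from autoregressive fits can be illustrated with a minimal sketch. The following implements a generic time-domain Granger-causality measure based on a bivariate autoregressive model; it is illustrative only, not the exact (frequency-resolved) estimator used in the thesis, and the function name, parameters, and toy data are all invented for this example.

```python
import numpy as np

def granger_influence(x, y, order=5):
    """Directed influence x -> y: log ratio of the residual variance of
    predicting y from its own past alone (restricted model) to predicting
    y from the past of both signals (full model). Positive values mean
    the past of x improves the prediction of y."""
    T = len(y)
    Y = y[order:]
    # Lagged design matrices for the restricted and full AR models.
    own = np.column_stack([y[order - k:T - k] for k in range(1, order + 1)])
    both = np.column_stack(
        [y[order - k:T - k] for k in range(1, order + 1)]
        + [x[order - k:T - k] for k in range(1, order + 1)]
    )

    def resid_var(X):
        beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
        return np.var(Y - X @ beta)

    return float(np.log(resid_var(own) / resid_var(both)))

# Toy data: x drives y with a one-step delay, but not the other way round.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(1, 2000):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.standard_normal()

print(granger_influence(x, y))  # large: strong influence x -> y
print(granger_influence(y, x))  # near zero: no influence y -> x
```

Because the full model nests the restricted one, the measure is non-negative; the asymmetry between the two directions is what reveals the directed interaction.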
Between V4 and lPF, results from wavelet-based methods for phase synchronization analysis suggest that the synchronization of oscillatory activity in the theta range between these distant cortical sites likely provides the basis for the coordination of spiking activity in both areas during the memory phase of the task.

The second part extends and modifies previous results from the field of reservoir computing and provides experimental evidence that a rate-based recurrent neural network with trained readout units can learn to produce coherent patterns of activity and memory traces, and to carry out memory-dependent computations, by employing a purely local reward-modulated Hebbian learning rule. In contrast to the traditionally used fully supervised methods, learning in the proposed model is based solely on correlations between presynaptic activity and postsynaptic noise perturbations, modulated by a global binary signal that informs the system whether its overall performance has recently improved. In this way, the present results provide a new perspective on the emergence of complex computations through learning in biological neural systems.
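The essence of such a reward-modulated Hebbian rule can be caricatured in a few lines. The sketch below is an illustrative reduction, assuming a fixed random "reservoir" state matrix and a single linear readout: the weight update is purely local (presynaptic activity times the postsynaptic noise perturbation), gated by a binary signal that marks whether the error has dropped below its recent running average. All names, sizes, and parameters are invented for this example and do not reproduce the exact model from the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 10, 200
R = rng.standard_normal((T, N))      # fixed presynaptic (reservoir) activity
w_true = rng.standard_normal(N)
target = R @ w_true                  # target readout trajectory

w = np.zeros(N)                      # readout weights to be learned
eta, sigma = 0.05, 0.5               # learning rate, exploration noise
initial_err = np.mean((R @ w - target) ** 2)
err_avg = None
for epoch in range(3000):
    noise = sigma * rng.standard_normal(T)   # postsynaptic perturbation
    out = R @ w + noise
    err = np.mean((out - target) ** 2)
    if err_avg is None:
        err_avg = err
    # Global binary modulator: 1 if performance beat the recent average.
    M = 1.0 if err < err_avg else 0.0
    # Local Hebbian update: presynaptic activity x postsynaptic noise.
    w += eta * M * (noise[:, None] * R).mean(axis=0)
    err_avg = 0.8 * err_avg + 0.2 * err      # running average of the error

final_err = np.mean((R @ w - target) ** 2)
print(initial_err, final_err)
```

Perturbations that happen to improve performance are "locked in" by the gated update, so the weights drift toward the target readout without any explicit error gradient reaching the synapses; this slow, noise-driven descent is exactly what distinguishes such rules from fully supervised training.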