PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Variational Information Maximization for Neural Coding
Felix Agakov and David Barber
In: ICONIP 2004, 22-25 Nov 2004, Calcutta, India.

Abstract

Mutual Information (MI) is a long-studied measure of coding efficiency, and many attempts have been made to apply it to population coding. However, exact maximization of MI for population codes is computationally intractable, and most previous studies redefine the criterion in terms of approximations. Recently we described the properties of a simple lower bound on MI [2]. Here we describe the bound-optimization procedure for learning population codes in a simple point-neuron model. We compare our approach with other techniques that maximize approximations of MI, focusing on a comparison with the Fisher Information criterion.
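
For reference, the lower bound referred to in [2] is the variational bound obtained by replacing the exact posterior p(x|y) with an arbitrary decoding distribution q(x|y). The sketch below uses our own notation (x for the encoded source/stimulus, y for the neural response) and is illustrative rather than the paper's exact formulation:

% Variational (Barber-Agakov) lower bound on mutual information:
\begin{align*}
I(X;Y) &= H(X) - H(X|Y) \\
       &= H(X) + \big\langle \ln p(x|y) \big\rangle_{p(x,y)} \\
       &\ge H(X) + \big\langle \ln q(x|y) \big\rangle_{p(x,y)}
          \quad \text{for any decoder } q(x|y),
\end{align*}
% with equality iff q(x|y) = p(x|y), since the gap is the
% non-negative average KL divergence <KL(p(x|y) || q(x|y))>_{p(y)}.

Bound optimization then proceeds, roughly, by alternating between tightening the bound with respect to the decoder q and maximizing it with respect to the encoding parameters.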

EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics;
          Learning/Statistics & Optimisation;
          Theory & Algorithms
ID Code: 457
Deposited By: Felix Agakov
Deposited On: 28 December 2004