
Variational Information Maximization in Gaussian Channels
Felix Agakov and David Barber
(2004) Technical Report. University of Edinburgh, Edinburgh, UK.

Abstract

Recently, we introduced a simple variational lower bound on mutual information that resolves some of the difficulties in applying information theory to machine learning. Here we study a specific application to Gaussian channels. It is well known that PCA may be viewed as the solution to maximizing information transmission between a high-dimensional vector x and its low-dimensional representation y. However, such results rest on the assumption that the sources x are Gaussian. Here we show that our mutual information bound, applied in this setting, yields the PCA solutions without the Gaussian assumption. Furthermore, it generalizes naturally to an objective function for Kernel PCA, enabling the principled selection of kernel parameters.
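
The bound referred to is presumably the variational lower bound from the authors' earlier work on the IM algorithm (Barber and Agakov, 2003): for any variational decoder q(x|y),

    I(x, y) = H(x) - H(x|y) \ge H(x) + \langle \ln q(x|y) \rangle_{p(x,y)},

with equality when q(x|y) equals the true posterior p(x|y). Since the source entropy H(x) does not depend on the channel, maximizing the bound over the encoder and the decoder q reduces to maximizing \langle \ln q(x|y) \rangle_{p(x,y)}.

The following minimal numerical sketch (not the authors' code; all names such as recon_error, W_pca, and W_rand are illustrative) shows the PCA claim in this form: with a linear encoder y = Wx and a linear-Gaussian decoder q(x|y) = N(x; Uy, \sigma^2 I), maximizing \langle \ln q(x|y) \rangle over U and \sigma^2 amounts to minimizing the mean squared reconstruction error, and the PCA subspace attains the minimum even when x is non-Gaussian.

    # Minimal sketch: the PCA encoder maximizes the variational bound
    # (equivalently, minimizes reconstruction MSE) for a linear-Gaussian
    # decoder, even for non-Gaussian sources. Illustrative only.
    import numpy as np

    rng = np.random.default_rng(0)
    n, d, k = 5000, 10, 2

    # Deliberately non-Gaussian, anisotropic sources.
    x = rng.laplace(size=(n, d)) * np.linspace(3.0, 0.5, d)
    x -= x.mean(axis=0)

    def recon_error(W):
        """MSE of the optimal linear decoder for the encoder y = W x."""
        y = x @ W.T
        U, *_ = np.linalg.lstsq(y, x, rcond=None)  # optimal decoder weights
        return np.mean((x - y @ U) ** 2)

    # PCA encoder: top-k eigenvectors of the sample covariance.
    evals, evecs = np.linalg.eigh(np.cov(x.T))
    W_pca = evecs[:, ::-1][:, :k].T

    # Random orthonormal rank-k encoder for comparison.
    W_rand = np.linalg.qr(rng.standard_normal((d, k)))[0].T

    print("PCA    MSE:", recon_error(W_pca))   # smaller error,
    print("random MSE:", recon_error(W_rand))  # i.e. larger bound value

Here the PCA encoder yields the smaller reconstruction error and hence the larger value of the bound, consistent with the claim that the bound recovers PCA without assuming Gaussian sources.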

EPrint Type: Monograph (Technical Report)
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 462
Deposited By: Felix Agakov
Deposited On: 23 December 2004