PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Concave Gaussian Variational Approximations for Inference in Large-Scale Bayesian Linear Models
Edward Challis and David Barber
In: AISTATS 2011, 11-13 April 2011, Fort Lauderdale.


Two popular approaches to forming principled bounds in approximate Bayesian inference are local variational methods and minimal Kullback-Leibler divergence methods. For a large class of models, we explicitly relate the two approaches, showing that the local variational method is equivalent to a weakened form of Kullback-Leibler Gaussian approximation. This gives a strong motivation to develop efficient methods for KL minimisation. An important and previously unproven property of the KL variational Gaussian bound is that it is a concave function in the parameters of the Gaussian for log-concave sites. This observation, along with compact concave parameterisations of the covariance, enables us to develop fast, scalable optimisation procedures to obtain lower bounds on the marginal likelihood in large-scale Bayesian linear models.
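The concavity property above can be checked numerically. The sketch below (not the authors' code; the logistic-regression setup, prior N(0, I), and all variable names are illustrative assumptions) evaluates the KL variational Gaussian lower bound for a Bayesian linear model with logistic sites, using q(w) = N(m, CC') with a positive-diagonal Cholesky factor C, and verifies the midpoint inequality that joint concavity in (m, C) implies.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 3, 20
X = rng.normal(size=(n, d))          # design matrix (illustrative data)
y = np.sign(rng.normal(size=n))      # +/-1 labels

# Probabilists' Gauss-Hermite nodes/weights for 1-D Gaussian expectations.
nodes, weights = np.polynomial.hermite_e.hermegauss(30)
weights = weights / np.sqrt(2 * np.pi)

def log_sigmoid(z):
    # log sigma(z), a log-concave site function
    return -np.logaddexp(0.0, -z)

def bound(m, C):
    """KL variational Gaussian lower bound on log Z with prior N(0, I),
    q(w) = N(m, C C^T), C lower-triangular with positive diagonal."""
    S = C @ C.T
    mu = (X @ m) * y                                  # site means y_n x_n^T m
    sd = np.sqrt(np.einsum('nd,de,ne->n', X, S, X))   # site std devs
    # E_q[log sigma(y_n x_n^T w)] via 1-D quadrature over the site marginal
    E_sites = log_sigmoid(mu[:, None] + sd[:, None] * nodes[None, :]) @ weights
    # KL(N(m, S) || N(0, I)); log det S = 2 * sum(log diag(C))
    kl = 0.5 * (np.trace(S) + m @ m - d) - np.log(np.diag(C)).sum()
    return E_sites.sum() - kl

def rand_chol():
    # Random Cholesky factor with positive diagonal (the concave domain)
    C = np.tril(rng.normal(size=(d, d)))
    np.fill_diagonal(C, np.abs(np.diag(C)) + 0.1)
    return C

# Concavity check: bound at the midpoint dominates the chord average.
m1, m2 = rng.normal(size=d), rng.normal(size=d)
C1, C2 = rand_chol(), rand_chol()
mid = bound(0.5 * (m1 + m2), 0.5 * (C1 + C2))
chord = 0.5 * (bound(m1, C1) + bound(m2, C2))
assert mid >= chord - 1e-8
```

The Cholesky parameterisation keeps the covariance positive definite by construction while preserving concavity of the bound, which is what makes standard gradient-based optimisation viable at scale.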

EPrint Type: Conference or Workshop Item (Paper)
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 7915
Deposited By: David Barber
Deposited On: 17 March 2011