PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

An Auxiliary Variational Method
Felix Agakov and David Barber
(2004) Technical Report. University of Edinburgh, Edinburgh, UK.

Abstract

Variational methods have proved popular and effective for inference and learning in intractable graphical models. An attractive feature of approaches based on the Kullback-Leibler divergence is that they yield a rigorous lower bound on the normalization constant of undirected models. In this work we explore the idea of using auxiliary variables to improve on the lower bound of standard mean field methods. Our approach forms a more powerful class of approximations than any structured mean field technique. Furthermore, the existing lower bounds of variational mixture models can be seen as computationally expensive special cases of our method. A byproduct of our work is an efficient way to calculate, for any set of tractable distributions, a set of mixture coefficients that provably improves on a flat combination.
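For context, the standard KL-based bound mentioned above can be written as follows (a textbook sketch, not the report's auxiliary-variable construction): for an undirected model $p(x) = \tilde{p}(x)/Z$ with unnormalized potential $\tilde{p}(x)$, any tractable distribution $q(x)$ gives

```latex
\log Z
  = \log \sum_x \tilde{p}(x)
  \;\ge\; \sum_x q(x) \log \tilde{p}(x) \;-\; \sum_x q(x) \log q(x)
  \;=\; \langle \log \tilde{p}(x) \rangle_{q} + H(q),
```

with equality when $q = p$, since the gap is exactly $\mathrm{KL}(q \,\|\, p) \ge 0$. Mean field methods maximize this bound over factorized $q$; the report's contribution is to enlarge the family of tractable $q$ by introducing auxiliary variables.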

EPrint Type: Monograph (Technical Report)
Project Keyword: UNSPECIFIED
Subjects: Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 463
Deposited By: Felix Agakov
Deposited On: 23 December 2004