PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Perturbation Corrections in Approximate Inference: Mixture Modelling Applications
Ulrich Paquet, Ole Winther and Manfred Opper
Journal of Machine Learning Research, 10(Jun):1263-1304, 2009.

Abstract

Bayesian inference is intractable for many interesting models, making deterministic algorithms for approximate inference highly desirable. Unlike stochastic methods, which are exact in the limit, the accuracy of these deterministic approaches cannot easily be assessed. In this paper we show how low-order perturbation corrections to an expectation-consistent (EC) approximation can provide the necessary tools to improve inference accuracy, and to give an indication of the quality of the approximation without having to resort to Monte Carlo methods. Further comparisons are given with variational Bayes and with parallel tempering (PT) combined with thermodynamic integration on a Gaussian mixture model. To obtain practical results we further generalize PT to temper from arbitrary distributions, rather than from the prior as is customary in Bayesian inference.
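The generalized tempering scheme mentioned in the abstract can be illustrated on a toy problem. The sketch below (my own minimal illustration, not the paper's implementation) interpolates the log density between an arbitrary normalized base distribution q0 and an unnormalized target p, runs one Metropolis chain per temperature with occasional replica swaps, and estimates the target's log normalizer by thermodynamic integration, log Z = ∫₀¹ E_β[log p(x) − log q0(x)] dβ. All names and parameter choices here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_q0(x):
    # normalized base distribution: N(0, 3^2); in Bayesian PT this is
    # usually the prior, but any tractable distribution works
    return -0.5 * x**2 / 9.0 - 0.5 * np.log(2 * np.pi * 9.0)

def log_p(x):
    # unnormalized target: N(0, 1), so the true log Z is 0.5*log(2*pi)
    return -0.5 * x**2

betas = np.linspace(0.0, 1.0, 11)     # inverse-temperature ladder
K = len(betas)
x = rng.normal(0.0, 3.0, size=K)      # one chain per temperature

def log_pi(x, b):
    # tempered density interpolating base (b=0) and target (b=1)
    return (1 - b) * log_q0(x) + b * log_p(x)

n_iter = 20000
stats = np.zeros((K, n_iter))         # log p - log q0 per chain per sweep
for t in range(n_iter):
    # random-walk Metropolis update within each temperature
    prop = x + rng.normal(0.0, 1.5, size=K)
    accept = np.log(rng.random(K)) < log_pi(prop, betas) - log_pi(x, betas)
    x = np.where(accept, prop, x)
    # occasional replica-swap move between adjacent temperatures
    if t % 10 == 0:
        i = rng.integers(0, K - 1)
        dl = (log_pi(x[i], betas[i + 1]) + log_pi(x[i + 1], betas[i])
              - log_pi(x[i], betas[i]) - log_pi(x[i + 1], betas[i + 1]))
        if np.log(rng.random()) < dl:
            x[i], x[i + 1] = x[i + 1], x[i]
    stats[:, t] = log_p(x) - log_q0(x)

# thermodynamic integration by the trapezoid rule, discarding burn-in
means = stats[:, n_iter // 2:].mean(axis=1)
log_Z = np.sum(0.5 * (means[1:] + means[:-1]) * np.diff(betas))
print(log_Z)  # should be close to 0.5*log(2*pi) ≈ 0.919
```

Tempering toward q0 instead of the prior is what makes the scheme practical here: a base distribution chosen close to the posterior shortens the path the chains must bridge, so fewer temperatures suffice.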

EPrint Type: Article
Subjects: Theory & Algorithms
ID Code: 6814
Deposited By: Ole Winther
Deposited On: 08 March 2010