PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Variational Approximations in Bayesian Model Selection for Finite Mixture Distributions
Clare Anne McGrory and Mike Titterington
Computational Statistics and Data Analysis, 2005.

Abstract

Variational methods, which have become popular in the neural computing/machine learning literature, are applied to the Bayesian analysis of mixtures of Gaussian distributions. It is also shown how the Deviance Information Criterion, DIC, can be extended to these types of model by exploiting the use of variational approximations. The use of variational methods for model selection and the calculation of a DIC are illustrated with real and simulated data. Using the variational approximation, one can simultaneously estimate component parameters and the model complexity. It turns out that, if one starts off with a large number of components, superfluous components are eliminated as the method converges to a solution, thereby leading to an automatic choice of model complexity, the appropriateness of which is reflected in the DIC values.
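The component-elimination behaviour described in the abstract can be sketched with scikit-learn's variational implementation of a Bayesian Gaussian mixture (an assumption for illustration: this is the library's `BayesianGaussianMixture`, not the authors' own code or their DIC computation). Starting the fit with more components than the data require, the variational posterior drives the weights of superfluous components towards zero:

```python
# Sketch, not the paper's code: variational Bayes fit of a Gaussian
# mixture, deliberately over-specified in the number of components.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Simulated data: a two-component Gaussian mixture in one dimension.
X = np.concatenate([rng.normal(-3.0, 1.0, 300),
                    rng.normal(3.0, 1.0, 300)]).reshape(-1, 1)

# Fit with far more components (6) than the data need; a small
# weight-concentration prior encourages emptying unneeded components.
bgm = BayesianGaussianMixture(n_components=6,
                              weight_concentration_prior=1e-2,
                              max_iter=500,
                              random_state=0).fit(X)

# Components whose posterior weight remains non-negligible give an
# automatic estimate of model complexity, as in the abstract.
effective = int((bgm.weights_ > 0.05).sum())
print("posterior weights:", np.round(bgm.weights_, 3))
print("effective components:", effective)
```

The 0.05 threshold is an illustrative cut-off; in practice one inspects the full set of posterior weights rather than applying a hard rule.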

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 2176
Deposited By: Mike Titterington
Deposited On: 17 August 2006