PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Hybrid Variational/Gibbs Inference in Topic Models
Max Welling, Yee Whye Teh and Bert Kappen
In: UAI 2008, 10-12 Jul 2008, Helsinki, Finland.

Abstract

Variational Bayesian inference and (collapsed) Gibbs sampling are two important classes of inference algorithms for Bayesian networks. Both have their advantages and disadvantages: collapsed Gibbs sampling is unbiased but inefficient for large count values, and it requires averaging over many samples to reduce variance. On the other hand, variational Bayesian inference is efficient and accurate for large count values but suffers from bias for small counts. We propose a hybrid algorithm that combines the best of both worlds: it samples very small counts and applies variational updates to large counts. This hybridization is shown to significantly improve test-set perplexity relative to variational inference at no computational cost.
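
The abstract only outlines the hybrid scheme; the paper itself specifies the exact split and update rules. The sketch below is a minimal illustration of the general idea for LDA, assuming a collapsed, CVB0-style soft update for the variational part and a simple per-word corpus-frequency threshold to decide which tokens are sampled. The function name, its signature, and the `threshold` parameter are hypothetical and not taken from the paper.

    import numpy as np

    def hybrid_sweep(tokens, q, n_dk, n_kw, n_k, word_freq,
                     alpha=0.1, beta=0.01, threshold=10, rng=None):
        """One sweep of a hybrid Gibbs/variational update for LDA (illustrative only).

        tokens    : list of (doc_id, word_id) pairs, one entry per token
        q         : (num_tokens, K) per-token topic responsibilities; one-hot rows
                    for sampled tokens, soft rows for variational tokens
        n_dk      : (D, K) document-topic counts (float; sums of q over a doc's tokens)
        n_kw      : (K, V) topic-word counts (float)
        n_k       : (K,)   topic totals (float)
        word_freq : (V,)   corpus frequency of each word type
        """
        rng = rng or np.random.default_rng()
        K = n_k.shape[0]
        V = n_kw.shape[1]
        for i, (d, w) in enumerate(tokens):
            # Remove this token's current (hard or soft) contribution from the counts.
            n_dk[d] -= q[i]
            n_kw[:, w] -= q[i]
            n_k -= q[i]

            # Collapsed conditional over topics given the remaining counts.
            cond = (n_dk[d] + alpha) * (n_kw[:, w] + beta) / (n_k + V * beta)
            cond /= cond.sum()

            if word_freq[w] < threshold:
                # Rare word, small counts: draw a hard Gibbs sample (one-hot row).
                k = rng.choice(K, p=cond)
                q[i] = np.eye(K)[k]
            else:
                # Frequent word, large counts: deterministic variational-style update.
                q[i] = cond

            # Add the updated contribution back into the shared counts.
            n_dk[d] += q[i]
            n_kw[:, w] += q[i]
            n_k += q[i]
        return q, n_dk, n_kw, n_k

Because sampled and variational tokens contribute to the same count arrays (one-hot versus fractional rows of q), the two update types interact through the shared statistics; this mirrors the spirit of the hybrid scheme but is only a sketch under the stated assumptions.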

EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: UNSPECIFIED
Subjects: Learning/Statistics & Optimisation; Theory & Algorithms; Information Retrieval & Textual Information Access
ID Code: 4696
Deposited By: Yee Whye Teh
Deposited On: 24 March 2009