PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Self-organizing mixture models
Jakob Verbeek, Nikos Vlassis and Ben Kröse
Neurocomputing, Volume 63, 2005.

Abstract

We present an expectation-maximization (EM) algorithm that yields topology preserving maps of data based on probabilistic mixture models. Our approach is applicable to any mixture model for which a standard EM algorithm exists. Compared to other mixture model approaches to self-organizing maps (SOMs), the function our algorithm maximizes has a clear interpretation: it sums the data log-likelihood and a penalty term that enforces self-organization. Our approach allows principled handling of missing data and learning of mixtures of SOMs. We present example applications illustrating our approach for continuous, discrete, and mixed discrete and continuous data.
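The abstract describes EM for a mixture model whose objective adds a self-organization penalty to the data log-likelihood. A common way to realize this (and the idea behind SOM-like mixture training) is to smooth the E-step responsibilities with a neighborhood kernel defined over a grid of components, so that nearby components on the grid are pushed to model similar data. The sketch below is an illustrative reconstruction under these assumptions, not the paper's exact algorithm: an isotropic Gaussian mixture on a 1-D grid, with the function names `neighborhood` and `som_em` invented for this example.

```python
import numpy as np

def neighborhood(K, sigma):
    # Gaussian neighborhood kernel over a 1-D grid of K units,
    # rows normalized so smoothed responsibilities still sum to 1.
    d2 = np.subtract.outer(np.arange(K), np.arange(K)) ** 2
    H = np.exp(-d2 / (2.0 * sigma ** 2))
    return H / H.sum(axis=1, keepdims=True)

def som_em(X, K=10, sigma=2.0, iters=50, seed=0):
    # EM for an isotropic Gaussian mixture whose E-step responsibilities
    # are smoothed along the grid -- the smoothing plays the role of the
    # penalty term that enforces topological ordering of the components.
    rng = np.random.default_rng(seed)
    N, D = X.shape
    mu = X[rng.choice(N, size=K, replace=False)]   # component means
    var = np.full(K, X.var())                      # per-component variances
    H = neighborhood(K, sigma)
    for _ in range(iters):
        # E-step: standard Gaussian responsibilities ...
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)       # (N, K)
        logp = -0.5 * sq / var - 0.5 * D * np.log(var)
        r = np.exp(logp - logp.max(axis=1, keepdims=True))
        r /= r.sum(axis=1, keepdims=True)
        # ... then smoothed over the grid, coupling neighboring units.
        q = r @ H
        # M-step: ordinary weighted updates, as in standard EM.
        w = q.sum(axis=0) + 1e-9
        mu = (q.T @ X) / w[:, None]
        sq = ((X[:, None, :] - mu[None]) ** 2).sum(-1)
        var = (q * sq).sum(axis=0) / (w * D) + 1e-6
    return mu
```

With the smoothing in place, components that are adjacent on the grid end up with nearby means, which is the "self-organization" the penalized objective is meant to produce; setting `sigma` to a very small value recovers plain EM for the mixture.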

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Learning/Statistics & Optimisation
ID Code: 4223
Deposited By: Jakob Verbeek
Deposited On: 05 December 2008