PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Learning Locally Minimax Optimal Bayesian Networks
Tomi Silander, Teemu Roos and Petri Myllymäki
International Journal of Approximate Reasoning, Volume 51 (2010), Number 5 (June), pp. 544-557.

Abstract

We consider the problem of learning Bayesian network models in a non-informative setting, where the only available information is a set of observational data, and no background knowledge is available. The problem can be divided into two different subtasks: learning the structure of the network (a set of independence relations), and learning the parameters of the model (that fix the probability distribution from the set of all distributions consistent with the chosen structure). There are not many theoretical frameworks that consistently handle both these problems together, the Bayesian framework being an exception. In this paper we propose an alternative, information-theoretic framework which sidesteps some of the technical problems facing the Bayesian approach. The framework is based on the minimax optimal normalized maximum likelihood (NML) distribution, which is motivated by the minimum description length (MDL) principle. The resulting model selection criterion is consistent, and it provides a way to construct highly predictive Bayesian network models. Our empirical tests show that the proposed method compares favorably with alternative approaches in both model selection and prediction tasks.
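To illustrate the NML distribution that underlies the proposed criterion, the following sketch computes it for the simplest possible case: a single binary (Bernoulli) variable, i.e. a one-node network. The NML probability of a data sequence is its maximized likelihood divided by a normalizing constant summing the maximized likelihood over all sequences of the same length; function names here are illustrative, not from the paper.

```python
from math import comb

def bernoulli_nml_complexity(n):
    # Normalizer C_n: the maximized likelihood P(x^n | theta_hat(x^n))
    # summed over all 2^n binary sequences of length n. Grouping the
    # sequences by their number of ones k gives this closed form.
    total = 0.0
    for k in range(n + 1):
        p = k / n  # maximum likelihood parameter for k ones out of n
        total += comb(n, k) * (p ** k) * ((1 - p) ** (n - k))
    return total

def bernoulli_nml(k, n):
    # NML probability of one particular binary sequence containing
    # k ones out of n observations: maximized likelihood over C_n.
    p = k / n
    max_likelihood = (p ** k) * ((1 - p) ** (n - k))
    return max_likelihood / bernoulli_nml_complexity(n)
```

Because every sequence's maximized likelihood is divided by the same constant, the NML probabilities of all length-n sequences sum to one, which is what makes NML a proper (and minimax optimal) distribution; the paper's contribution is making the analogous computation tractable for full Bayesian network structures.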

EPrint Type: Article
Subjects: Computational, Information-Theoretic Learning with Statistics
Learning/Statistics & Optimisation
Theory & Algorithms
ID Code: 7929
Deposited By: Petri Myllymäki
Deposited On: 17 March 2011