
Learning locally minimax optimal Bayesian networks
Tomi Silander, Teemu Roos and Petri Myllymäki
International Journal of Approximate Reasoning, Vol. 51, No. 5, pp. 544–557, 2010. ISSN 0888-613X.

Abstract

We consider the problem of learning Bayesian network models in a non-informative setting, where the only available information is a set of observational data and no background knowledge is available. The problem can be divided into two subtasks: learning the structure of the network (a set of independence relations), and learning the parameters of the model (which fix the probability distribution from the set of all distributions consistent with the chosen structure). Few theoretical frameworks handle both of these problems in a consistent manner; the Bayesian framework is an exception. In this paper we propose an alternative, information-theoretic framework that sidesteps some of the technical problems facing the Bayesian approach. The framework is based on the minimax optimal normalized maximum likelihood (NML) distribution, which is motivated by the minimum description length (MDL) principle. The resulting model selection criterion is consistent, and it provides a way to construct highly predictive Bayesian network models. Our empirical tests show that the proposed method compares favorably with alternative approaches in both model selection and prediction tasks.
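The abstract does not spell out the criterion; for reference, the NML distribution on which it is based has the standard (Shtarkov) form, where \hat{\theta}(x^n) denotes the maximum likelihood parameters fitted to a data sequence x^n of length n:

    P_{NML}(x^n) = \frac{P(x^n \mid \hat{\theta}(x^n))}{\sum_{y^n} P(y^n \mid \hat{\theta}(y^n))}

The normalizing sum in the denominator ranges over all data sequences of the same length and is known as the parametric complexity of the model class; it is this normalization that makes the NML code length minimax optimal with respect to worst-case regret, which is the sense of "locally minimax optimal" invoked in the title.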

EPrint Type: Article
Subjects: Computational, Information-Theoretic Learning with Statistics; Theory & Algorithms
ID Code: 7284
Deposited By: Teemu Roos
Deposited On: 16 March 2011