PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Inner product spaces for Bayesian networks
Atsuyoshi Nakamura, Michael Schmitt, Niels Schmitt and Hans Ulrich Simon
Journal of Machine Learning Research, 2004. ISSN 1532-4435


We consider the idea of combining the key advantages of Bayesian networks and of kernel-based learning systems. In connection with two-label classification tasks over the Boolean domain, we study the question of whether the class of decision functions induced by a given Bayesian network can be represented within a low-dimensional inner product space. For Bayesian networks with an explicitly given (full or reduced) parameter collection, we establish tight bounds on the dimension of the "natural" inner product space. Further, we consider a variant of the logistic autoregressive Bayesian network and show that every sufficiently expressive inner product space must have dimension at least $2^{\Omega(n)}$, where $n$ is the number of network nodes. As the main technical contribution, this work reveals combinatorial and algebraic structures within Bayesian networks such that known techniques for proving lower bounds on the dimension of inner product spaces can be brought into play.
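For intuition (a minimal sketch, not taken from the paper): the simplest Bayesian network of this kind, naive Bayes over $n$ Boolean attributes, induces decision functions that fit in an $(n+1)$-dimensional inner product space, because its log-odds is affine in the inputs. The sketch below uses illustrative, hand-picked parameter values and verifies the identity on the whole Boolean cube.

```python
import math
import itertools

# Illustrative (assumed) naive Bayes parameters over n = 3 Boolean variables:
# class priors P(y) and per-attribute Bernoulli parameters P(x_i = 1 | y).
prior = {0: 0.4, 1: 0.6}
p = {0: [0.2, 0.7, 0.5], 1: [0.9, 0.3, 0.6]}
n = 3

def log_odds(x):
    """Exact log-odds log P(y=1|x) - log P(y=0|x) under naive Bayes."""
    s = math.log(prior[1]) - math.log(prior[0])
    for i in range(n):
        s += x[i] * (math.log(p[1][i]) - math.log(p[0][i]))
        s += (1 - x[i]) * (math.log(1 - p[1][i]) - math.log(1 - p[0][i]))
    return s

# The same function written as an inner product <w, phi(x)> with the
# (n+1)-dimensional feature map phi(x) = (1, x_1, ..., x_n).
w0 = (math.log(prior[1]) - math.log(prior[0])
      + sum(math.log(1 - p[1][i]) - math.log(1 - p[0][i]) for i in range(n)))
w = [w0] + [
    (math.log(p[1][i]) - math.log(p[0][i]))
    - (math.log(1 - p[1][i]) - math.log(1 - p[0][i]))
    for i in range(n)
]

def inner_product_form(x):
    phi = [1] + list(x)
    return sum(wi * fi for wi, fi in zip(w, phi))

# The two agree at every point of the Boolean cube, so the decision
# function sign(log_odds) is realized in dimension n + 1.
for x in itertools.product([0, 1], repeat=n):
    assert abs(log_odds(x) - inner_product_form(x)) < 1e-12
```

The point of the paper's lower bound is that this kind of low-dimensional representation is impossible in general: for the logistic autoregressive variant, any faithful inner product space needs dimension exponential in $n$.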

EPrint Type: Article
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 268
Deposited By: Michael Schmitt
Deposited On: 23 November 2004