
Bayesian Networks and Inner Product Spaces
Atsuyoshi Nakamura, Michael Schmitt, Niels Schmitt and Hans Simon
Proceedings of COLT 2004, pp. 518-533, 2004.

Abstract

In connection with two-label classification tasks over the Boolean domain, we consider the possibility of combining the key advantages of Bayesian networks and of kernel-based learning systems. This leads us to the basic question of whether the class of decision functions induced by a given Bayesian network can be represented within a low-dimensional inner product space. For Bayesian networks with an explicitly given (full or reduced) parameter collection, we show that the "natural" inner product space has the smallest possible dimension up to a factor of $2$ (and even up to an additive term of $1$ in many cases). For a slight modification of the so-called logistic autoregressive Bayesian network with $n$ nodes, we show that every sufficiently expressive inner product space has dimension at least $2^{n/4}$. The main technical contribution of our work is to uncover combinatorial and algebraic structures within Bayesian networks that allow known techniques for proving lower bounds on the dimension of inner product spaces to be brought into play.
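To make the notion of representation concrete, here is a brief sketch in our own notation, which may differ in detail from the paper's. A class of $\pm 1$-valued decision functions on $\{0,1\}^n$ is represented in a $d$-dimensional inner product space if there is a feature map $\phi : \{0,1\}^n \to \mathbb{R}^d$ such that every function in the class is the sign of a linear functional of $\phi$. For a Bayesian network over Boolean variables $x_1,\dots,x_n$ with parent sets $\mathrm{pa}(i)$, a decision function obtained by comparing two parameter settings $\theta^{(1)}, \theta^{(0)}$ of the network is the sign of a log-likelihood ratio that decomposes over nodes and parent configurations:
\[
  f(x) \;=\; \operatorname{sign}\!\left(\log\frac{P_{\theta^{(1)}}(x)}{P_{\theta^{(0)}}(x)}\right),
  \qquad
  \log P_{\theta^{(j)}}(x)
    \;=\; \sum_{i=1}^{n}\;\sum_{a\in\{0,1\}}\;\sum_{b\in\{0,1\}^{|\mathrm{pa}(i)|}}
      \mathbf{1}[x_i=a,\;x_{\mathrm{pa}(i)}=b]\,\log\theta^{(j)}_{i,a,b}.
\]
Since the log-likelihood ratio is linear in the indicator features, $f$ is a linear threshold function over the feature map $\phi$:
\[
  f(x) \;=\; \operatorname{sign}\bigl(\langle w,\phi(x)\rangle\bigr),
  \qquad
  \phi_{i,a,b}(x)=\mathbf{1}[x_i=a,\;x_{\mathrm{pa}(i)}=b],
  \qquad
  w_{i,a,b}=\log\frac{\theta^{(1)}_{i,a,b}}{\theta^{(0)}_{i,a,b}},
\]
so the dimension of this "natural" inner product space is $\sum_{i=1}^{n} 2^{|\mathrm{pa}(i)|+1}$, i.e. roughly the number of network parameters (redundant indicators can be dropped, reducing this by about a factor of $2$). The lower bounds in the paper show that, in the sense described in the abstract, no sufficiently expressive inner product space can do much better.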

EPrint Type: Article
Additional Information: A full version of this extended abstract appears in JMLR 6:1383-1403, 2005.
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 56
Deposited By: Hans Simon
Deposited On: 14 May 2004