Bayesian networks: a better-than-frequentist approach to parametrization, and a more accurate structural complexity measure than the number of parameters
Sylvain Gelly and Olivier Teytaud
The problem of calibrating relations from examples is a classical problem in learning theory. It has been studied in particular in the theory of empirical processes (which provides asymptotic results) and through statistical learning theory. The application of learning theory to Bayesian networks is still incomplete, and we propose a contribution, especially through the use of covering numbers. We derive multiple corollaries, among them a non-frequentist approach to parameter learning and a score that incorporates a measure of structural entropy not previously taken into account. We then investigate the algorithmic aspects of our theoretical solution, based on BFGS and adaptive refining of the gradient computation. Empirical results show the relevance of both the statistical results and the algorithmic solution.
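As a minimal illustration of BFGS-based parameter learning in a Bayesian network (a toy sketch, not the authors' implementation; the adaptive refining of the gradient computation and the covering-number machinery are omitted), the following fits the conditional probability table of a two-node network A → B by minimizing the negative log-likelihood with BFGS. The data-generating probabilities here are hypothetical values chosen for the example.

```python
import numpy as np
from scipy.optimize import minimize

# Toy data for a two-node network A -> B, with hypothetical
# ground-truth conditionals P(B=1|A=0)=0.3 and P(B=1|A=1)=0.8.
rng = np.random.default_rng(0)
A = rng.integers(0, 2, size=2000)
true_p = np.where(A == 1, 0.8, 0.3)
B = (rng.random(A.size) < true_p).astype(int)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def neg_log_likelihood(theta):
    # theta[a] parametrizes P(B=1|A=a) through a sigmoid,
    # which keeps probabilities in (0,1) without constraints.
    p = sigmoid(theta[A])
    return -np.sum(B * np.log(p) + (1 - B) * np.log(1 - p))

# BFGS minimization of the negative log-likelihood.
res = minimize(neg_log_likelihood, x0=np.zeros(2), method="BFGS")
p_hat = sigmoid(res.x)
print(p_hat)  # close to the empirical conditional frequencies
```

The sigmoid reparametrization is one common way to turn the constrained probability estimation into an unconstrained problem suitable for quasi-Newton methods such as BFGS.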