PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Finiteness of Redundancy, Regret, Shtarkov Sums, and Jeffreys Integrals in Exponential Families
Peter Grünwald and Peter Harremoës
In: International Symposium on Information Theory, 28 June - 3 July 2009, Seoul, South Korea.

Abstract

The normalized maximum likelihood (NML) distribution plays a fundamental role in the MDL approach to statistical inference. It is only defined for statistical families with a finite Shtarkov sum. Here we characterize, for exponential families, when the Shtarkov sum is finite. This turns out to be the case if and only if the minimax redundancy is finite, thus extending the reach of our results beyond the individual sequence setting. In practice, the NML/Shtarkov distribution is often approximated by the Bayesian marginal distribution based on Jeffreys’ prior. One serious problem is that in many cases Jeffreys’ prior cannot be normalized. It has been conjectured that Jeffreys’ prior cannot be normalized in exactly the cases where the Shtarkov sum is infinite, i.e. when the minimax redundancy and regret are infinite. We show that the conjecture is true for a large class of exponential families but that there exist examples where the conjecture is violated.
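As a concrete illustration (not taken from the paper), the Shtarkov sum can be computed exactly for the Bernoulli model, one of the simplest exponential families, where it is always finite. For n binary observations it equals the sum over all sequences of the maximized likelihood, which by grouping sequences with the same count k reduces to a sum of n+1 terms; the minimax regret is its logarithm. The function name and setup below are illustrative choices, not notation from the paper:

```python
from math import comb, log

def shtarkov_sum(n):
    """Shtarkov sum (NML normalizer) for the Bernoulli model with n observations.

    Sequences with k ones share the maximum likelihood (k/n)^k * ((n-k)/n)^(n-k),
    and there are C(n, k) of them, so the sum over all 2^n sequences collapses
    to n + 1 terms.  (Python evaluates 0**0 as 1, which handles k = 0 and k = n.)
    """
    total = 0.0
    for k in range(n + 1):
        p = k / n  # maximum likelihood estimate for a sequence with k ones
        total += comb(n, k) * p**k * (1 - p) ** (n - k)
    return total

# For n = 2 the four sequences 00, 01, 10, 11 have maximized likelihoods
# 1, 1/4, 1/4, 1 respectively, so the Shtarkov sum is 2.5.
print(shtarkov_sum(2))       # → 2.5
print(log(shtarkov_sum(2)))  # minimax regret in nats
```

For exponential families with an unbounded sufficient statistic (e.g. Poisson or geometric), the analogous sum over outcomes can diverge, which is exactly the situation the paper characterizes.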

EPrint Type: Conference or Workshop Item (Talk)
Subjects: Computational, Information-Theoretic Learning with Statistics
ID Code: 5057
Deposited By: Peter Harremoës
Deposited On: 24 March 2009