
Entropy concentration and the empirical coding game
Peter Grünwald
Statistica Neerlandica Volume 62, Number 3, pp. 374-392, 2008.

Abstract

We give a characterization of Maximum Entropy/Minimum Relative Entropy inference by providing two `strong entropy concentration' theorems. These theorems unify and generalize Jaynes' `concentration phenomenon' and Van Campenhout and Cover's `conditional limit theorem'. The theorems characterize exactly in what sense a prior distribution Q conditioned on a given constraint and the distribution \tilde{P} minimizing D(P || Q) over all P satisfying the constraint are `close' to each other. We then apply our theorems to establish the relationship between entropy concentration and a game-theoretic characterization of Maximum Entropy inference due to Topsøe and others.
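
To make the central object of the abstract concrete, here is a minimal numerical sketch (not taken from the paper) of the minimum relative entropy projection it refers to: given a prior Q on a finite alphabet and a linear constraint E_P[f] = t, the minimizer \tilde{P} of D(P || Q) has the exponentially tilted form P_lambda(x) proportional to Q(x) exp(lambda f(x)), and lambda can be found by bisection on the moment condition. The dice example below (in the spirit of Jaynes' classic illustration), the function names, and the constraint value are illustrative assumptions, not the paper's own code.

```python
# Minimal sketch (assumptions noted above): minimum relative entropy
# projection ("I-projection") of a prior Q under a linear constraint
# E_P[f] = t, using the exponential-tilting form of the minimizer.

import numpy as np

def tilt(q, f, lam):
    """Tilted distribution P_lambda(x) proportional to q(x) * exp(lam * f(x))."""
    w = q * np.exp(lam * f)
    return w / w.sum()

def i_projection(q, f, t, lo=-50.0, hi=50.0, tol=1e-12):
    """Minimize D(P || Q) over all P with E_P[f] = t.

    E_{P_lambda}[f] is nondecreasing in lambda, so bisection suffices.
    """
    lam = 0.0
    for _ in range(200):
        lam = 0.5 * (lo + hi)
        m = float(np.dot(tilt(q, f, lam), f))  # E_{P_lambda}[f]
        if abs(m - t) < tol:
            break
        if m < t:
            lo = lam
        else:
            hi = lam
    return tilt(q, f, lam)

def kl(p, q):
    """Relative entropy D(P || Q) in nats."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

if __name__ == "__main__":
    # Illustrative dice example: fair-die prior Q, constrained to mean 4.5.
    q = np.full(6, 1.0 / 6.0)
    f = np.arange(1, 7, dtype=float)
    p_tilde = i_projection(q, f, t=4.5)
    print("tilde P :", np.round(p_tilde, 4))
    print("E[f]    :", round(float(np.dot(p_tilde, f)), 4))
    print("D(P||Q) :", round(kl(p_tilde, q), 4))
```

The concentration theorems of the paper describe in what sense the prior Q, conditioned on empirical data satisfying such a constraint, concentrates around this \tilde{P}; the sketch only computes \tilde{P} itself.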

PDF (the document is based on an earlier technical report and differs somewhat from the published version)
EPrint Type: Article
Subjects: Computational, Information-Theoretic Learning with Statistics; Theory & Algorithms
ID Code: 4591
Deposited By: Peter Grünwald
Deposited On: 13 March 2009