PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Kolmogorov Complexity and Information Theory, with an interpretation in terms of questions and answers
Peter D. Grünwald and Paul M. B. Vitányi
Journal of Logic, Language, and Information, Volume 12, Number 4, pp. 497–529, 2003.

Abstract

We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: Shannon entropy, Kolmogorov complexity, Shannon mutual information, and Kolmogorov ("algorithmic") mutual information. We explain how universal coding may be viewed as a middle ground between the two theories. We consider Shannon's rate-distortion theory, which quantifies useful (in a certain sense) information. We use the communication of information as our guiding motif, and we explain how it relates to sequential question-answer sessions.
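A brief illustration (not from the paper) of the contrast between the two central quantities: Shannon entropy is an exactly computable function of a probability distribution, whereas the Kolmogorov complexity of an individual string is uncomputable and can only be upper-bounded, for instance by the output length of a real lossless compressor such as zlib. The function names below are our own.

```python
import math
import zlib

def shannon_entropy(probs):
    """Shannon entropy H(P) = -sum_i p_i * log2(p_i) of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin carries exactly 1 bit of entropy per outcome.
print(shannon_entropy([0.5, 0.5]))  # -> 1.0

def complexity_upper_bound(data: bytes) -> int:
    """Length in bits of a zlib-compressed encoding of `data`.

    Any lossless compressor yields an upper bound (up to an additive
    constant for the decompressor) on the Kolmogorov complexity K(data);
    K itself is uncomputable, so bounds like this are the best one can do
    effectively."""
    return 8 * len(zlib.compress(data, 9))

# A highly regular string compresses far below its raw length of 8000 bits,
# reflecting its low algorithmic information content.
regular = b"ab" * 500
print(complexity_upper_bound(regular))
```

The asymmetry shown here (exact computation for entropy, a mere upper bound for complexity) is one concrete face of the difference between the two theories that the paper examines.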

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Information Retrieval & Textual Information Access
ID Code: 813
Deposited By: Paul Vitányi
Deposited On: 01 January 2005