Kolmogorov Complexity and Information Theory \\ {\small \em with an interpretation in terms of questions and answers}

\begin{abstract}
We compare the elementary theories of Shannon information and Kolmogorov complexity, the extent to which they have a common purpose, and where they are fundamentally different. We discuss and relate the basic notions of both theories: {\em Shannon entropy, Kolmogorov complexity, Shannon mutual information\/} and {\em Kolmogorov (`algorithmic') mutual information}. We explain how {\em universal coding\/} may be viewed as a middle ground between the two theories. We consider Shannon's rate distortion theory, which quantifies {\em useful\/} (in a certain sense) information. We use the communication of information as our guiding motif, and we explain how it relates to sequential question-answer sessions.
\end{abstract}