
Rényi Divergence and Majorization
Tim van Erven and Peter Harremoës
IEEE International Symposium on Information Theory (ISIT), pp. 1335-1339, 2010.

Abstract

Rényi divergence is related to Rényi entropy much like information divergence (also called Kullback-Leibler divergence or relative entropy) is related to Shannon’s entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as information divergence. We review the most important properties of Rényi divergence, including its relation to some other distances. We show how Rényi divergence appears when the theory of majorization is generalized from the finite to the continuous setting. Finally, Rényi divergence plays a role in analyzing the number of binary questions required to guess the values of a sequence of random variables.
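For orientation, a minimal sketch of the standard definition for discrete distributions (this is standard background, not code from the paper; the function names and example distributions are illustrative): the Rényi divergence of order alpha != 1 is D_alpha(P||Q) = (1/(alpha-1)) log sum_i p_i^alpha q_i^(1-alpha), which recovers the information (Kullback-Leibler) divergence in the limit alpha -> 1.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) in nats, for order alpha != 1,
    given discrete distributions p and q with matching, positive support."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.log(np.sum(p**alpha * q**(1.0 - alpha))) / (alpha - 1.0)

def kl_divergence(p, q):
    """Information (Kullback-Leibler) divergence: the alpha -> 1 limit."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    return np.sum(p * np.log(p / q))

# Illustrative example distributions (not from the paper).
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]
print(renyi_divergence(p, q, 0.5))    # order 1/2, related to the Hellinger distance
print(renyi_divergence(p, q, 2.0))    # order 2, related to the chi-squared divergence
print(renyi_divergence(p, q, 0.999))  # approaches the KL divergence as alpha -> 1
print(kl_divergence(p, q))
```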

EPrint Type: Article
Subjects: Computational, Information-Theoretic Learning with Statistics
ID Code: 7242
Deposited By: Tim van Erven
Deposited On: 14 March 2011