PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Generalised Pinsker Inequalities
Mark Reid and Bob Williamson
In: Conference on Learning Theory 2009, 18-21 June 2009, Montreal.

Abstract

We generalise the classical Pinsker inequality which relates variational divergence to Kullback-Leibler divergence in two ways: we consider arbitrary f-divergences in place of KL divergence, and we assume knowledge of a sequence of values of generalised variational divergences. We then develop a best possible inequality for this doubly generalised situation. Specialising our result to the classical case provides a new and tight explicit bound relating KL to variational divergence (solving a problem posed by Vajda some 40 years ago). The solution relies on exploiting a connection between divergences and the Bayes risk of a learning problem via an integral representation.
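As a concrete reminder of the classical inequality being generalised, the following sketch numerically checks Pinsker's bound KL(p||q) >= V(p,q)^2 / 2, with the variational divergence defined as V(p,q) = sum_i |p_i - q_i|. The example distributions p and q are chosen arbitrarily for illustration.

```python
import math

def variational(p, q):
    # Variational divergence V(p, q) = sum_i |p_i - q_i|, taking values in [0, 2].
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

def kl(p, q):
    # Kullback-Leibler divergence in nats; assumes q_i > 0 wherever p_i > 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Illustrative pair of distributions (not from the paper).
p = (0.5, 0.5)
q = (0.9, 0.1)

v = variational(p, q)   # 0.8
bound = v * v / 2       # classical Pinsker lower bound on KL: 0.32
print(kl(p, q) >= bound)  # True
```

The paper's contribution is the tight analogue of this bound when KL is replaced by an arbitrary f-divergence and when several generalised variational divergence values are known simultaneously.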

EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: UNSPECIFIED
Subjects: Learning/Statistics & Optimisation
ID Code: 8978
Deposited By: Bob Williamson
Deposited On: 21 February 2012