PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Information Consistency of Nonparametric Gaussian Process Methods
Matthias Seeger, Sham Kakade, and Dean P. Foster
IEEE Transactions on Information Theory, 2008.

Abstract

Bayesian nonparametric models are widely and successfully used for statistical prediction. While posterior consistency properties are well studied \cite{Diaconis:86,Clarke:90,Barron:98a,Barron:99,Shen:01} in quite general settings, the known results rely on abstract concepts such as metric entropy and come with subtle conditions that are hard to validate and unintuitive when applied to concrete models. Furthermore, convergence rates are difficult to obtain. By focusing on the concept of information consistency \cite{Barron:98a} for Bayesian Gaussian process models, we obtain consistency results and convergence rates via a regret bound on cumulative log loss. Our result depends strongly on the covariance function of the prior process, thereby giving a novel interpretation to penalization with reproducing kernel Hilbert space norms and to commonly used covariance function classes and their parameters. The proof of our main result requires only elementary convexity arguments. We use a theorem of Widom~\cite{Widom:63} to obtain precise convergence rates for several covariance functions widely used in practice.
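
To illustrate the flavor of such a statement, here is a hedged sketch for the Gaussian-noise regression case; the notation ($\mathbf{K}_n$ for the kernel matrix of the prior covariance function at the first $n$ inputs, $\sigma^2$ for the noise variance, $f$ for an arbitrary element of the associated reproducing kernel Hilbert space $\mathcal{H}$) and the exact form are illustrative rather than a verbatim restatement of the paper's theorem. A cumulative log-loss regret bound of roughly the following shape is obtained, with the log-determinant term capturing the dependence on the covariance function:
\[
\sum_{t=1}^{n}\Bigl(-\log P_{\text{Bayes}}\bigl(y_t \mid \mathbf{y}_{<t}, \mathbf{x}_{\le t}\bigr) + \log P\bigl(y_t \mid f(\mathbf{x}_t)\bigr)\Bigr)
\;\le\; \tfrac{1}{2}\,\|f\|_{\mathcal{H}}^{2} \;+\; \tfrac{1}{2}\,\log\bigl|\mathbf{I} + \sigma^{-2}\mathbf{K}_n\bigr| .
\]
Roughly speaking, a theorem of Widom~\cite{Widom:63} on the asymptotic eigenvalue decay of the kernel operator can then be used to bound the growth of such a log-determinant term for specific covariance functions, which is how covariance-dependent convergence rates arise.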

EPrint Type: Article
Additional Information: Available at http://www.kyb.tuebingen.mpg.de/bs/people/seeger/
Subjects: Theory & Algorithms
ID Code: 2700
Deposited By: Matthias Seeger
Deposited On: 19 December 2007