PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Statistical Properties of Kernel Principal Component Analysis
Gilles Blanchard, Olivier Bousquet and Laurent Zwald
Machine Learning, Volume 66, Number 2-3, pp. 259-294, 2007.

Abstract

The main goal of this paper is to prove inequalities on the reconstruction error for Kernel Principal Component Analysis (KPCA). With respect to previous work on this topic, our contribution is twofold: (1) we give bounds that explicitly take into account the empirical centering step of the algorithm, and (2) we show that a "localized" approach yields more accurate bounds. In particular, we show faster rates of convergence towards the minimum reconstruction error; more precisely, we prove that the convergence rate can typically be faster than $n^{-1/2}$. Additionally, we obtain a relative bound on the error. A secondary goal, for which we present similar contributions, is to obtain convergence bounds for the partial sums of the largest or smallest eigenvalues of the Gram matrix towards the eigenvalues of the corresponding kernel operator. These quantities are naturally linked to the KPCA procedure; furthermore, these results can have applications to the study of various other kernel algorithms. The results are presented in a functional-analytic framework, which is suited to dealing rigorously with reproducing kernel Hilbert spaces of infinite dimension.
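As a rough illustration of the quantities the abstract refers to, the sketch below computes the spectrum of the empirically centered Gram matrix and the empirical KPCA reconstruction error with d components, which equals the sum of the n - d smallest eigenvalues of the centered Gram matrix divided by n. The Gaussian kernel, the toy data, and all function names are illustrative assumptions, not taken from the paper.

    import numpy as np

    def centered_gram(X, kernel):
        """Empirically centered Gram matrix K_c = H K H, H = I - (1/n) 11^T.
        This mirrors the empirical centering step discussed in the paper;
        the helper name is ours."""
        n = len(X)
        K = np.array([[kernel(x, y) for y in X] for x in X])
        H = np.eye(n) - np.ones((n, n)) / n
        return H @ K @ H

    def kpca_quantities(X, kernel, d):
        """Return the eigenvalues of K_c / n in descending order (empirical
        counterparts of the kernel operator's eigenvalues) and the empirical
        reconstruction error of KPCA with d components: the sum of the
        n - d smallest of these eigenvalues."""
        Kc = centered_gram(X, kernel)
        evals = np.linalg.eigvalsh(Kc)[::-1] / len(X)  # descending order
        return evals, evals[d:].sum()

    # Toy usage: Gaussian RBF kernel on random data (illustrative only).
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    rbf = lambda x, y: np.exp(-np.sum((x - y) ** 2) / 2.0)
    evals, err = kpca_quantities(X, rbf, d=10)
    print(evals[:5], err)

The paper's results concern how fast such empirical quantities (partial sums of the largest or smallest eigenvalues, and the resulting reconstruction error) converge to their population counterparts as n grows.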

EPrint Type: Article
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 1336
Deposited By: Gilles Blanchard
Deposited On: 28 November 2005