PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Dimensionality reduction and generalization
Sofia Mosci, Lorenzo Rosasco and Alessandro Verri
In: ICML 2007 (2007).


In this paper we investigate the regularization property of Kernel Principal Component Analysis (KPCA) by studying its application to supervised learning problems as a preprocessing step. We show that performing KPCA and then ordinary least squares on the projected data, a procedure known as kernel principal component regression (KPCR), is equivalent to spectral cut-off regularization, the regularization parameter being exactly the number of principal components to keep. Using probabilistic estimates for integral operators, we prove error estimates for KPCR and provide a parameter choice procedure that allows us to prove consistency of the algorithm.
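The procedure described in the abstract, KPCA followed by ordinary least squares on the projected data, can be sketched in a few lines of numpy. This is an illustrative reconstruction, not code from the paper: the Gaussian kernel, the function names, and the choice of `k` (the number of components, playing the role of the regularization parameter) are all assumptions for the sake of the example.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """Gaussian kernel matrix between the rows of A and B (illustrative choice)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def kpcr_fit(X, y, k, gamma=1.0):
    """Kernel PCA on X, then ordinary least squares on the k projections.

    Keeping only the k leading eigen-directions acts as spectral cut-off
    regularization, with k as the regularization parameter.
    """
    n = X.shape[0]
    K = rbf_kernel(X, X, gamma)
    H = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    Kc = H @ K @ H                               # kernel centered in feature space
    w, V = np.linalg.eigh(Kc)                    # eigenvalues in ascending order
    idx = np.argsort(w)[::-1][:k]                # keep the k leading components
    w, V = w[idx], V[:, idx]
    alphas = V / np.sqrt(np.maximum(w, 1e-12))   # normalized expansion coefficients
    Z = Kc @ alphas                              # n x k projected training data
    beta, *_ = np.linalg.lstsq(Z, y - y.mean(), rcond=None)  # OLS on projections
    return {"X": X, "K": K, "alphas": alphas, "beta": beta,
            "y_mean": y.mean(), "gamma": gamma}

def kpcr_predict(model, Xnew):
    X, K = model["X"], model["K"]
    Kt = rbf_kernel(Xnew, X, model["gamma"])
    # center the test kernel consistently with the training kernel
    Ktc = (Kt - Kt.mean(axis=1, keepdims=True)
              - K.mean(axis=0)[None, :] + K.mean())
    Z = Ktc @ model["alphas"]
    return Z @ model["beta"] + model["y_mean"]
```

In this sketch the cut-off is hard: eigen-directions beyond the k-th are discarded entirely rather than shrunk, which is what distinguishes spectral cut-off from, e.g., Tikhonov regularization.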

EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
Learning/Statistics & Optimisation
Theory & Algorithms
ID Code: 3947
Deposited By: Lorenzo Rosasco
Deposited On: 25 February 2008