PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

The Kernel recursive least squares algorithm
Yaakov Engel, Shie Mannor and Ron Meir
IEEE Transactions on Signal Processing, Volume 52, Number 8, pp. 2275-2285, 2004.


We present a non-linear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS (KRLS) algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean-squared-error regressor. Sparsity of the solution is achieved by a sequential sparsification process that admits a new input sample into the kernel representation only if its feature-space image cannot be sufficiently well approximated by combining the images of previously admitted samples. This sparsification procedure is crucial to the operation of KRLS, as it allows the algorithm to operate on-line and effectively regularizes its solutions. A theoretical analysis of the sparsification method reveals its close affinity to kernel PCA, and a data-dependent loss bound is presented, quantifying the generalization performance of the KRLS algorithm. We demonstrate the performance and scaling properties of KRLS and compare it to a state-of-the-art Support Vector Regression algorithm, using both synthetic and real data. We additionally test KRLS on two signal processing problems in which the use of traditional least-squares methods is commonplace: time series prediction and channel equalization.
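The sequential sparsification step described in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a Gaussian kernel and a hypothetical approximation threshold `nu`, and it tests each arriving sample for approximate linear dependence on the current dictionary in feature space, admitting it only when the residual exceeds the threshold.

```python
import numpy as np

def gauss_kernel(x, y, sigma=1.0):
    # Gaussian (RBF) Mercer kernel; sigma is an assumed hyperparameter.
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def sparsify(samples, nu=1e-3, sigma=1.0):
    """Sequentially admit samples whose feature-space image cannot be
    approximated (within tolerance nu) by combinations of the images
    of previously admitted samples."""
    dictionary = [samples[0]]
    for x in samples[1:]:
        # Kernel matrix over the current dictionary.
        K = np.array([[gauss_kernel(d1, d2, sigma) for d2 in dictionary]
                      for d1 in dictionary])
        # Kernel evaluations between the dictionary and the new sample.
        k = np.array([gauss_kernel(d, x, sigma) for d in dictionary])
        # Coefficients of the best feature-space approximation of x.
        a = np.linalg.solve(K, k)
        # Squared residual of that approximation in feature space.
        delta = gauss_kernel(x, x, sigma) - k @ a
        if delta > nu:
            # x is approximately linearly independent: admit it.
            dictionary.append(x)
    return dictionary
```

A duplicate of an already-admitted sample yields a zero residual and is rejected, while a sample far from the dictionary (relative to `sigma`) yields a residual near one and is admitted; the dictionary size therefore grows only with the effective dimensionality of the data in feature space.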

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics; Theory & Algorithms
ID Code: 909
Deposited By: Ron Meir
Deposited On: 06 January 2005