The Kernel Recursive Least Squares Algorithm
We present a non-linear kernel-based version of the Recursive Least Squares (RLS) algorithm. Our Kernel-RLS (KRLS) algorithm performs linear regression in the feature space induced by a Mercer kernel, and can therefore be used to recursively construct the minimum mean-squared-error regressor. Sparsity of the solution is achieved by a sequential sparsification process that admits a new input sample into the kernel representation only if its feature-space image cannot be sufficiently well approximated by combining the images of previously admitted samples. This sparsification procedure is crucial to the operation of KRLS: it allows the algorithm to operate online, and it effectively regularizes the solutions. A theoretical analysis of the sparsification method reveals its close affinity to kernel PCA, and a data-dependent loss bound is presented, quantifying the generalization performance of the KRLS algorithm. We demonstrate the performance and scaling properties of KRLS and compare it to a state-of-the-art Support Vector Regression algorithm, using both synthetic and real data. We additionally test KRLS on two signal processing problems in which the use of traditional least-squares methods is commonplace: time-series prediction and channel equalization.
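To make the sparsification criterion concrete, here is a minimal sketch of the admission test described above: a new sample is admitted into the dictionary only if its feature-space image cannot be well approximated by a linear combination of the images of previously admitted samples. The Gaussian kernel, its bandwidth `sigma`, and the tolerance `nu` are illustrative assumptions, not the paper's specific experimental settings.

```python
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    # Illustrative Mercer kernel (Gaussian/RBF); sigma is an assumed bandwidth.
    return np.exp(-np.sum((x - y) ** 2) / (2.0 * sigma ** 2))

def admission_test(dictionary, x_new, kernel=gaussian_kernel, nu=1e-3):
    """Admit x_new only if its feature-space image cannot be approximated
    (within tolerance nu) by combining the images of dictionary samples.

    Returns (admit, coeffs): admit is True if x_new should join the
    dictionary; coeffs are the best approximation coefficients otherwise.
    """
    if not dictionary:
        return True, None
    # Kernel matrix of the current dictionary and cross-kernel vector.
    K = np.array([[kernel(xi, xj) for xj in dictionary] for xi in dictionary])
    k = np.array([kernel(xi, x_new) for xi in dictionary])
    # Coefficients of the best feature-space approximation of x_new.
    a = np.linalg.solve(K, k)
    # Squared residual of that approximation in feature space.
    delta = kernel(x_new, x_new) - k @ a
    return bool(delta > nu), a
```

In an online loop, each arriving sample is run through this test; redundant samples update only the regression coefficients, keeping the kernel expansion sparse while the dictionary grows only on sufficiently novel inputs.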