Splines with non-positive kernels
Stéphane Canu, Cheng Soon Ong and Xavier Mary
Nonparametric regression methods fall into two main families: smoothing spline methods, which require positive kernels, and nonparametric kernel regression methods, which allow the use of non-positive kernels such as the Epanechnikov kernel.
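To illustrate the second family, here is a minimal sketch of Nadaraya-Watson kernel regression with the Epanechnikov kernel, a standard construction not taken from this paper; the data, bandwidth, and function names are illustrative assumptions.

```python
import numpy as np

def epanechnikov(u):
    """Epanechnikov kernel: non-negative as a weight function, but not
    positive semi-definite as a Mercer kernel (illustrative example)."""
    return 0.75 * (1.0 - u**2) * (np.abs(u) <= 1.0)

def nadaraya_watson(x_train, y_train, x_query, bandwidth):
    """Nadaraya-Watson estimate at each query point (1-D inputs)."""
    # Pairwise scaled differences, shape (n_query, n_train)
    u = (x_query[:, None] - x_train[None, :]) / bandwidth
    w = epanechnikov(u)
    # Weighted average of targets; guard against empty windows
    s = w.sum(axis=1)
    s[s == 0.0] = 1.0
    return (w * y_train).sum(axis=1) / s

# Toy usage: recover a noisy sine curve (hypothetical data)
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 2.0 * np.pi, 200))
y = np.sin(x) + 0.1 * rng.standard_normal(200)
xq = np.linspace(0.5, 5.5, 50)
yq = nadaraya_watson(x, y, xq, bandwidth=0.5)
```

Note that no positivity condition on the kernel is needed here: the estimator is a locally weighted average, not an expansion in a reproducing kernel.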
We propose a generalization of the smoothing spline method to kernels that are still symmetric but not positive semi-definite (such kernels are called indefinite). The general relationship between smoothing splines, Reproducing Kernel Hilbert Spaces (RKHS) and positive kernels no longer holds for indefinite kernels. Instead, the splines are associated with functional spaces called Reproducing Kernel Krein Spaces (RKKS), endowed with an indefinite inner product and thus not directly associated with a norm. Smoothing splines in RKKS retain many of the interesting properties of splines in RKHS, such as orthogonality, projection, and a representer theorem.
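For context, the representer theorem in the familiar RKHS setting states that the minimizer of a regularized empirical risk over data $(x_i, y_i)_{i=1}^n$ admits a finite kernel expansion; the abstract asserts that an analogue holds in RKKS. The formula below is the standard RKHS form, written here for orientation, not quoted from this paper:

```latex
f^\star(x) \;=\; \sum_{i=1}^{n} c_i \, k(x, x_i), \qquad c_i \in \mathbb{R},
```

so that the infinite-dimensional optimization over functions reduces to a finite-dimensional problem in the coefficients $c_1, \dots, c_n$.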
We show that smoothing splines can be defined in RKKS as the regularized solution of the interpolation problem. Since no norm is available in an RKKS, Tikhonov regularization cannot be applied. Instead, we propose the use of conjugate gradient type iterative methods, with early stopping as the regularization mechanism. We collect several iterative algorithms that can be used to solve the optimization problems associated with learning in indefinite spaces. Preliminary experiments with indefinite kernels for spline smoothing demonstrate the computational efficiency of this approach.
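The early-stopping idea can be sketched as follows, under illustrative assumptions: we build a symmetric (generally indefinite) Epanechnikov kernel matrix and solve the interpolation system with a CG-type Krylov solver, capping the iteration count instead of adding a Tikhonov penalty. This is a minimal sketch, not the paper's algorithm; the setup, bandwidth, and use of MINRES (a CG-type method that handles symmetric indefinite systems) are assumptions.

```python
import numpy as np
from scipy.sparse.linalg import minres

# Hypothetical setup: a symmetric, generally indefinite kernel matrix
# built from the Epanechnikov kernel (the paper's spline construction differs).
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 1.0, 100))
U = (x[:, None] - x[None, :]) / 0.3
K = 0.75 * (1.0 - U**2) * (np.abs(U) <= 1.0)   # symmetric kernel matrix
y = np.sin(4 * np.pi * x) + 0.2 * rng.standard_normal(100)

# Early stopping as regularization: run the Krylov solver for a fixed,
# small number of iterations rather than minimizing a penalized norm,
# which an RKKS does not provide.
residuals = {}
for iters in (3, 10, 50):
    c, _ = minres(K, y, maxiter=iters)
    residuals[iters] = np.linalg.norm(K @ c - y)
# Fewer iterations -> rougher solve of K c = y -> smoother, more
# regularized fitted values K @ c.
```

The iteration count plays the role that the smoothing parameter plays in penalized splines: small counts give heavily regularized fits, and the residual of the linear system shrinks as iterations increase.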