Support Vector Regression for the Simultaneous Learning of a Multivariate Function and its Derivatives
In this paper, the problem of simultaneously approximating a function and its derivatives is formulated within the Support Vector Machine (SVM) framework. First, the problem is solved for a one-dimensional input space using the ε-insensitive loss function and introducing additional constraints on the approximation of the derivative. Then, we extend the method to multi-dimensional input spaces by means of a multidimensional regression algorithm. In both cases, to solve the resulting regression estimation problem, we derive an iterative re-weighted least squares (IRWLS) procedure that is fast for moderate-sized problems. The proposed method shows that incorporating derivative information significantly improves the reconstruction of the function.
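To illustrate the core idea of the abstract, the following is a minimal sketch of fitting a kernel expansion to both function values and derivative values. It is an assumption-laden simplification: it uses a plain squared loss and ridge regularization instead of the paper's ε-insensitive loss and IRWLS procedure, an RBF kernel, and hypothetical helper names (`fit_with_derivatives`, `rbf_kernel`); it is meant only to show how derivative constraints enter the linear system, not to reproduce the authors' algorithm.

```python
import numpy as np

def rbf_kernel(x, centers, sigma):
    # k(x, x') = exp(-(x - x')^2 / (2 sigma^2))
    d = x[:, None] - centers[None, :]
    return np.exp(-d**2 / (2 * sigma**2))

def rbf_kernel_dx(x, centers, sigma):
    # derivative of k(x, x') with respect to x
    d = x[:, None] - centers[None, :]
    return -d / sigma**2 * np.exp(-d**2 / (2 * sigma**2))

def fit_with_derivatives(x, y, dy, sigma=0.5, lam=1e-6):
    """Fit f(x) = sum_j alpha_j k(x, x_j) to values y AND derivatives dy.

    Stacking the kernel matrix and its derivative matrix turns the
    derivative constraints into extra rows of one least-squares system.
    """
    K = rbf_kernel(x, x, sigma)        # rows: f(x_i) = y_i
    Kd = rbf_kernel_dx(x, x, sigma)    # rows: f'(x_i) = dy_i
    A = np.vstack([K, Kd])
    b = np.concatenate([y, dy])
    # ridge-regularized normal equations (squared loss, not ε-insensitive)
    alpha = np.linalg.solve(A.T @ A + lam * np.eye(len(x)), A.T @ b)
    return alpha

# usage: reconstruct sin(x) from 15 samples of sin and its derivative cos
x = np.linspace(0, 2 * np.pi, 15)
alpha = fit_with_derivatives(x, np.sin(x), np.cos(x))
xt = np.linspace(0, 2 * np.pi, 100)
pred = rbf_kernel(xt, x, 0.5) @ alpha
err = np.max(np.abs(pred - np.sin(xt)))
```

In this simplified setting, the derivative samples double the number of equations constraining the same coefficient vector, which is the mechanism by which derivative information improves the reconstruction.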