Finite dimensional projection for classification and statistical learning
Gilles Blanchard and Laurent Zwald
IEEE Transactions on Information Theory
A new method for the binary classification problem is studied. It
relies on empirical minimization of the hinge loss over an
increasing sequence of finite-dimensional spaces. A suitable
dimension is then selected by minimizing a regularized empirical
loss, where the regularization term is proportional to the dimension.
An oracle-type inequality is established, which ensures
adequate convergence properties of the method.
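The procedure above can be sketched on toy data: fit a hinge-loss minimizer on the projection onto each of an increasing sequence of subspaces (here obtained by kernel PCA), then pick the dimension minimizing the empirical loss plus a penalty proportional to the dimension. This is a minimal illustration, not the authors' exact algorithm; the RBF bandwidth, the penalty constant `lambda_`, and the plain subgradient optimizer are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary classification data: two Gaussian blobs, labels in {-1, +1}.
n = 200
X = np.vstack([rng.normal(-1.0, 1.0, (n // 2, 2)),
               rng.normal(+1.0, 1.0, (n // 2, 2))])
y = np.concatenate([-np.ones(n // 2), np.ones(n // 2)])

# Kernel PCA: eigendecompose the centered RBF Gram matrix.
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-sq / 2.0)                       # assumed bandwidth
H = np.eye(n) - np.ones((n, n)) / n
evals, evecs = np.linalg.eigh(H @ K @ H)    # ascending order
evals, evecs = evals[::-1], evecs[:, ::-1]  # descending

def features(d):
    """Coordinates of the sample in the span of the top-d kernel PCs."""
    lam = np.clip(evals[:d], 1e-12, None)
    return evecs[:, :d] * np.sqrt(lam)

def fit_hinge(Z, y, steps=500, lr=0.1):
    """Subgradient descent on the empirical hinge loss (a sketch,
    not the paper's optimization procedure)."""
    w = np.zeros(Z.shape[1])
    b = 0.0
    for _ in range(steps):
        margin = y * (Z @ w + b)
        active = margin < 1.0               # points inside the margin
        gw = -(y[active, None] * Z[active]).sum(0) / len(y)
        gb = -y[active].sum() / len(y)
        w -= lr * gw
        b -= lr * gb
    loss = np.maximum(0.0, 1.0 - y * (Z @ w + b)).mean()
    return w, b, loss

# Model selection: penalized empirical loss, penalty proportional
# to the dimension (lambda_ is an assumed constant).
lambda_ = 0.01
scores = {}
for d in range(1, 21):
    _, _, loss = fit_hinge(features(d), y)
    scores[d] = loss + lambda_ * d
best_d = min(scores, key=scores.get)
print(best_d, scores[best_d])
```

The dimension-proportional penalty trades approximation power against complexity: larger subspaces lower the empirical hinge loss but pay a linear price, which is what the oracle inequality controls.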
We suggest selecting the sequence of subspaces by applying
kernel principal component analysis. In this case the asymptotic
convergence rate of the method can be better than what is known for
the support vector machine. Illustrative experiments are presented on
benchmark datasets where the practical results of the method are comparable to