Locally Linear Support Vector Machines
Lubor Ladicky and Philip Torr
In: ICML 2011, 28 June – 2 July 2011, Washington.
Linear support vector machines (SVMs) have become popular for solving classification tasks due to their fast and simple online application to large-scale data sets. However, many problems are not linearly separable. For these problems kernel-based SVMs are often used, but unlike their linear variant they suffer from various drawbacks in terms of computational and memory efficiency. Their response can be represented only as a function of the set of support vectors, which has been experimentally shown to grow linearly with the size of the training set. In this paper we propose a novel locally linear SVM classifier with smooth decision boundary and bounded curvature. We show how the functions defining the classifier can be approximated using local codings, and show how this model can be optimized in an online fashion by performing stochastic gradient descent with the same convergence guarantees as the standard gradient descent method for linear SVMs. Our method achieves comparable performance to the state-of-the-art whilst being significantly faster than competing kernel SVMs. We generalise this model to locally finite-dimensional kernel SVMs.
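The core idea described in the abstract — a classifier whose weight vector varies smoothly over the input space, expressed through local coding coordinates over a set of anchor points and trained by stochastic gradient descent on the hinge loss — can be illustrated with a minimal sketch. This is an illustrative toy implementation, not the paper's actual method: the inverse-distance local coding, the Pegasos-style learning-rate schedule, and all names and hyperparameters here are assumptions chosen for clarity.

```python
import numpy as np

def local_coding(x, anchors, k=3):
    # Sparse local coordinates of x over its k nearest anchors,
    # using normalized inverse-distance weights (an illustrative
    # stand-in for the local codings referenced in the abstract).
    d = np.linalg.norm(anchors - x, axis=1)
    idx = np.argsort(d)[: min(k, len(anchors))]
    w = 1.0 / (d[idx] + 1e-8)
    gamma = np.zeros(len(anchors))
    gamma[idx] = w / w.sum()
    return gamma

class LocallyLinearSVM:
    """Toy locally linear SVM: one linear model (w_k, b_k) per anchor,
    blended by the local coding coordinates gamma(x)."""

    def __init__(self, anchors, dim, lam=1e-2):
        self.anchors = np.asarray(anchors, dtype=float)
        self.W = np.zeros((len(anchors), dim))  # per-anchor weight vectors
        self.b = np.zeros(len(anchors))         # per-anchor biases
        self.lam = lam                          # L2 regularization strength

    def decision(self, x):
        # f(x) = sum_k gamma_k(x) * (w_k . x + b_k)
        g = local_coding(x, self.anchors)
        return g @ (self.W @ x + self.b)

    def fit_sgd(self, X, y, epochs=20, eta0=1.0, seed=0):
        # Online hinge-loss SGD with a decaying step size.
        rng = np.random.default_rng(seed)
        t = 0
        for _ in range(epochs):
            for i in rng.permutation(len(X)):
                t += 1
                eta = eta0 / (1.0 + self.lam * eta0 * t)
                x, yi = X[i], y[i]
                g = local_coding(x, self.anchors)
                margin = yi * self.decision(x)
                self.W *= 1.0 - eta * self.lam    # shrink (regularizer)
                if margin < 1.0:                   # hinge-loss subgradient step
                    self.W += eta * yi * np.outer(g, x)
                    self.b += eta * yi * g
        return self
```

Because each anchor carries its own linear model, the blended classifier can separate data that no single hyperplane can (e.g. an XOR-style layout with anchors at the four cluster centers), while prediction cost stays linear in the number of active anchors rather than growing with the training set.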