Training Invariant Support Vector Machines using Selective Sampling
In this chapter we present the combination of two approaches for building a very large SVM. The first method proposes a strategy for handling invariances in SVMs, based on the well-known idea that small deformations of an example should not change its class. Deformed samples are selected or discarded during learning, hence the name selective sampling. The second approach is LASVM, an efficient online algorithm (each training point is seen only once) that also relies on selective sampling. We present state-of-the-art results obtained on a handwritten digit recognition problem with 8 million points, using a single processor. This work also demonstrates that online SVMs can effectively handle very large databases.
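To make the two ideas concrete, the following is a minimal sketch, not the chapter's actual algorithm: a Pegasos-style online hinge-loss SGD stands in for LASVM, random jitter stands in for the paper's image deformations, and a candidate example is selected only when it violates the margin (otherwise it is discarded). All names (`deform`, `train_selective`, `tau`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def deform(x, scale=0.01):
    # Hypothetical deformation: small random jitter standing in for the
    # image deformations; the class is assumed unchanged by the jitter.
    return x + scale * rng.standard_normal(x.shape)

def train_selective(X, y, epochs=5, lam=0.01, tau=1.0):
    """Online hinge-loss SGD (a stand-in for LASVM).
    Each example and a deformed copy are offered as candidates; a
    candidate is *selected* only if it violates the margin
    (y * f(x) < tau), otherwise it is discarded."""
    w = np.zeros(X.shape[1])
    t = 0
    for _ in range(epochs):
        for x, label in zip(X, y):
            for cand in (x, deform(x)):
                t += 1
                eta = 1.0 / (lam * t)
                if label * cand.dot(w) < tau:   # selective sampling gate
                    w = (1 - eta * lam) * w + eta * label * cand
                else:                           # candidate discarded:
                    w = (1 - eta * lam) * w     # regularization step only
    return w

# Toy two-class, linearly separable data in place of handwritten digits.
X = np.vstack([rng.normal(2, 1, (50, 2)), rng.normal(-2, 1, (50, 2))])
y = np.array([1] * 50 + [-1] * 50)
w = train_selective(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

The gate is the essential point: most candidates (original or deformed) are quickly discarded once the margin is large enough, so the cost of augmenting the training set with deformations stays manageable.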