Local subspace classifiers: linear and nonlinear approaches
Hakan Cevikalp, Diane Larlus, Matthijs Douze and Frederic Jurie
In: IEEE Workshop on Machine Learning for Signal Processing, Greece (2007).
The K-local hyperplane distance nearest neighbor (HKNN) algorithm is a local classification method that builds nonlinear decision surfaces directly in the original sample space by using local linear manifolds. Although the HKNN method has been successfully applied to several classification tasks, it cannot employ distance metrics other than the Euclidean distance, which is a major limitation of the method. In this paper we formulate the HKNN method in terms of subspaces. The advantages of this subspace formulation are two-fold: first, it enables us to propose a variant of the HKNN algorithm, the Local Discriminative Common Vector (LDCV) method, which is better suited to classification tasks in which the classes have similar intra-class variations; second, both the HKNN method and the proposed method can be extended to the nonlinear case based on subspace concepts. As a result of this nonlinearization, a wide variety of distance functions can be utilized in these local classifiers. We tested the proposed methods on several classification tasks. Experimental results show that the proposed methods yield results comparable to or better than those of the Support Vector Machine (SVM) classifier and its local counterpart SVM-KNN.
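To make the baseline concrete, the following is a minimal sketch of the HKNN idea described above, not the authors' implementation: for a query point, each class's K nearest training samples span a local linear manifold (affine hull), and the query is assigned to the class whose manifold lies closest in Euclidean distance. Function names and the least-squares projection are illustrative choices.

```python
import numpy as np

def hyperplane_distance(x, neighbors):
    # neighbors: (K, d) array of K same-class training points.
    # Their affine hull is {n0 + V a}; V holds directions from the
    # first neighbor to the others.
    n0 = neighbors[0]
    V = (neighbors[1:] - n0).T                      # (d, K-1) basis
    # Least-squares projection of (x - n0) onto span(V).
    a, *_ = np.linalg.lstsq(V, x - n0, rcond=None)
    residual = (x - n0) - V @ a
    return np.linalg.norm(residual)

def hknn_predict(x, X_train, y_train, K=3):
    # Assign x to the class with the nearest local hyperplane.
    dists = {}
    for c in np.unique(y_train):
        Xc = X_train[y_train == c]
        # K nearest neighbors of x within class c (Euclidean metric,
        # the only choice available in the original linear scheme).
        idx = np.argsort(np.linalg.norm(Xc - x, axis=1))[:K]
        dists[c] = hyperplane_distance(x, Xc[idx])
    return min(dists, key=dists.get)
```

Note that the Euclidean metric appears twice, in the neighbor search and in the point-to-manifold distance, which is precisely the restriction the paper's subspace formulation is designed to lift.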