PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Margin-based discriminant dimensionality reduction for visual recognition
Hakan Cevikalp, William Triggs, Frederic Jurie and Robi Polikar
In: CVPR 2008, 24-26 June 2008, Anchorage, Alaska.


Nearest neighbour classifiers and related kernel methods often perform poorly in high dimensional problems because it is infeasible to include enough training samples to cover the class regions densely. In such cases, test samples often fall into gaps between training samples where the nearest neighbours are too distant to be good indicators of class membership. One solution is to project the data onto a discriminative lower dimensional subspace. We propose a gap-resistant nonparametric method for finding such subspaces: first the gaps are filled by building a convex model of the region spanned by each class -- we test the affine and convex hulls and the bounding disk of the class training samples -- then a set of highly discriminative directions is found by building and decomposing a scatter matrix of weighted displacement vectors from training examples to nearby rival class regions. The weights are chosen to focus attention on narrow margin cases while still allowing more diversity and hence more discriminability than the 1D linear Support Vector Machine (SVM) projection. Experimental results on several face and object recognition datasets show that the method finds effective projections, allowing simple classifiers such as nearest neighbours to work well in the low dimensional reduced space.
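The core projection step described above can be sketched in NumPy. This is a minimal illustration assuming the affine-hull class model and an exponential margin weighting; the function names, the weighting form `exp(-beta * ||d||)`, and the rank truncation are illustrative assumptions, not the paper's exact choices:

```python
import numpy as np

def affine_hull_projection(x, X):
    """Project point x onto the affine hull of the rows of X (mean + span of centred rows)."""
    mu = X.mean(axis=0)
    _, s, Vt = np.linalg.svd(X - mu, full_matrices=False)
    Vt = Vt[s > 1e-10 * max(s[0], 1e-30)]      # keep only numerically significant directions
    return mu + Vt.T @ (Vt @ (x - mu))

def discriminant_directions(X, y, n_dims, beta=1.0):
    """Return the top eigenvectors of a scatter matrix built from weighted
    displacement vectors from each training sample to rival-class affine hulls."""
    d = X.shape[1]
    S = np.zeros((d, d))
    classes = np.unique(y)
    for c in classes:
        for r in classes:
            if r == c:
                continue
            Xr = X[y == r]
            for x in X[y == c]:
                disp = x - affine_hull_projection(x, Xr)  # displacement to the rival hull
                w = np.exp(-beta * np.linalg.norm(disp))  # focus on narrow-margin cases
                S += w * np.outer(disp, disp)
    vals, vecs = np.linalg.eigh(S)             # eigenvalues in ascending order
    return vecs[:, ::-1][:, :n_dims]           # most discriminative directions first
```

With a projection matrix `W = discriminant_directions(X, y, k)`, a simple classifier such as nearest neighbours can then be run on the reduced features `X @ W`, as the abstract suggests.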

EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: UNSPECIFIED
Subjects: Machine Vision; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 4364
Deposited By: William Triggs
Deposited On: 13 March 2009