PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Fast discriminative component analysis for comparing examples
Jaakko Peltonen, Jacob Goldberger and Samuel Kaski
In: Learning to Compare Examples - NIPS 2006 Workshop, 8 Dec 2006, Whistler, Canada.


Two recent methods, Neighborhood Components Analysis (NCA) and Informative Discriminant Analysis (IDA), search for a class-discriminative subspace or discriminative components of data, which is equivalent to learning distance metrics invariant to changes perpendicular to the subspace. Constraining metrics to a subspace is useful both for regularizing the metrics and for dimensionality reduction. We introduce a variant of NCA and IDA that reduces their computational complexity from quadratic to linear in the number of data samples, by replacing their purely non-parametric class density estimates with semiparametric mixtures of Gaussians. In terms of accuracy, the method is shown to perform as well as NCA on benchmark data sets, outperforming several popular linear dimensionality reduction methods.
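The source of the complexity reduction can be illustrated with a minimal sketch. The function names and the use of isotropic Gaussian components below are assumptions for illustration, not the paper's actual implementation: a purely non-parametric class density (as in NCA) sums a kernel over every sample of the class, costing O(n) per query and hence O(n^2) over the data set, whereas a mixture of K Gaussians per class sums over K components only, costing O(nK) overall, which is linear in n.

```python
import numpy as np

def kde_class_density(x, class_points, sigma=1.0):
    """Non-parametric estimate (hypothetical NCA-style kernel density):
    sums an isotropic Gaussian kernel over ALL samples of the class.
    Cost is O(n) per query point, so O(n^2) over the whole data set."""
    d = class_points - x
    return np.mean(np.exp(-np.sum(d * d, axis=1) / (2.0 * sigma ** 2)))

def gmm_class_density(x, means, weights, sigma=1.0):
    """Semiparametric estimate: a mixture of K Gaussian components
    (here with shared isotropic covariance, an illustrative assumption).
    Cost is O(K) per query point, so O(nK) over the data set."""
    d = means - x
    return np.sum(weights * np.exp(-np.sum(d * d, axis=1) / (2.0 * sigma ** 2)))

# Toy comparison: with one component per sample and uniform weights,
# the mixture reduces to the non-parametric estimate exactly.
x = np.array([0.0, 0.0])
pts = np.array([[1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
uniform = np.full(len(pts), 1.0 / len(pts))
print(np.isclose(kde_class_density(x, pts),
                 gmm_class_density(x, pts, uniform)))
```

With K fixed and much smaller than n, fitting a few components per class keeps the density model expressive while making the per-sample cost independent of the data set size.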

EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
          Learning/Statistics & Optimisation
          Theory & Algorithms
ID Code: 2564
Deposited By: Jaakko Peltonen
Deposited On: 22 November 2006