Some improvements on NN-based classifiers in metric spaces
The nearest neighbour (NN) and k-nearest neighbour (k-NN) classification rules have been widely used in Pattern Recognition due to their simplicity and good behaviour. Exhaustive nearest neighbour search may become impractical when facing large training sets, high-dimensional data, or expensive dissimilarity measures. In recent years, many fast NN search algorithms have been developed to overcome these problems; most of them traverse a data structure (usually a tree), testing several candidate prototypes until the nearest neighbour is found. When these algorithms are extended to find the k nearest neighbours, classification time grows with k. In this paper we propose a new classification rule that reuses the prototypes selected as nearest-neighbour candidates during a 1-NN search. To illustrate the behaviour of this rule, several fast and widely known NN search algorithms have been extended with it, obtaining classification results similar to those of a k-NN (k > 1) classifier without the extra computational overhead. In addition, previous work on approximate NN search in vector spaces has been extended to algorithms suitable for general metric spaces and combined with the new classification rule.
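A minimal sketch of the idea of reusing search candidates follows. The tree structure (a simple vantage-point-style median split), the distance weighting `1/(1 + d)`, and all names here are illustrative assumptions, not the specific algorithms evaluated in the paper; the point is only that every prototype whose distance is computed during a branch-and-bound 1-NN search can be recorded and voted on at no extra distance-computation cost.

```python
from collections import Counter


def dist(a, b):
    """Euclidean distance; any metric obeying the triangle inequality works."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5


class Node:
    """Vantage-point-style metric tree node (illustrative structure)."""

    def __init__(self, points):
        # Each prototype is a (vector, label) pair; the first one is the pivot.
        self.pivot = points[0]
        rest = points[1:]
        self.radius, self.inner, self.outer = 0.0, None, None
        if rest:
            ds = [dist(p[0], self.pivot[0]) for p in rest]
            self.radius = sorted(ds)[len(ds) // 2]  # median split radius
            inner = [p for p, d in zip(rest, ds) if d <= self.radius]
            outer = [p for p, d in zip(rest, ds) if d > self.radius]
            self.inner = Node(inner) if inner else None
            self.outer = Node(outer) if outer else None


def search(node, q, best, candidates):
    """Branch-and-bound 1-NN search that records every tested prototype."""
    if node is None:
        return best
    d = dist(q, node.pivot[0])
    candidates.append((d, node.pivot[1]))  # candidate seen "for free"
    if d < best[0]:
        best = (d, node.pivot[1])
    # Visit the nearer side first; prune the far side when the triangle
    # inequality guarantees it cannot contain anything closer than `best`.
    near, far = ((node.inner, node.outer) if d <= node.radius
                 else (node.outer, node.inner))
    best = search(near, q, best, candidates)
    if abs(d - node.radius) < best[0]:
        best = search(far, q, best, candidates)
    return best


def classify(root, q):
    """Return both the plain 1-NN label and a candidate-vote label."""
    candidates = []
    best = search(root, q, (float("inf"), None), candidates)
    # Sketch of the proposed rule: a distance-weighted vote among the
    # candidates tested during the search (weighting is an assumption).
    vote = Counter()
    for d, label in candidates:
        vote[label] += 1.0 / (1.0 + d)
    return best[1], vote.most_common(1)[0][0]
```

Because the vote uses only distances that the 1-NN search already had to compute, the rule adds no dissimilarity evaluations beyond plain 1-NN, in contrast to a true k-NN search whose cost grows with k.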