PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

OP-KNN: Theory and Applications
Qi Yu, Yoan Miche, Antti Sorjamaa, Alberto Guillen, Amaury Lendasse and Eric Severin
Advances in Artificial Neural Systems 2010.

Abstract

This paper presents a methodology named Optimally Pruned K-Nearest Neighbors (OP-KNN), which has the advantage of competing with state-of-the-art methods while remaining fast. It builds a single-hidden-layer feedforward neural network using K-Nearest Neighbors as kernels to perform regression. Multiresponse Sparse Regression (MRSR) is used to rank each kth nearest neighbor, and finally Leave-One-Out estimation is used to select the optimal number of neighbors and to estimate the generalization performance. Since the computational time of this method is small, this paper presents a strategy using OP-KNN to perform variable selection, which is tested successfully on eight real-life data sets from different application fields. In summary, the most significant characteristic of this method is that it provides good performance and a comparatively simple model at extremely high learning speed.
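A minimal sketch of the pipeline the abstract describes, in Python with NumPy and scikit-learn, under simplifying assumptions: the MRSR ranking step is replaced by the natural neighbor ordering k = 1..K, and the helper names (build_knn_features, press_loo, op_knn) are hypothetical, not taken from the paper.

import numpy as np
from sklearn.neighbors import NearestNeighbors

def build_knn_features(X, y, K):
    # Hidden layer: column k holds the target value of each sample's
    # (k+1)-th nearest neighbour (the sample itself is excluded).
    nn = NearestNeighbors(n_neighbors=K + 1).fit(X)
    _, idx = nn.kneighbors(X)
    return y[idx[:, 1:]]          # shape (n_samples, K)

def press_loo(H, y):
    # Leave-One-Out MSE of the linear output layer, via the PRESS statistic.
    w, *_ = np.linalg.lstsq(H, y, rcond=None)
    hat_diag = np.einsum('ij,ji->i', H, np.linalg.pinv(H))
    residuals = (y - H @ w) / (1.0 - hat_diag)
    return np.mean(residuals ** 2)

def op_knn(X, y, K_max=20):
    # Keep the first k neighbour columns that minimise the LOO error
    # (in the paper the columns would first be ranked by MRSR).
    H_full = build_knn_features(X, y, K_max)
    loo_errors = [press_loo(H_full[:, :k], y) for k in range(1, K_max + 1)]
    best_k = int(np.argmin(loo_errors)) + 1
    w, *_ = np.linalg.lstsq(H_full[:, :best_k], y, rcond=None)
    return best_k, w, loo_errors[best_k - 1]

# Toy usage on synthetic data
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(200, 2))
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=200)
k, w, err = op_knn(X, y)
print(f"selected k = {k}, LOO MSE = {err:.4f}")

The PRESS formula makes the Leave-One-Out error of the linear output layer available in closed form, which is what keeps the selection of the number of neighbors inexpensive.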

EPrint Type: Article
Subjects: Computational, Information-Theoretic Learning with Statistics; Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 6654
Deposited By: Amaury Lendasse
Deposited On: 08 March 2010