
Kernel Entropy Based Unsupervised Spectral Feature Selection
Zhihong Zhang and Edwin Hancock
International Journal of Pattern Recognition and Artificial Intelligence, 2012.

Abstract

Most existing feature selection methods focus on ranking individual features based on a utility criterion and select the optimal feature set in a greedy manner. However, the feature combinations found in this way do not give optimal classification performance, since they neglect the correlations among features. In an attempt to overcome this problem, we develop a novel feature selection technique that combines a spectral data transformation with $\ell_{1}$-norm regularized models for subset selection. Specifically, we propose a new two-step spectral regression technique for unsupervised feature selection. In the first step, we use kernel entropy component analysis (kECA) to transform the data into a lower-dimensional space so as to improve class separation. In the second step, we use $\ell_{1}$-norm regularization to select the features that best align with the data embedding resulting from kECA. The advantage of kECA is that its dimensionality-reducing transformation maximally preserves the entropy estimate of the input data whilst also best preserving the cluster structure of the data. Using $\ell_{1}$-norm regularization, we cast feature discriminant analysis into a regression framework that accommodates the correlations among features. As a result, we can evaluate joint feature combinations rather than being confined to considering them individually. Experimental results demonstrate the effectiveness of our feature selection method on a number of standard face datasets.
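
For illustration, the Python sketch below follows the two-step recipe the abstract describes: a kernel entropy component analysis embedding, then an $\ell_{1}$-penalized regression that scores features against that embedding. It is only a minimal reading of the abstract, not the authors' implementation; the RBF kernel, the toy data, scikit-learn's Lasso standing in for the $\ell_{1}$-norm regularized model, and the helper names keca_embedding and select_features are assumptions introduced here.

    # Minimal sketch of the two-step idea: kECA embedding + l1-regularized
    # regression for feature scoring. Assumptions: RBF kernel, toy data,
    # scikit-learn's Lasso as the sparse regression model.
    import numpy as np
    from sklearn.linear_model import Lasso
    from sklearn.metrics.pairwise import rbf_kernel


    def keca_embedding(X, n_components=2, gamma=None):
        """Keep the kernel eigenpairs that contribute most to the Renyi
        entropy estimate, rather than simply the largest eigenvalues."""
        K = rbf_kernel(X, gamma=gamma)                  # N x N kernel matrix
        eigvals, eigvecs = np.linalg.eigh(K)            # ascending eigenvalues
        # Entropy contribution of eigenpair i: (sqrt(lambda_i) * e_i^T 1)^2
        contrib = (np.sqrt(np.clip(eigvals, 0, None)) * eigvecs.sum(axis=0)) ** 2
        top = np.argsort(contrib)[::-1][:n_components]  # most entropy-preserving pairs
        # Embedding coordinates: sqrt(lambda_i) * e_i for the selected pairs
        return eigvecs[:, top] * np.sqrt(np.clip(eigvals[top], 0, None))


    def select_features(X, n_components=2, n_features=10, alpha=0.01):
        """Score features by how well they jointly reconstruct the kECA
        embedding under an l1 penalty; return the strongest feature indices."""
        Y = keca_embedding(X, n_components)
        scores = np.zeros(X.shape[1])
        for j in range(Y.shape[1]):                     # one sparse regression per embedding dimension
            w = Lasso(alpha=alpha).fit(X, Y[:, j]).coef_
            scores += np.abs(w)
        return np.argsort(scores)[::-1][:n_features]


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 50))                  # toy data; a face dataset would be used in practice
        print(select_features(X, n_components=3, n_features=5))

Because each embedding dimension is regressed on all features at once, the $\ell_{1}$ penalty trades features off against one another, which is the sense in which the method evaluates joint feature combinations rather than ranking features individually.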

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Machine Vision
ID Code: 9568
Deposited By: Zhihong Zhang
Deposited On: 28 August 2012