PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Mutual Information Criteria for Feature Selection
Zhihong Zhang and Edwin Hancock
SIMBAD 2011, Volume 7005, pp. 235-249, 2011. ISBN 978-3-642-24470-4


In many data analysis tasks, one is often confronted with very high-dimensional data. The feature selection problem is essentially a combinatorial optimization problem, which is computationally expensive. To overcome this problem, it is frequently assumed either that features influence the class variable independently or that they do so only through pairwise feature interactions. In prior work \cite{zhang2011graph}, we have explained the use of a new measure called multidimensional interaction information (MII) for feature selection. The advantage of MII is that it can take into account third- or higher-order feature interactions. Using dominant-set clustering, we can extract most of the informative features in the leading dominant sets in advance, thereby limiting the search space for higher-order interactions. In this paper, we provide a comparison of different similarity measures based on mutual information. Experimental results demonstrate the effectiveness of our feature selection method on a number of standard data sets.
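To make the underlying quantity concrete, the sketch below estimates the mutual information I(X;Y) between a discrete feature and the class variable, the basic building block that MII-style criteria generalize to higher orders. This is a minimal illustration, not the paper's method: the toy data, feature names `f1`/`f2`, and the plug-in (empirical frequency) estimator are all assumptions for the example.

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """Plug-in estimate of I(X;Y) in bits for two discrete variables."""
    n = len(x)
    px = Counter(x)
    py = Counter(y)
    pxy = Counter(zip(x, y))
    mi = 0.0
    for (a, b), count in pxy.items():
        p_joint = count / n
        p_prod = (px[a] / n) * (py[b] / n)
        mi += p_joint * np.log2(p_joint / p_prod)
    return mi

# Toy data (hypothetical): f1 copies the class label, f2 is independent noise.
rng = np.random.default_rng(0)
labels = rng.integers(0, 2, size=200)
f1 = labels.copy()                     # perfectly informative feature
f2 = rng.integers(0, 2, size=200)      # uninformative feature

scores = {"f1": mutual_information(f1, labels),
          "f2": mutual_information(f2, labels)}
ranked = sorted(scores, key=scores.get, reverse=True)
```

Ranking features by such univariate scores is exactly the independence assumption the abstract criticizes; MII-based selection instead scores feature subsets jointly, so interactions among three or more features can be credited.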

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
Machine Vision
Theory & Algorithms
ID Code: 8510
Deposited By: Zhihong Zhang
Deposited On: 04 February 2012