PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Supervised Feature Selection via Dependence Estimation
Le Song, Alex Smola, Arthur Gretton, Karsten Borgwardt and Justin Bedo
In: ICML 2007, 20-24 June 2007, Oregon, USA.

Abstract

We introduce a framework for filtering features that employs the Hilbert-Schmidt Independence Criterion (HSIC) as a measure of dependence between the features and the labels. The key idea is that good features should maximise such dependence. Feature selection for various supervised learning problems (including classification and regression) is unified under this framework, and the solutions can be approximated using a backward-elimination algorithm. We demonstrate the usefulness of our method on both artificial and real-world datasets.
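A minimal sketch of the idea described in the abstract: the empirical HSIC of a feature subset with the labels is computed from centred kernel matrices, and backward elimination repeatedly drops the feature whose removal reduces that dependence the least. The function names (rbf_kernel, hsic, backward_elimination), the single shared bandwidth sigma, and the use of a Gaussian kernel on the labels are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix over the rows of X
    sq = np.sum(X**2, axis=1)
    dists = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-dists / (2.0 * sigma**2))

def hsic(K, L):
    # Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def backward_elimination(X, y, n_features_to_keep, sigma=1.0):
    # Greedily remove features so that HSIC(selected features, labels) stays as large as possible
    L = rbf_kernel(y.reshape(-1, 1), sigma)  # label kernel; a delta kernel is another option for classification
    selected = list(range(X.shape[1]))
    while len(selected) > n_features_to_keep:
        scores = []
        for f in selected:
            remaining = [g for g in selected if g != f]
            K = rbf_kernel(X[:, remaining], sigma)
            scores.append(hsic(K, L))
        # drop the feature whose removal hurts the dependence measure the least
        selected.pop(int(np.argmax(scores)))
    return selected
```

As a usage note under the same assumptions, calling backward_elimination(X, y, 5) on an (n x d) data matrix returns the indices of five retained features; each outer iteration recomputes one kernel matrix per candidate feature, so the sketch is intended for clarity rather than for large d.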

EPrint Type: Conference or Workshop Item (Paper)
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
Theory & Algorithms
ID Code: 3137
Deposited By: Arthur Gretton
Deposited On: 21 December 2007