PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Diversity in Neural Network Ensembles
Gavin Brown
(2004) PhD thesis, University of Birmingham.

Abstract

This work studies the phenomenon of "diversity" in ensemble learning. In ensembles of regression estimators, diversity can be measured via the bias-variance-covariance decomposition; in classification ensembles, no such neat theory exists. The objective of the thesis is to study how to measure and enforce appropriate diversity in both cases. As a focal point we choose one successful algorithm for neural network ensembles, Negative Correlation Learning (NCL), which uses a penalty regularisation term to control diversity. We provide a solid theoretical grounding for this algorithm by linking its penalty term to the bias-variance-covariance decomposition, demonstrating how NCL can explicitly manage the accuracy-diversity tradeoff to achieve lower generalisation error. We furthermore show bounds on the configurable parameters of the algorithm, based on the condition of the Hessian matrix. The findings are not solely properties of neural network ensembles, but rather of the squared loss function - we show that any generalised regression estimator can make use of the NC regularisation framework.
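To illustrate the idea the abstract describes, the following is a minimal sketch of Negative Correlation Learning for an ensemble of regressors. It assumes the standard NC formulation in which member i minimises e_i = ½(f_i - y)² + λ·p_i with penalty p_i = (f_i - f̄)·Σ_{j≠i}(f_j - f̄) = -(f_i - f̄)², giving the per-pattern gradient (f_i - y) - λ(f_i - f̄). Linear members trained by batch gradient descent stand in for the neural networks of the thesis; all function names and parameters here are illustrative, not the thesis code.

```python
import numpy as np

def train_nc_ensemble(X, y, n_members=5, lam=0.5, lr=0.1, epochs=500, seed=0):
    """Train an ensemble of linear regressors with Negative Correlation
    Learning (illustrative sketch, not the thesis implementation).

    Each member i is trained on the NC error
        e_i = 0.5 * (f_i - y)**2 + lam * p_i,
    whose gradient with respect to the member's output is
        (f_i - y) - lam * (f_i - fbar),
    where fbar is the ensemble mean. lam = 0 recovers independent
    training; larger lam pushes members away from the ensemble mean.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W = rng.normal(scale=0.1, size=(n_members, d))  # one weight row per member
    for _ in range(epochs):
        F = X @ W.T                       # (n, n_members) member outputs
        fbar = F.mean(axis=1, keepdims=True)
        # NC gradient w.r.t. each member's output on each pattern
        G = (F - y[:, None]) - lam * (F - fbar)
        # Batch gradient step on each member's weights
        W -= lr * (G.T @ X) / n
    return W

def ensemble_predict(W, X):
    """Ensemble output is the simple average of the members."""
    return (X @ W.T).mean(axis=1)
```

Averaging the member gradients shows why this works: the λ terms cancel across the ensemble, so the ensemble mean still descends the plain squared error while individual members are decorrelated around it.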

EPrint Type: Thesis (PhD)
Additional Information: Winner of the British Computer Society Distinguished Dissertation Award, 2004.
Subjects: Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 4765
Deposited By: Gavin Brown
Deposited On: 24 March 2009