PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Bias-Variance tradeoff in Hybrid Generative-Discriminative models
Guillaume Bouchard
Proceedings of the Sixth International Conference on Machine Learning and Applications (ICMLA'07), 2007.

Abstract

Given any generative classifier based on an inexact density model, we can define a discriminative counterpart that reduces its asymptotic error rate while increasing the estimation variance. An optimal bias-variance balance might be found using Hybrid Generative-Discriminative (HGD) approaches. In this paper, these methods are defined in a unified framework, which allows us to find sufficient conditions under which an improvement in generalization performance is guaranteed. Numerical experiments illustrate the well-foundedness of our statements.
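
To make the bias-variance tradeoff concrete, the sketch below interpolates between a generative objective (joint log-likelihood) and a discriminative one (conditional log-likelihood) for a deliberately misspecified two-class Gaussian model. The interpolation weight lam, the Gaussian parameterization, and the data-generating process are illustrative assumptions, not the formulation studied in the paper; lam = 1 recovers the purely generative fit and lam = 0 the purely discriminative one.

    # A minimal sketch (assumed formulation, not the paper's): a convex
    # combination of the joint log-likelihood log p(x, y) (generative term)
    # and the conditional log-likelihood log p(y | x) (discriminative term).
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm


    def hybrid_neg_log_lik(params, X, y, lam):
        """Negative hybrid objective for a two-class 1-D Gaussian model.

        params = (mu0, mu1, log_sigma, logit_prior); lam = 1 is purely
        generative, lam = 0 is purely discriminative.
        """
        mu = np.array([params[0], params[1]])
        sigma = np.exp(params[2])
        prior1 = 1.0 / (1.0 + np.exp(-params[3]))
        priors = np.array([1.0 - prior1, prior1])

        # log p(x | y=k) for both classes, shape (n, 2)
        log_px_given_y = norm.logpdf(X[:, None], loc=mu[None, :], scale=sigma)
        log_joint = log_px_given_y + np.log(priors)[None, :]     # log p(x, y=k)
        log_px = np.logaddexp(log_joint[:, 0], log_joint[:, 1])  # log p(x)

        joint_ll = log_joint[np.arange(len(y)), y].sum()           # generative
        cond_ll = (log_joint[np.arange(len(y)), y] - log_px).sum() # discriminative
        return -(lam * joint_ll + (1.0 - lam) * cond_ll)


    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        n = 200
        y = rng.integers(0, 2, size=n)
        # Class 1 is non-Gaussian, so the generative model is misspecified
        # (biased) and the choice of lam actually matters.
        X = np.where(y == 1, rng.gamma(3.0, 1.0, n) + 1.0, rng.normal(0.0, 1.0, n))

        for lam in (1.0, 0.5, 0.0):
            res = minimize(hybrid_neg_log_lik, x0=np.zeros(4), args=(X, y, lam))
            mu0, mu1, log_sigma, _ = res.x
            print(f"lam={lam}: mu0={mu0:.2f}, mu1={mu1:.2f}, "
                  f"sigma={np.exp(log_sigma):.2f}")

In this kind of setup, intermediate values of lam trade the bias of the misspecified generative fit against the higher estimation variance of the discriminative fit, which is the balance the paper analyzes.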

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics; Theory & Algorithms
ID Code: 3070
Deposited By: Guillaume Bouchard
Deposited On: 05 December 2007