Bias-Variance tradeoff in Hybrid Generative-Discriminative models
Proceedings of the Sixth International Conference on Machine Learning and Applications (ICMLA'07)
Given any generative classifier based on an inexact density model, we can
define a discriminative counterpart that reduces its asymptotic error rate,
while increasing the estimation variance. An optimal bias-variance
balance might be found using Hybrid Generative-Discriminative (HGD) approaches. In this paper, these methods are defined in a unified framework.
This allows us to derive sufficient conditions under which an improvement in
generalization performance is guaranteed. Numerical experiments illustrate
the soundness of our claims.
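The tradeoff above can be illustrated with a minimal sketch (not the paper's experimental setup): a deliberately inexact generative model (Gaussian class-conditionals with a pooled variance, when the true class variances differ), its discriminative counterpart (logistic regression on the same feature), and a hybrid that blends their log-odds with a mixing weight `lam`. All names and data here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 1-D two-class data; the generative model below is
# deliberately inexact: it pools variances, but the true ones differ.
n = 500
X0 = rng.normal(-1.0, 1.0, n)
X1 = rng.normal(+1.0, 2.0, n)
X = np.concatenate([X0, X1])
y = np.concatenate([np.zeros(n), np.ones(n)])

# Generative fit: MLE class means, pooled variance, equal priors.
m0, m1 = X0.mean(), X1.mean()
var = np.concatenate([X0 - m0, X1 - m1]).var()
gen_logodds = ((X - m0) ** 2 - (X - m1) ** 2) / (2 * var)

# Discriminative counterpart: logistic regression on the same feature,
# trained by gradient ascent on the conditional log-likelihood.
w, b = 0.0, 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(w * X + b)))
    w += 0.1 * np.mean((y - p) * X)
    b += 0.1 * np.mean(y - p)
disc_logodds = w * X + b

# Hybrid: convex combination of the two log-odds; lam trades the
# generative model's bias against the discriminative estimator's variance.
lam = 0.5
hyb_logodds = lam * gen_logodds + (1 - lam) * disc_logodds

errs = {}
for name, s in [("generative", gen_logodds),
                ("discriminative", disc_logodds),
                ("hybrid", hyb_logodds)]:
    errs[name] = np.mean((s > 0) != (y == 1))
    print(name, round(errs[name], 3))
```

On larger samples the discriminative and hybrid rules tend to recover from the pooled-variance misspecification; on small samples the generative side of the blend stabilizes the estimate, which is the bias-variance balance the abstract refers to.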