Kullback-Leibler Divergence Estimation of Continuous Distributions
In: IEEE International Symposium on Information Theory (ISIT), July 2008, Toronto, Canada.
We present a method for estimating the KL divergence
between continuous densities and we prove it converges
almost surely. Divergence estimation is typically solved by estimating
the densities first. Our main result shows that this intermediate step is
unnecessary and that the divergence can be estimated using either
the empirical cdf or k-nearest-neighbour density estimation,
which does not converge to the true measure for finite k. The
convergence proof is based on describing the statistics of our
estimator using waiting-time distributions, such as the exponential
or Erlang distributions. We illustrate the proposed estimators and show how
they compare to existing methods based on density estimation,
and we also outline how our divergence estimators can be used
for solving the two-sample problem.
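To make the k-nearest-neighbour approach concrete, below is a minimal sketch (not code from the paper) of a k-NN divergence estimator of the form D_hat = (d/n) * sum_i log(s_k(x_i) / r_k(x_i)) + log(m / (n - 1)), where r_k(x_i) is the distance from x_i to its k-th nearest neighbour among the other samples of p, and s_k(x_i) is the distance to its k-th nearest neighbour among the samples of q. The function name kl_divergence_knn and the use of scipy's cKDTree are our own illustrative choices.

```python
import numpy as np
from scipy.spatial import cKDTree


def kl_divergence_knn(x, y, k=1):
    """k-NN estimate of D(p||q) from samples x ~ p and y ~ q.

    x: (n, d) array of n samples from p.
    y: (m, d) array of m samples from q.
    """
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n, d = x.shape
    m = y.shape[0]
    # r_k(x_i): distance from x_i to its k-th nearest neighbour in x \ {x_i};
    # querying the (k+1)-th neighbour skips the zero self-distance.
    r = cKDTree(x).query(x, k=[k + 1])[0][:, 0]
    # s_k(x_i): distance from x_i to its k-th nearest neighbour in y.
    s = cKDTree(y).query(x, k=[k])[0][:, 0]
    # D_hat = (d/n) * sum_i log(s_k(x_i) / r_k(x_i)) + log(m / (n - 1))
    return d / n * np.sum(np.log(s / r)) + np.log(m / (n - 1))
```

As a sanity check under assumed test data, D(N(0,1) || N(1,1)) = 0.5 in closed form, and the estimate should approach that value as the sample sizes grow:

```python
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(10000, 1))
y = rng.normal(1.0, 1.0, size=(10000, 1))
print(kl_divergence_knn(x, y, k=1))  # close to 0.5, up to estimation error
```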