Entropy Testing is Efficient
Peter Harremoës and Igor Vajda
In: Proceedings ISIT 2007 (2007), IEEE, pp. 1841-1846. ISBN 1-4244-1429-6

## Abstract

This paper compares the power divergence statistics of orders $\alpha>1$ with the information divergence statistic in the problem of testing the uniformity of a distribution. In this problem the information divergence statistic is equivalent to the entropy statistic. Extending some previously established results about information diagrams, it is proved that in this problem the information divergence statistic is more efficient in the Bahadur sense than any power divergence statistic of order $\alpha>1$. This means that, in this sense, the entropy statistic provides the most efficient way of characterizing the uniformity of a distribution.
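The two families of statistics compared in the abstract can be sketched concretely. Below is a minimal illustration, assuming the standard Cressie-Read parametrization of the power divergence and the uniform null distribution on $k$ cells; the function names are illustrative, not taken from the paper.

```python
import math

def _empirical(counts):
    """Empirical cell probabilities and sample size from observed counts."""
    n = sum(counts)
    return [c / n for c in counts], n

def power_divergence_stat(counts, alpha):
    """Cressie-Read power divergence statistic of order alpha against the
    uniform distribution on k = len(counts) cells:
        2n / (alpha*(alpha-1)) * sum_i phat_i * ((k*phat_i)^(alpha-1) - 1).
    For alpha = 2 this reduces to the Pearson chi-square statistic."""
    phat, n = _empirical(counts)
    k = len(counts)
    s = sum(p * ((k * p) ** (alpha - 1) - 1) for p in phat)
    return 2 * n * s / (alpha * (alpha - 1))

def entropy_stat(counts):
    """Information divergence statistic 2n * D(phat || uniform), which equals
    2n * (log k - H(phat)) and is therefore equivalent to the entropy H(phat)."""
    phat, n = _empirical(counts)
    k = len(counts)
    return 2 * n * sum(p * math.log(k * p) for p in phat if p > 0)
```

For example, with counts `[30, 20, 25, 25]` (so $n=100$, $k=4$), `power_divergence_stat(counts, 2.0)` reproduces the Pearson chi-square value $2.0$, while `entropy_stat` is the limit of the power divergence family as $\alpha \to 1$.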