# On the Bahadur-Efficient Testing of Uniformity by Means of Entropy

## Abstract

This paper compares the power divergence statistics of order $\alpha > 1$ with the information divergence statistic in the problem of testing the uniformity of a distribution. In this problem the information divergence statistic is equivalent to the entropy statistic. Extending previously established results about information diagrams, it is proved that in this problem the information divergence statistic is more efficient in the Bahadur sense than any power divergence statistic of order $\alpha > 1$. In this sense, entropy provides the most efficient way of characterizing the uniform distribution.
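For context, the statistics the abstract compares are presumably the standard ones: the power divergence of order $\alpha$ (the Cressie–Read family) and the information (Kullback–Leibler) divergence. For an empirical distribution $P = (p_1, \dots, p_m)$ tested against the uniform distribution $U = (1/m, \dots, 1/m)$, a sketch of the standard definitions (not taken verbatim from the paper) is:

```latex
% Power divergence of order \alpha (Cressie-Read family), evaluated
% against the uniform null U = (1/m, ..., 1/m):
D_\alpha(P \,\|\, U)
  = \frac{1}{\alpha(\alpha - 1)}
    \left( \sum_{i=1}^{m} p_i^{\alpha} \, (1/m)^{1-\alpha} - 1 \right)

% Information divergence (the \alpha \to 1 limit), which against the
% uniform null reduces to \log m minus the Shannon entropy H(P):
D(P \,\|\, U) = \sum_{i=1}^{m} p_i \log (m \, p_i) = \log m - H(P)
```

The last identity is why, under the uniform null, the information divergence statistic and the entropy statistic are equivalent, which is the equivalence the abstract invokes.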