A Fast Normalized Maximum Likelihood Algorithm for Multinomial Data
Petri Kontkanen and Petri Myllymäki
In: 19th International Joint Conference on Artificial Intelligence, 30 Jul - 5 Aug 2005, Edinburgh, Scotland.
The stochastic complexity of a data set is defined as the shortest possible code length for the data obtainable by using some fixed set of models. This measure is of great theoretical and practical importance as a tool for tasks such as model selection and data clustering. In the case of multinomial data, computing the modern version of stochastic complexity, defined via the Normalized Maximum Likelihood (NML) criterion, requires evaluating a sum with an exponential number of terms. Furthermore, applying NML in practice often requires a whole table of these exponential sums. In our previous work, we showed how to compute this table with a recursive algorithm. The purpose of this paper is to significantly improve the time complexity of that algorithm. The techniques used here are based on the discrete Fourier transform and the convolution theorem.