PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Optimal Error Exponents in Hidden Markov Models Order Estimation
Elisabeth Gassiat and Stephane Boucheron
IEEE Transactions on Information Theory, Volume 49, Number 4, pp. 964-980, 2003.


We consider the estimation of the number of hidden states (the order) of a discrete-time finite-alphabet Hidden Markov Model (HMM). The estimators we investigate are related to code-based order estimators: penalized maximum likelihood estimators and penalized versions of the mixture estimator introduced by Liu and Narayan. We prove strong consistency of those estimators without assuming any a priori upper bound on the order, and with smaller penalties than in previous works. We prove a version of Stein's lemma for HMM order estimation and derive an upper bound on under-estimation exponents. We then prove that this upper bound can be achieved by the penalized maximum likelihood estimator and by the penalized mixture estimator. The proof of the latter result gets around the elusive nature of the maximum likelihood in HMMs by resorting to large deviation techniques for empirical processes. Finally, we prove that for any consistent HMM order estimator, for most HMMs, the over-estimation exponent is zero.
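To illustrate the shape of the penalized likelihood criterion discussed in the abstract, the sketch below selects an HMM order by maximizing log-likelihood minus a penalty growing with the number of free parameters. This is a simplified toy, not the paper's estimator: the candidate parameter sets are supplied directly (here the true 2-state model, a 1-state i.i.d. model, and a redundant 3-state model that is lumpable to the 2-state one), and the BIC-style penalty constant `c` is an illustrative choice.

```python
import numpy as np

def log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete-observation HMM via the scaled forward algorithm."""
    alpha = pi * B[:, obs[0]]
    ll = np.log(alpha.sum())
    alpha = alpha / alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        s = alpha.sum()
        ll += np.log(s)
        alpha = alpha / s
    return ll

def dim(k, m):
    # free parameters of a k-state HMM on an m-letter alphabet:
    # initial law, transition matrix rows, emission rows
    return (k - 1) + k * (k - 1) + k * (m - 1)

def penalized_order_estimate(obs, candidates, c=0.5):
    """Pick the order k maximizing log-lik minus a penalty c * dim(k, m) * log(n)."""
    n = len(obs)
    m = candidates[min(candidates)][2].shape[1]
    scores = {k: log_likelihood(obs, *params) - c * dim(k, m) * np.log(n)
              for k, params in candidates.items()}
    return max(scores, key=scores.get)

# Simulate a sticky 2-state HMM on a binary alphabet.
rng = np.random.default_rng(0)
A2 = np.array([[0.9, 0.1], [0.1, 0.9]])
B2 = np.array([[0.9, 0.1], [0.1, 0.9]])
pi2 = np.array([0.5, 0.5])
n = 500
states = np.empty(n, dtype=int)
obs = np.empty(n, dtype=int)
states[0] = rng.choice(2, p=pi2)
obs[0] = rng.choice(2, p=B2[states[0]])
for t in range(1, n):
    states[t] = rng.choice(2, p=A2[states[t - 1]])
    obs[t] = rng.choice(2, p=B2[states[t]])

candidates = {
    1: (np.array([1.0]), np.array([[1.0]]), np.array([[0.5, 0.5]])),
    2: (pi2, A2, B2),
    # 3-state model lumpable to the 2-state one: identical likelihood, larger penalty
    3: (np.array([0.5, 0.25, 0.25]),
        np.array([[0.9, 0.05, 0.05], [0.1, 0.45, 0.45], [0.1, 0.45, 0.45]]),
        np.array([[0.9, 0.1], [0.1, 0.9], [0.1, 0.9]])),
}
k_hat = penalized_order_estimate(obs, candidates)
```

The redundant 3-state candidate achieves exactly the same likelihood as the true 2-state model, so only the penalty separates them; this is the mechanism by which the penalty controls over-estimation, while consistency requires the penalty to stay small enough not to cause under-estimation.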

EPrint Type: Article
Project Keyword: UNSPECIFIED
Subjects: Computational, Information-Theoretic Learning with Statistics
ID Code: 601
Deposited By: Elisabeth Gassiat
Deposited On: 29 December 2004