Fast MCMC sampling for Markov jump processes and continuous time Bayesian networks
Markov jump processes and continuous time Bayesian networks are important classes of continuous time dynamical systems. In this paper, we tackle the problem of inferring unobserved paths in these models by introducing a fast auxiliary variable Gibbs sampler. Our approach is based on the idea of uniformization, and sets up a Markov chain over paths by sampling a finite set of virtual jump times and then running a standard hidden Markov model forward filtering-backward sampling algorithm over states at the set of extant and virtual jump times. We demonstrate significant computational benefits over a state-of-the-art Gibbs sampler on a number of continuous time Bayesian networks.
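To make the abstract's core step concrete, the following is a minimal sketch (not the paper's implementation) of the discrete-time ingredient it describes: given the uniformized transition matrix B = I + Q/Ω of a Markov jump process with rate matrix Q and uniformization rate Ω, run forward filtering-backward sampling (FFBS) over the states at a fixed grid of candidate jump times. The function name, the choice Ω = 2 max_i |Q_ii|, and the toy observation model are all illustrative assumptions.

```python
import numpy as np

def ffbs_over_jump_times(B, pi0, log_liks, rng):
    """One FFBS draw for a discrete-time HMM (illustrative sketch).

    B        : (n, n) transition matrix of the uniformized chain, B = I + Q/Omega.
    pi0      : (n,) initial state distribution.
    log_liks : (K, n) per-time observation log-likelihoods at the K candidate times.
    Returns a posterior sample of the state sequence at those times.
    """
    K, n = log_liks.shape
    alphas = np.empty((K, n))

    # Forward filtering: alpha_k proportional to P(state_k, observations up to k).
    alpha = pi0 * np.exp(log_liks[0])
    alphas[0] = alpha / alpha.sum()
    for k in range(1, K):
        alpha = (alphas[k - 1] @ B) * np.exp(log_liks[k])
        alphas[k] = alpha / alpha.sum()

    # Backward sampling: draw the last state, then each state given its successor.
    states = np.empty(K, dtype=int)
    states[-1] = rng.choice(n, p=alphas[-1])
    for k in range(K - 2, -1, -1):
        p = alphas[k] * B[:, states[k + 1]]
        states[k] = rng.choice(n, p=p / p.sum())
    return states

# Toy 2-state MJP: build the uniformized chain and take one FFBS draw.
Q = np.array([[-1.0, 1.0],
              [2.0, -2.0]])                 # rate matrix (rows sum to zero)
omega = 2.0 * np.max(-np.diag(Q))           # any Omega > max_i |Q_ii| works
B = np.eye(2) + Q / omega                   # transition matrix of subordinated chain
rng = np.random.default_rng(0)
log_liks = np.zeros((5, 2))                 # uninformative observations, 5 candidate times
states = ffbs_over_jump_times(B, np.array([0.5, 0.5]), log_liks, rng)
```

In the full sampler described above, the candidate times themselves are resampled each iteration (virtual jumps from a Poisson process, plus the current path's actual jumps), and self-transitions in the sampled sequence are discarded to recover a continuous-time path; this sketch shows only the HMM step on a fixed grid.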