Speeding Up Planning in Markov Decision Processes via Automatically Constructed Abstractions
Alejandro Isaza, Csaba Szepesvari, Vadim Bulitko and Russ Greiner
In: UAI-08, 9-12 July 2008, Helsinki, Finland.
In this paper, we consider planning in stochastic shortest path problems, a subclass of Markov decision processes (MDPs). We focus on medium-sized problems whose state space can be fully enumerated. This class has numerous important applications, such as navigation and planning under uncertainty. We propose a new approach for constructing a multi-level hierarchy of progressively simpler abstractions of the original problem. Once computed, the hierarchy can be used to speed up planning by first finding a policy for the most abstract level and then recursively refining it into a solution to the original problem. The approach is fully automated and, on sample problems, delivers a speed-up of two orders of magnitude over a state-of-the-art MDP solver while returning near-optimal solutions.