PASCAL - Pattern Analysis, Statistical Modelling and Computational Learning

Speeding Up Planning in Markov Decision Processes via Automatically Constructed Abstractions
Alejandro Isaza, Csaba Szepesvari, Vadim Bulitko and Russ Greiner
In: UAI-08, 9-12 July 2008, Helsinki, Finland.

Abstract

In this paper, we consider planning in stochastic shortest path problems, a subclass of Markov Decision Processes (MDPs). We focus on medium-sized problems whose state space can be fully enumerated. This class of problems has numerous important applications, such as navigation and planning under uncertainty. We propose a new approach for constructing a multi-level hierarchy of progressively simpler abstractions of the original problem. Once computed, the hierarchy can be used to speed up planning by first finding a policy for the most abstract level and then recursively refining it into a solution to the original problem. This approach is fully automated and delivers a speed-up of two orders of magnitude over a state-of-the-art MDP solver on sample problems while returning near-optimal solutions.
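The abstract gives the general recipe (build a hierarchy of progressively coarser abstractions, solve the most abstract level, then recursively refine the plan downwards) but not its details. Below is a minimal, illustrative Python sketch of that recipe on a toy stochastic shortest path chain. The hand-picked state grouping, the abstract dynamics, the value-iteration solver, and all names (step_dist, value_iteration, abstract, refine) are assumptions made for the example; they are not the paper's automatic abstraction-construction method.

import random

# Toy stochastic shortest path problem on a chain (an assumption for this
# sketch; the paper's domains and abstraction construction differ).
N = 12              # concrete states 0 .. N-1; state N-1 is the goal
GROUP = 3           # concrete states per abstract state (hand-picked grouping)
ACTIONS = (-1, +1)  # move left / move right
BIG = 1e6           # value assigned to leaving a region in the wrong direction

def step_dist(s, a):
    # The intended move succeeds with probability 0.8, otherwise the agent stays.
    return [(min(max(s + a, 0), N - 1), 0.8), (s, 0.2)]

def value_iteration(states, trans, terminal, cost=1.0, sweeps=200):
    # Generic value iteration for a shortest-path-style MDP. `terminal` maps
    # absorbing states to fixed values; every successor must lie in `states`
    # or in `terminal`. Returns a greedy policy over `states`.
    V = {s: 0.0 for s in states}
    V.update(terminal)
    pi = {}
    for _ in range(sweeps):
        for s in states:
            qs = [(cost + sum(p * V[s2] for s2, p in trans(s, a)), a) for a in ACTIONS]
            V[s], pi[s] = min(qs)
    return pi

# Level 1: an abstract MDP whose states are groups of GROUP concrete states.
def abstract(s):
    return s // GROUP

abs_goal = abstract(N - 1)

def abs_step(g, a):
    # Crude deterministic abstract dynamics between neighbouring groups.
    return [(min(max(g + a, 0), abs_goal), 1.0)]

abs_pi = value_iteration(list(range(abs_goal)), abs_step, terminal={abs_goal: 0.0})

# Level 0: refine the abstract policy into concrete actions, one group at a time.
def refine(s):
    # Plan only inside s's group: entering the group the abstract policy points
    # to is a free terminal, entering any other group is heavily penalised.
    g = abstract(s)
    group = [x for x in range(N) if abstract(x) == g]
    if g == abs_goal:
        terminal = {N - 1: 0.0, min(group) - 1: BIG}
        group.remove(N - 1)
    else:
        towards = abs_pi[g]   # +1 or -1, taken from the abstract plan
        good = max(group) + 1 if towards > 0 else min(group) - 1
        bad = min(group) - 1 if towards > 0 else max(group) + 1
        terminal = {good: 0.0, bad: BIG}
    return value_iteration(group, step_dist, terminal)[s]

# Follow the refined policy from state 0 until the goal is reached.
random.seed(0)
s, steps = 0, 0
while s != N - 1 and steps < 200:
    (s_succ, p), (s_stay, _) = step_dist(s, refine(s))
    s = s_succ if random.random() < p else s_stay
    steps += 1
print("reached goal state", N - 1, "after", steps, "steps")

The point the sketch tries to illustrate is where the speed-up comes from: the refinement step only ever plans over one small group of concrete states at a time, guided by the abstract policy, instead of solving the full MDP in one pass.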

EPrint Type: Conference or Workshop Item (Paper)
Subjects: Learning/Statistics & Optimisation; Theory & Algorithms
ID Code: 4935
Deposited By: Csaba Szepesvari
Deposited On: 24 March 2009