A Bayesian framework for extracting human gait using strong prior knowledge
Ziheng Zhou, Adam Prügel-Bennett and Robert I. Damper
IEEE Transactions on Pattern Analysis and Machine Intelligence
Extracting the full-body motion of walking people from monocular video sequences in complex, real-world environments is an important and difficult problem; it goes beyond simple tracking, and its satisfactory solution demands an appropriate balance between the use of prior knowledge and learning from data. We propose a consistent Bayesian framework for introducing strong prior knowledge into a system for extracting human gait. The strong prior is built from a simple articulated model having both time-invariant (static) and time-variant (dynamic) parameters. The statistics of these parameters are learned from high-quality (indoor laboratory) data, and the Bayesian framework then allows us to 'bootstrap' to performing on the noisy images typical of cluttered, outdoor scenes. To achieve automatic fitting, we use a hidden Markov model to detect the phases of images in a walking cycle. We demonstrate our approach on high-quality indoor video data, noisy outdoor video data, and some difficult synthetic data. Results are quantified in terms of the chamfer distance and the average pixel error between automatically extracted body points and corresponding points hand-labeled after gait extraction.
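The abstract's hidden Markov model assigns each frame of a walking sequence to a phase of the gait cycle. A standard way to decode the most likely phase sequence is the Viterbi algorithm; the sketch below is illustrative only, with made-up transition and emission scores rather than anything learned from the paper's data.

```python
import numpy as np

def viterbi(log_trans, log_emit):
    """Most likely phase sequence for a frame sequence.

    log_trans: (K, K) log transition matrix over K gait phases
    log_emit:  (T, K) per-frame log-likelihood under each phase
    (Hypothetical inputs; the paper's actual model parameters differ.)
    """
    T, K = log_emit.shape
    score = log_emit[0].copy()           # assume a uniform initial phase prior
    back = np.zeros((T, K), dtype=int)   # backpointers for path recovery
    for t in range(1, T):
        cand = score[:, None] + log_trans    # (prev phase, next phase) scores
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0) + log_emit[t]
    path = [int(score.argmax())]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy example with 2 phases: frames clearly favour phase 0 then phase 1
log_trans = np.log([[0.8, 0.2], [0.2, 0.8]])
log_emit = np.log([[0.9, 0.1], [0.9, 0.1], [0.1, 0.9], [0.1, 0.9]])
print(viterbi(log_trans, log_emit))  # [0, 0, 1, 1]
```

In practice the transition matrix for a walking cycle would be cyclic (each phase passes to the next, with the last wrapping to the first), which this toy two-phase example does not capture.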
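One of the reported evaluation metrics is the chamfer distance between automatically extracted body points and hand-labeled reference points. As a rough illustration, a symmetric point-set chamfer distance can be computed as below; this is a generic sketch, not the paper's exact variant.

```python
import numpy as np

def chamfer_distance(extracted, reference):
    """Symmetric chamfer distance between two 2-D point sets.

    For each point in one set, take the Euclidean distance to its
    nearest neighbour in the other set; average both directions.
    (Illustrative only; edge-map variants use a distance transform.)
    """
    a = np.asarray(extracted, dtype=float)
    b = np.asarray(reference, dtype=float)
    # Pairwise distances, shape (len(a), len(b)), via broadcasting
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    return 0.5 * (d.min(axis=1).mean() + d.min(axis=0).mean())

# Hypothetical extracted vs. hand-labeled body points (pixels)
auto = [(10.0, 20.0), (15.0, 25.0)]
hand = [(10.0, 21.0), (16.0, 25.0)]
print(chamfer_distance(auto, hand))  # 1.0
```

For image-based fitting, the same idea is usually implemented with a distance transform of the reference edge map so that each candidate point can be scored in constant time.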