Online appearance-based face and facial feature tracking
We propose a simple framework that uses online appearance models for 3D face and facial feature tracking with a deformable model. The geometric parameters are adapted for each frame by steepest ascent on the observation likelihood, using a local exhaustive and directed search in the parameter space. The observation likelihood is based on the current appearance model and the registered images. The developed framework is straightforward and has the following advantages. First, it does not require any a priori statistical facial texture model. Second, it does not require any a priori transition model for the 3D motion. Video sequences featuring large head motions, large facial animations, and external illumination variations are successfully tracked, demonstrating the efficiency of the developed framework.
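To make the search strategy concrete, the following is a minimal sketch (not the paper's implementation) of steepest ascent on a Gaussian observation likelihood via a local exhaustive, directed search over the parameter vector. All names (`observe`, `likelihood`, `local_directed_search`) and the toy observation function are hypothetical; a real tracker would warp image pixels with the deformable 3D model and update the appearance mean/variance online.

```python
import numpy as np

def likelihood(params, observe, appearance_mean, appearance_var):
    """Gaussian observation log-likelihood of the patch extracted with
    `params` under the current online appearance model (per-pixel mean
    and variance).  Higher is better."""
    patch = observe(params)
    return -0.5 * np.sum((patch - appearance_mean) ** 2 / appearance_var)

def local_directed_search(params, observe, mean, var, steps, n_iters=20):
    """Steepest ascent on the likelihood: each iteration exhaustively
    tries +/- one step along every parameter axis and keeps any move
    that improves the score; stops at a local maximum."""
    best = likelihood(params, observe, mean, var)
    for _ in range(n_iters):
        improved = False
        for i in range(len(params)):
            for delta in (+steps[i], -steps[i]):
                cand = params.copy()
                cand[i] += delta
                score = likelihood(cand, observe, mean, var)
                if score > best:
                    best, params, improved = score, cand, True
        if not improved:  # no neighbour improves: local maximum reached
            break
    return params, best

# Toy stand-in for image registration: the extracted patch matches the
# appearance mean exactly when params equal the (unknown) true pose.
true_params = np.array([2.0, -1.0])
mean, var = np.zeros(2), np.ones(2)
observe = lambda p: p - true_params  # hypothetical observation function

est, score = local_directed_search(
    np.zeros(2), observe, mean, var, steps=np.array([0.5, 0.5]))
```

Here the search recovers `true_params` because the step size divides the offset evenly; in practice the paper's framework would run such a search per frame over the 3D head-pose and facial-animation parameters.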