Omnidirectional Sparse Visual Path Following with Occlusion-Robust Feature Tracking
Omnidirectional vision sensors are attractive for autonomous robots because they offer a rich source of environment information. The main challenge in using them on mobile robots is managing this wealth of information. A relatively recent approach is the use of fast wide-baseline local features, which we developed and use in the novel sparse visual path-following method described in this paper. These local features have the great advantage that they can be recognized even when the viewpoint differs significantly. This opens the door to a memory-efficient description of a path by sampling it sparsely with images. We propose a method for re-executing such paths as a series of visual homing operations. Motion is estimated by tracking the feature set simultaneously, with lost features recovered by backprojecting them from a local sparse 3D feature map. This yields a navigation method with a unique combination of properties: it is accurate, robust, fast, and free of odometry error buildup.
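The recovery step described above can be illustrated with a minimal sketch. This is not the paper's implementation; it assumes a central omnidirectional camera modeled as a unit viewing sphere, and all names (`backproject_feature`, the pose parameters) are illustrative. The idea is that a lost feature's stored 3D map position can be transformed into the current camera frame and projected to a bearing direction, which then seeds a local image search to re-acquire the feature.

```python
import numpy as np

def backproject_feature(point_w, R_wc, t_wc):
    """Backproject a 3D map point (world frame) onto the unit viewing
    sphere of a central omnidirectional camera at pose (R_wc, t_wc),
    where R_wc rotates camera-frame vectors into the world frame and
    t_wc is the camera center in world coordinates.

    Returns the unit bearing vector in the camera frame, or None if
    the point coincides with the camera center. This bearing predicts
    where a lost feature should reappear, so the tracker can search
    only a small neighborhood instead of the whole image.
    """
    p_c = R_wc.T @ (point_w - t_wc)   # express the point in the camera frame
    norm = np.linalg.norm(p_c)
    if norm < 1e-9:                   # degenerate: point at the camera center
        return None
    return p_c / norm                 # unit bearing on the viewing sphere

# Illustrative use: with the camera at the origin and identity rotation,
# a point straight ahead on the optical axis maps to the bearing (0, 0, 1).
bearing = backproject_feature(np.array([0.0, 0.0, 2.0]),
                              np.eye(3), np.zeros(3))
```

A spherical (bearing-vector) model is a common choice for omnidirectional cameras because it avoids committing to a particular mirror or fisheye lens geometry; any calibrated central camera can be reduced to it.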