In this paper we study how estimates of ego-motion based on feature tracking (visual odometry) can be improved using a rough (low-accuracy) map of where the observer has been. We call the process of aligning the visual ego-motion with the map locations map correlation. Since absolute estimates of camera position are unreliable, we use stable local information, such as change in orientation, to perform the alignment. We also detect when the observer's path has crossed back on itself, which helps improve both the visual odometry estimates and the alignment between the video and map sequences. The final alignment is computed using a graphical model whose MAP estimate is inferred using loopy belief propagation. Results are presented on a number of indoor and outdoor sequences.
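As a rough illustration of the idea, and not the authors' method (the paper infers the alignment with a graphical model and loopy belief propagation), the sketch below matches a visual-odometry track against a map path using per-step changes in heading, which are more stable than absolute position, and picks the offset with the lowest squared error. All function and variable names, and the brute-force offset search itself, are hypothetical; only NumPy is assumed.

# Minimal sketch (not the authors' implementation): align a visual-odometry
# path with a rough map path by comparing changes in orientation.
import numpy as np

def heading_changes(xy):
    """Per-step change in orientation (radians) along a 2-D path of shape (N, 2)."""
    deltas = np.diff(xy, axis=0)                      # step vectors between consecutive points
    headings = np.arctan2(deltas[:, 1], deltas[:, 0]) # heading of each step
    dtheta = np.diff(headings)
    # wrap to (-pi, pi] so turns are compared consistently
    return (dtheta + np.pi) % (2 * np.pi) - np.pi

def best_offset(vo_path, map_path):
    """Slide the (shorter) odometry turn sequence along the map's turn sequence
    and return the offset with the smallest sum of squared differences."""
    a = heading_changes(vo_path)
    b = heading_changes(map_path)
    best, best_cost = 0, np.inf
    for off in range(len(b) - len(a) + 1):
        cost = np.sum((a - b[off:off + len(a)]) ** 2)
        if cost < best_cost:
            best, best_cost = off, cost
    return best, best_cost

if __name__ == "__main__":
    # Synthetic example: the map contains the odometry path as a noisy
    # sub-segment starting at index 5, so the recovered offset should be ~5.
    t = np.linspace(0, 4 * np.pi, 200)
    map_path = np.stack([t, np.sin(t)], axis=1)
    vo_path = map_path[5:80] + np.random.normal(scale=0.01, size=(75, 2))
    off, cost = best_offset(vo_path, map_path)
    print(f"estimated offset: {off}, alignment cost: {cost:.4f}")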
01.01.2004
1179635 bytes
Conference paper
Electronic resource
English
Visual Odometry and Map Correlation
British Library Conference Proceedings | 2004
IEEE | 2004
TIBKAT | 2018
Correlation-based visual odometry for ground vehicles | British Library Online Contents | 2011