Visual Odometry and Map Correlation
Published by IEEE Computer Society
In this paper, we study how estimates of ego-motion based on feature tracking (visual odometry) can be improved using a rough (low-accuracy) map of where the observer has been. We call the process of aligning the visual ego-motion with the map locations map correlation. Since absolute estimates of camera position are unreliable, we use stable local information, such as change in orientation, to perform the alignment. We also detect when the observer’s path has crossed back on itself, which improves both the visual odometry estimates and the alignment between the video and map sequences. The final alignment is computed using a graphical model whose MAP estimate is inferred using loopy belief propagation. Results are presented on a number of indoor and outdoor sequences.
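To make the alignment idea concrete, the following is a minimal, hypothetical sketch of MAP inference for matching per-frame orientation changes from video against per-segment orientation changes from a map. It uses max-product message passing on a simple chain model (where the passes are exact and Viterbi-like); the paper's actual model contains loops from path crossings, which is why loopy belief propagation is needed there. All function names, potentials, and parameters (`sigma`, `jump_cost`) are illustrative assumptions, not the paper's.

```python
import numpy as np

def map_align(video_dtheta, map_dtheta, sigma=0.2, jump_cost=1.0):
    """Toy MAP alignment of video orientation changes to map segments.

    Hypothetical sketch: max-product message passing on a chain, where
    each video frame t is assigned a map segment index. The real model
    in the paper is richer and loopy; this only illustrates the idea.
    """
    T, M = len(video_dtheta), len(map_dtheta)
    # Unary log-potentials: Gaussian agreement between the orientation
    # change measured at frame t and that predicted by map segment j.
    unary = -((video_dtheta[:, None] - map_dtheta[None, :]) ** 2) / (2 * sigma**2)
    # Pairwise log-potentials: the observer either stays on the current
    # map segment (cost 0) or advances to the next one (cost jump_cost).
    pairwise = np.full((M, M), -np.inf)
    for j in range(M):
        pairwise[j, j] = 0.0
        if j + 1 < M:
            pairwise[j, j + 1] = -jump_cost
    # Forward max-product pass with back-pointers.
    msg = unary[0].copy()
    back = np.zeros((T, M), dtype=int)
    for t in range(1, T):
        scores = msg[:, None] + pairwise      # scores[prev, cur]
        back[t] = np.argmax(scores, axis=0)
        msg = unary[t] + np.max(scores, axis=0)
    # Backtrack the MAP assignment of map segments to frames.
    path = [int(np.argmax(msg))]
    for t in range(T - 1, 0, -1):
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Usage: four noisy per-frame orientation changes aligned to four
# map segments (straight, left turn, straight, right turn).
path = map_align(np.array([0.01, 0.48, 0.02, -0.49]),
                 np.array([0.0, 0.5, 0.0, -0.5]))
# path -> [0, 1, 2, 3]
```

Using orientation changes rather than absolute positions for the unary terms mirrors the abstract's point that stable local information is more reliable than absolute camera-position estimates.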
Copyright © 2007 IEEE. Reprinted from IEEE Computer Society. This material is posted here with permission of the IEEE. Internal or personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution must be obtained from the IEEE by writing to email@example.com. By choosing to view this document, you agree to all provisions of the copyright laws protecting it.