Scale-dissected Pose Estimation in Visual Odometry

Full Metadata

Overview

Title
Scale-dissected Pose Estimation in Visual Odometry
Contributors
Yuan, Rong (creator)
Kimia, Benjamin (advisor)
Brown University. Engineering: Electrical Sciences and Computer Engineering (sponsor)
DOI
10.7301/Z0ZW1JD3
Copyright Date
2017
Abstract
Traditional visual odometry approaches often rely on estimating the world in the form of a 3D point cloud from keyframes, which is then projected onto other frames to determine their absolute pose. The resulting trajectory is obtained by integrating these incremental estimates. In this process, both in the initial world reconstruction and in the subsequent PnP projection, a rotation matrix and a translation vector are the unknowns solved via a numerical process. We observe that involving all of these variables in the numerical process is unnecessary, costing both computational time and accuracy. Rather, the relative pose of a pair of frames can be independently estimated from a set of common features, up to scale, with high accuracy. This scale is a free parameter for each pair of frames, and its estimation is the only obstacle to integrating these local estimates. This paper presents an approach for relating the free scale parameters of neighboring pairs of frames, thereby integrating the entire estimation process and leaving only a single global scale variable. The resulting odometry is more accurate, and computational efficiency is significantly improved, owing to the analytic solution of both the relative pose and the relative scale.
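The scale-chaining idea described in the abstract can be illustrated with a minimal sketch: each pairwise relative pose is known only up to a per-pair scale, but if the ratio of consecutive scales can be estimated, chaining the pairs leaves a single global scale free. This is a hypothetical illustration under assumed conventions (the function name, the pose-composition order, and the inputs are not from the thesis):

```python
import numpy as np

def integrate_trajectory(rel_rotations, unit_translations, scale_ratios,
                         global_scale=1.0):
    """Chain pairwise relative poses into a trajectory.

    rel_rotations[i]   : 3x3 rotation from frame i to frame i+1 (assumed input).
    unit_translations[i]: unit 3-vector translation direction (known up to scale).
    scale_ratios[i]    : estimated ratio s_{i+1}/s_i linking consecutive pairs,
                         so only one global scale remains free.
    Returns a list of (R, t) absolute poses, starting at the identity.
    """
    R = np.eye(3)          # accumulated rotation of the current frame
    t = np.zeros(3)        # accumulated position of the current frame
    s = global_scale       # current pair's scale; only its initial value is free
    poses = [(R.copy(), t.copy())]
    # The first pair uses the global scale directly (ratio 1.0).
    for Ri, ti, ratio in zip(rel_rotations, unit_translations,
                             [1.0] + list(scale_ratios)):
        s *= ratio                   # propagate scale via the estimated ratio
        t = t + R @ (s * ti)         # apply scaled translation in world frame
        R = R @ Ri                   # compose rotations
        poses.append((R.copy(), t.copy()))
    return poses
```

With identity rotations, translations along x, and a scale ratio of 2 between the two pairs, the final position is 1 + 2 = 3 units along x; changing only `global_scale` rescales the whole trajectory, which is the single remaining degree of freedom the abstract refers to.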
Keywords
Computer Vision
Visual Odometry
Notes
Thesis (Sc. M.)--Brown University, 2017
Extent
viii, 30 p.

Citation

Yuan, Rong, "Scale-dissected Pose Estimation in Visual Odometry" (2017). Electrical Sciences and Computer Engineering Theses and Dissertations. Brown Digital Repository. Brown University Library. https://doi.org/10.7301/Z0ZW1JD3

Relations

Collection: