Please use this identifier to cite or link to this item:
Title: A framework for fast and robust visual odometry
Authors: Wu, Meiqing
Keywords: Engineering::Computer science and engineering
Issue Date: 2017
Source: Wu, M., Lam, S. & Srikanthan, T. (2017). A framework for fast and robust visual odometry. IEEE Transactions on Intelligent Transportation Systems, 18(12), 3433-3448. https://dx.doi.org/10.1109/TITS.2017.2685433
Journal: IEEE Transactions on Intelligent Transportation Systems
Abstract: Knowledge of the ego-vehicle's motion state is essential for assessing collision risk in advanced driver assistance systems or autonomous driving. Vision-based methods for estimating the ego-motion of the vehicle, i.e., visual odometry, face a number of challenges in uncontrolled, realistic urban environments, and existing solutions fail to achieve a good trade-off between high accuracy and low computational complexity. In this paper, a framework for ego-motion estimation is proposed that integrates runtime-efficient strategies with robust techniques at the core stages of visual odometry. First, a pruning method reduces the computational complexity of Kanade-Lucas-Tomasi (KLT) feature detection without compromising feature quality. Next, three strategies, i.e., a smooth motion constraint, an adaptive integration window technique, and an automatic tracking failure detection scheme, are introduced into the conventional KLT tracker to generate feature correspondences in a robust and runtime-efficient way. Finally, an early termination condition for the random sample consensus (RANSAC) algorithm is integrated with a Gauss-Newton optimization scheme to enable rapid convergence of the motion estimation process while retaining robustness. Experimental results on the KITTI odometry data set show that the proposed technique outperforms state-of-the-art visual odometry methods, producing more accurate ego-motion estimates in a notably shorter amount of time.
URI: https://hdl.handle.net/10356/147484
ISSN: 1524-9050
DOI: 10.1109/TITS.2017.2685433
Rights: © 2017 Institute of Electrical and Electronics Engineers (IEEE). All rights reserved.
Fulltext Permission: none
Fulltext Availability: No Fulltext
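To make the abstract's RANSAC early-termination idea concrete, the following is a minimal sketch of the standard adaptive stopping rule, in which the required number of iterations N = log(1 - p) / log(1 - w^s) is re-estimated from the best inlier ratio w found so far. Simple 2D line fitting stands in for the paper's motion model, a least-squares refit on the inliers stands in for the Gauss-Newton refinement, and all names here (ransac_line, thresh, p_success) are illustrative, not from the paper.

```python
import numpy as np

def ransac_line(points, thresh=0.1, p_success=0.99, max_iters=1000, seed=0):
    """Fit y = a*x + b to 2D points with RANSAC and early termination.

    After each improved hypothesis, the required iteration count is
    re-estimated as N = log(1 - p_success) / log(1 - w**2), where w is
    the current best inlier ratio and 2 is the minimal sample size.
    """
    rng = np.random.default_rng(seed)
    n = len(points)
    best_inliers = np.zeros(n, dtype=bool)
    required = max_iters
    i = 0
    while i < required and i < max_iters:
        i += 1
        # Minimal sample: two points define a candidate line.
        idx = rng.choice(n, size=2, replace=False)
        (x1, y1), (x2, y2) = points[idx]
        if abs(x2 - x1) < 1e-12:  # degenerate (vertical) sample
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        residuals = np.abs(points[:, 1] - (a * points[:, 0] + b))
        inliers = residuals < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
            w = inliers.sum() / n
            # Early-termination bound: shrink the iteration budget.
            denom = np.log(max(1.0 - w**2, 1e-12))
            required = min(max_iters,
                           int(np.ceil(np.log(1.0 - p_success) / denom)))
    # Least-squares refit on the inlier set (stand-in for the
    # Gauss-Newton refinement step described in the abstract).
    xs, ys = points[best_inliers, 0], points[best_inliers, 1]
    A = np.stack([xs, np.ones_like(xs)], axis=1)
    a, b = np.linalg.lstsq(A, ys, rcond=None)[0]
    return a, b, best_inliers
```

With an 80% inlier ratio the bound drops the budget to a handful of iterations, which is the source of the runtime saving the abstract alludes to.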
Appears in Collections: SCSE Journal Articles
Updated on May 11, 2021
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.