Title: Mobile robot ego motion estimation using RANSAC-based ceiling vision
Authors: Wang, Han
Mou, Wei
Ly, Minh Hiep
Lau, Michael Wai Shing
Seet, Gerald Gim Lee
Wang, Danwei
Issue Date: 2012
Source: Wang, H., Mou, W., Ly, M. H., Lau, M. W. S., Seet, G. L. G., & Wang, D. (2012). Mobile robot ego motion estimation using RANSAC-based ceiling vision. 2012 24th Chinese Control and Decision Conference (CCDC), 1939-1943.
Abstract: Visual odometry is a commonly used technique for recovering the motion and location of a robot. In this paper, we present a robust visual odometry estimation approach based on ceiling views from a 3D camera (Kinect). We extract Speeded-Up Robust Features (SURF) from the monocular image frames retrieved from the camera. SURF features from two consecutive frames are matched by finding the nearest neighbor using a KD-tree. The 3D information of the SURF features is retrieved using the camera's depth map. A 3D affine transformation model between the two frames is estimated using the Random Sample Consensus (RANSAC) method. All inliers are then used to re-estimate the relative transformation between the two frames by Singular Value Decomposition (SVD). Given this, the global robot position and orientation can be calculated. Experimental results demonstrate the performance of the proposed algorithm in real environments.
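The RANSAC and SVD steps described in the abstract can be sketched roughly as follows: given matched 3D points from two consecutive frames, RANSAC repeatedly fits a rigid transform to a minimal 3-point sample, and the final transform is re-estimated from all inliers via SVD (the Kabsch method). This is a minimal NumPy illustration under stated assumptions, not the authors' implementation; function names, the inlier threshold, and the iteration count are all assumptions.

```python
import numpy as np

def rigid_transform_svd(src, dst):
    """Least-squares rotation R and translation t mapping src -> dst
    (both Nx3), via SVD of the cross-covariance (Kabsch method)."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = c_dst - R @ c_src
    return R, t

def ransac_rigid(src, dst, iters=200, thresh=0.05, seed=0):
    """RANSAC over minimal 3-point samples, then re-estimate the
    transform from all inliers, as in the final SVD step above.
    `iters`, `thresh` (meters), and `seed` are illustrative choices."""
    rng = np.random.default_rng(seed)
    best_inliers = np.zeros(len(src), dtype=bool)
    for _ in range(iters):
        idx = rng.choice(len(src), size=3, replace=False)
        R, t = rigid_transform_svd(src[idx], dst[idx])
        err = np.linalg.norm((src @ R.T + t) - dst, axis=1)
        inliers = err < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    R, t = rigid_transform_svd(src[best_inliers], dst[best_inliers])
    return R, t, best_inliers
```

Accumulating the per-frame (R, t) estimates then yields the global robot position and orientation mentioned in the abstract.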
DOI: 10.1109/CCDC.2012.6244312
Fulltext Permission: none
Fulltext Availability: No Fulltext
Appears in Collections:EEE Conference Papers


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.