Title: Vision based obstacle detection and mapping for unmanned surface vehicles
Authors: Mou, Xiaozheng
Keywords: DRNTU::Engineering::Electrical and electronic engineering
Issue Date: 2018
Source: Mou, X. (2018). Vision based obstacle detection and mapping for unmanned surface vehicles. Doctoral thesis, Nanyang Technological University, Singapore.
Abstract: An unmanned surface vehicle (USV) is a speed boat equipped with a suite of sensors to perceive the maritime environment and with navigational intelligence that enables it to navigate autonomously. In this thesis, we focus on exploring and enhancing the vision-based perception ability of the USV. Detecting and mapping obstacles in maritime environments with high accuracy and robustness are two important issues for USVs in real-life applications such as surveillance and navigation. Vision-based (camera) solutions are more cost-effective than their counterpart, radar; moreover, vision can compensate for radar's blind zone at short distances. This thesis addresses the tasks of improving the accuracy and robustness of vision-based obstacle detection and mapping in the open sea. For obstacle detection, we first present a novel image-based algorithm in which obstacle patches are separated from sea patches using a proposed patch-distinctiveness measure, termed global sparsity potentials. Secondly, we develop a real-time long-range obstacle detection and tracking system for the USV based on binocular stereo vision, which improves on image-based methods by reducing false positives caused by noise (white wake, waves, sun reflections, etc.) and by additionally providing the distances of the detected obstacles. In this system, we propose implementing the obstacle detection algorithm in an image-pyramid manner to speed up processing, and we also propose a new solution for multiple-obstacle tracking with scale adaptation and occlusion handling. We then propose an approach, within the same binocular vision system, that fuses 2D and 3D cues to further enhance obstacle detection. In this approach, the 2D and 3D information are combined in a weighting model that assigns more weight to the 2D detection results when obstacles are distant and more weight to the 3D detection results when obstacles are nearby.
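The abstract does not specify the form of the distance-dependent weighting model, so the following is only a minimal sketch of the idea: blend a 2D (appearance) detection score with a 3D (stereo) detection score using a weight that grows with obstacle distance. The linear ramp and the thresholds `d_near` and `d_far` are hypothetical choices for illustration, not the thesis's actual parameters.

```python
def fuse_scores(score_2d: float, score_3d: float, distance_m: float,
                d_near: float = 50.0, d_far: float = 300.0) -> float:
    """Blend 2D and 3D detection scores by obstacle distance.

    Distant obstacles yield little stereo disparity, so the 2D cue is
    trusted more; nearby obstacles give reliable depth, so the 3D cue
    dominates. A linear ramp between d_near and d_far is assumed here.
    """
    # Weight for the 2D cue: 0 at/below d_near, 1 at/beyond d_far.
    w2d = min(max((distance_m - d_near) / (d_far - d_near), 0.0), 1.0)
    return w2d * score_2d + (1.0 - w2d) * score_3d
```

For example, an obstacle at 400 m is scored almost entirely by the 2D detector, while one at 10 m is scored almost entirely by the 3D detector.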
After detecting the obstacles, it is necessary to build an obstacle map for the navigation of the USV. To this end, we develop an obstacle-mapping system based on motion stereo vision, which extends the ranging ability from the 500 meters of the previous binocular stereo system to 1,000 meters, owing to the larger baseline obtained as the USV travels. Moreover, the proposed motion stereo system eliminates the complicated calibration work and the bulky rig required by binocular stereo. By integrating a monocular camera with GPS and compass information, the system reconstructs the world locations of the detected static obstacles while the USV is travelling, and finally builds an obstacle map. To achieve more accurate and robust performance, multiple pairs of frames are combined into the final reconstruction through a weighting model. To the best of our knowledge, we are the first to address monocular-vision-based obstacle mapping in the maritime environment. Since few public datasets are available for this research, we evaluate the proposed algorithms on our own datasets. Experimental results verify that our methods are highly accurate and robust compared to other methods. The thesis concludes with a discussion of the presented research and suggestions for further study in this field.
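To see why a motion-stereo baseline extends ranging, recall the classic stereo range relation depth = f·B/d: at a fixed minimum resolvable disparity, doubling the baseline B doubles the maximum range. The sketch below illustrates this with a baseline estimated from two GPS fixes along the USV's track; the function names, the equirectangular distance approximation, and the example numbers are assumptions for illustration, not the thesis's implementation.

```python
import math

def baseline_from_gps(lat1: float, lon1: float,
                      lat2: float, lon2: float) -> float:
    """Approximate distance in meters between two GPS fixes
    (equirectangular projection; adequate for short baselines)."""
    R = 6371000.0  # mean Earth radius, meters
    x = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2.0))
    y = math.radians(lat2 - lat1)
    return R * math.hypot(x, y)

def stereo_range(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic stereo ranging: depth = focal_length * baseline / disparity."""
    return focal_px * baseline_m / disparity_px
```

With a hypothetical 1000 px focal length and a 10 px disparity, a 5 m rigid rig ranges to 500 m, while a 10 m baseline accumulated by vehicle motion ranges to 1,000 m.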
DOI: 10.32657/10356/73653
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: thesis(hardbound)_Mou Xiaozheng.pdf
Size: 24.6 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.