Title: Instance segmentation for roadside objects using a simulation environment
Authors: Wang, Sijie
Keywords: Engineering::Computer science and engineering::Computing methodologies::Image processing and computer vision
Engineering::Electrical and electronic engineering
Issue Date: 2021
Publisher: Nanyang Technological University
Source: Wang, S. (2021). Instance segmentation for roadside objects using a simulation environment. Master's thesis, Nanyang Technological University, Singapore.
Project: D-255-20211-02914
Abstract: Understanding the traffic scene is a central task in autonomous driving systems. In particular, detecting and recognizing roadside objects helps guide vehicles, prevents them from deviating from the road, and assists in positioning and localization. In recent years, deep learning and computer vision have become powerful tools for instance segmentation of roadside objects. Moreover, continued advances in computing technology and hardware have made it possible to train and test instance segmentation algorithms in autonomous driving simulation environments. Compared with collecting data in a real environment, a simulation environment can generate data directly through computation, saving substantial manpower, time, and financial resources. This dissertation proposes a method for generating instance segmentation labels from point clouds and semantic labels, and evaluates the instance segmentation algorithm Mask R-CNN on a dataset generated from the CARLA simulator. The final results show that Mask R-CNN on CARLA achieves the best performance among the compared baselines.
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: Restricted Access (3.06 MB, Adobe PDF)

Page view(s)

Updated on Jun 23, 2022


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.