Full metadata record
DC Field | Value | Language
dc.contributor.author | Wang, Yihan | en_US
dc.identifier.citation | Wang, Y. (2021). 3D object detection for autonomous vehicle. Master's thesis, Nanyang Technological University, Singapore. |
dc.description.abstract | 3D object detection plays an important role in autonomous driving, yet most state-of-the-art research is developed for 64-line LiDARs. However, high-resolution LiDARs cost several orders of magnitude more than the low-resolution LiDARs mounted on the majority of low-cost robotic platforms, which makes current research hard to apply widely. To narrow the gap between current research and real-world applications, and to implement the target detection function on our autonomous sweeper, which is equipped with a 16-line LiDAR, this work first tests a traditional machine learning algorithm based on RANSAC. Then one image-based detector together with six point-cloud-based detectors are evaluated. After that, two data-density-based methods are applied. Finally, one image-based method and two multi-modal fusion-based methods are proposed. All the methods mentioned above are tested on both an open-source dataset and a self-collected NTU dataset. | en_US
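The abstract mentions a traditional machine learning baseline based on RANSAC; in LiDAR pipelines this is commonly used to fit and remove the ground plane before clustering obstacles. The thesis itself is not quoted here, so the following is only a minimal illustrative sketch of RANSAC plane fitting on an N x 3 point cloud (function name and parameters are hypothetical, not taken from the thesis):

```python
import numpy as np

def ransac_plane(points, n_iters=200, dist_thresh=0.2, rng=None):
    """Fit a dominant plane to an N x 3 point cloud with RANSAC.

    Returns (plane, inlier_mask), where plane = (a, b, c, d) describes
    a*x + b*y + c*z + d = 0 with unit normal (a, b, c).
    """
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    best_plane = None
    for _ in range(n_iters):
        # Sample 3 distinct points and form a candidate plane from them.
        sample = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(normal)
        if norm < 1e-9:          # degenerate (near-collinear) sample
            continue
        normal /= norm
        d = -normal @ sample[0]
        # Points within dist_thresh of the candidate plane are inliers.
        dist = np.abs(points @ normal + d)
        inliers = dist < dist_thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers, best_plane = inliers, (*normal, d)
    return best_plane, best_inliers
```

With a roughly horizontal ground, the recovered normal is close to the z-axis, and the complement of the inlier mask leaves the above-ground points for downstream object detection.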
dc.publisher | Nanyang Technological University | en_US
dc.subject | Engineering::Electrical and electronic engineering::Computer hardware, software and systems | en_US
dc.title | 3D object detection for autonomous vehicle | en_US
dc.type | Thesis-Master by Coursework | en_US
dc.contributor.supervisor | Wang Dan Wei | en_US
dc.contributor.school | School of Electrical and Electronic Engineering | en_US
dc.description.degree | Master of Science (Computer Control and Automation) | en_US
item.fulltext | With Fulltext | -
Appears in Collections:EEE Theses
Files in This Item:
File | Description | Size | Format
Restricted Access | 2.64 MB | Adobe PDF

Page view(s)

Updated on Jul 2, 2022





Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.