Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/155012
Title: Robust navigation for mobile robot during day and night
Authors: Qing, Yuzhou
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2021
Publisher: Nanyang Technological University
Source: Qing, Y. (2021). Robust navigation for mobile robot during day and night. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/155012
Abstract: This dissertation aims to provide a solution for robust navigation for a mobile robot during day and night. The project consists of two main components: point cloud semantic segmentation and imitation learning for velocity prediction. Features of point clouds and related work on point cloud semantic segmentation are introduced in chapter 2. Chapter 3 focuses on the specific technologies, including the deep learning techniques and the neural network structures used in this project. For the point cloud semantic segmentation part, we compare three similar networks; for the velocity prediction part, we test MLP, LSTM, and multi-head LSTM networks. An image-based convolutional neural network is also used as a contrast experiment. DeepLab V3, which is used to automatically label the point clouds in this project, is also introduced in this chapter. The results and analysis of all experiments are given in detail in chapter 4. Finally, chapter 5 concludes all the work done, presents current problems, and proposes possible solutions for future work.
URI: https://hdl.handle.net/10356/155012
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: EEE Theses
Files in This Item:
File | Description | Size | Format
---|---|---|---
NTU_Master_Dissertation_.pdf (Restricted Access) | | 6.29 MB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.