Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/156769
Title: Unsupervised domain adaptation for depth completion from sparse LiDAR scans depth map
Authors: Geng, Yue
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Geng, Y. (2022). Unsupervised domain adaptation for depth completion from sparse LiDAR scans depth map. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/156769
Abstract: Depth completion aims to predict, from a sparse LiDAR depth input, the distance between the objects in an image and the camera that captured it, expressed as a dense depth map. Denser depth input leads to better predictions, but the corresponding LiDAR equipment is more expensive, and models trained on dense depth input perform poorly on sparse depth input. Moreover, dense ground-truth annotations for training depth completion models are difficult to obtain. In this dissertation, an unsupervised domain adaptation method is proposed to improve model performance on unannotated sparse depth input. The approach aligns the second-order statistics of the features generated by a convolutional neural network shared by the dense and sparse depth inputs. Experiments on the KITTI depth completion benchmark show that the method improves depth completion performance on sparse depth input.
URI: https://hdl.handle.net/10356/156769
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
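The abstract's key idea, aligning the second-order statistics of features from a CNN shared across the dense (source) and sparse (target) domains, matches the CORAL family of domain adaptation losses. Since the full text is restricted, the sketch below is a minimal, generic CORAL-style loss in PyTorch (in the spirit of Deep CORAL), not necessarily the dissertation's exact formulation; the function name `coral_loss` and the (batch, feature_dim) input shapes are illustrative assumptions.

```python
import torch

def coral_loss(source_feats: torch.Tensor, target_feats: torch.Tensor) -> torch.Tensor:
    """CORAL-style second-order statistics alignment (illustrative sketch).

    Both inputs are (batch, feature_dim) activations from a shared CNN,
    e.g. flattened encoder features for dense and sparse depth inputs.
    Assumes batch size > 1 in each domain.
    """
    d = source_feats.size(1)

    def covariance(x: torch.Tensor) -> torch.Tensor:
        # Center the features, then compute the unbiased covariance matrix.
        n = x.size(0)
        x = x - x.mean(dim=0, keepdim=True)
        return (x.t() @ x) / (n - 1)

    cs = covariance(source_feats)
    ct = covariance(target_feats)
    # Squared Frobenius distance between the two covariance matrices,
    # scaled by 4*d^2 as in the Deep CORAL formulation.
    return ((cs - ct) ** 2).sum() / (4 * d * d)
```

In a training loop, a term like this would typically be added, weighted by a hyperparameter, to the supervised completion loss computed on the annotated dense-input domain, while the unannotated sparse-input domain contributes features only to the alignment term.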
Appears in Collections: EEE Theses
Files in This Item:
File | Description | Size | Format
---|---|---|---
Dissertation_GENGYUE.pdf (Restricted Access) | | 1.29 MB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.