Please use this identifier to cite or link to this item:
Title: Towards robust monocular depth estimation in the wild
Authors: Zheng, Zhenkai
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Zheng, Z. (2022). Towards robust monocular depth estimation in the wild. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: SCSE21-0208
Abstract: This research project aims to create a robust monocular depth estimation model capable of predicting accurate relative depth maps in the wild. The project highlights the significance of the training dataset used in supervised training by comparing models trained on our new mixed in-the-wild dataset against models trained on a common open-source dataset such as NYU. Experiments are conducted on several models trained over the span of the project, with both quantitative and qualitative evaluations. Various network architectures, loss functions, and modules are explored and discussed. As a result, we obtain an optimal model that performs well in terms of both absolute and relative errors. The model trained in this project is a deep Convolutional Neural Network (CNN) with an encoder-decoder architecture that can, in principle, accept input of arbitrary size; accordingly, the project includes a model evaluation on both low- and high-resolution images. Current monocular depth estimation solutions produce models that perform well on their respective test data but are usually less effective in real-world environments. This project seeks to overcome such constraints and produce a model with comparatively better depth estimation in the wild.
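The abstract's pairing of relative depth maps with evaluation in both absolute and relative errors can be illustrated with a minimal sketch (the function names are hypothetical, not taken from the report): a relative prediction is conventionally aligned to ground truth with a least-squares scale and shift before computing an error such as the absolute relative error.

```python
import numpy as np

def align_scale_shift(pred, gt):
    """Solve least-squares s, t minimizing ||s*pred + t - gt||^2,
    then return the aligned prediction s*pred + t."""
    A = np.stack([pred.ravel(), np.ones(pred.size)], axis=1)
    (s, t), *_ = np.linalg.lstsq(A, gt.ravel(), rcond=None)
    return s * pred + t

def abs_rel_error(pred, gt):
    """Mean absolute relative error, assuming strictly positive ground truth."""
    return np.mean(np.abs(pred - gt) / gt)

# A relative prediction that differs from ground truth only by scale
# and shift scores zero absolute relative error after alignment.
pred = np.array([[1.0, 2.0], [3.0, 4.0]])
gt = 2.0 * pred + 3.0
aligned = align_scale_shift(pred, gt)
```

This alignment step is what lets a model trained on mixed datasets with inconsistent depth scales still be scored fairly against metric ground truth.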
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP report_Zheng Zhenkai.pdf (Restricted Access; 3.02 MB; Adobe PDF)

Page view(s): updated on Jun 26, 2022



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.