Please use this identifier to cite or link to this item:
Title: Object aware learning for object detection in bad weather conditions (part 1)
Authors: Mittal Ishan 
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Mittal Ishan (2022). Object aware learning for object detection in bad weather conditions (part 1). Final Year Project (FYP), Nanyang Technological University, Singapore.
Abstract: Detecting objects in a variety of image settings has become extremely important for achieving autonomy in the smart systems being deployed across every sector. While state-of-the-art models show immense progress in detecting objects of various shapes, sizes, and orientations, they may not account for image settings that degrade image quality. This project aims to utilize domain knowledge (in this case, knowledge of the objects present on the road) to enhance current object detection algorithms when trained on images of inadequate quality, especially those captured under adverse weather conditions. This report first establishes a thorough understanding of current object detection algorithms, including their architecture, implementation, and results on standard datasets (COCO and ImageNet). It then discusses our methodology of introducing auxiliary features into standard datasets so that domain knowledge can be exploited to improve model accuracy. Finally, the report evaluates the models trained using the suggested methodology to assess its effectiveness and analyse its shortcomings.
Schools: School of Electrical and Electronic Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP Final Report DR NTU - Mittal Ishan.pdf
Description: Restricted Access
Size: 5.43 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.