Title: GAN for object detection under rainy conditions
Authors: Kalieperumal, Vikneswaran
Keywords: DRNTU::Engineering::Electrical and electronic engineering
Issue Date: 2019
Abstract: Artificial Intelligence (AI) is gaining prominence in daily life, and many routine activities are being automated. One such activity is driving. Object detection is a crucial component of autonomous driving and an important area of AI, but the performance of object detectors deteriorates in rain because they are not designed for adverse weather conditions. The objective of this project is to develop an object detection system that detects objects with higher accuracy during rainy conditions. A deep learning model will be trained and tested on data collected under clear weather, together with pre-collected rain data, and used to generate a de-rained (rain-free) image from a rainy input. A classifier will be trained on a dataset so that it can recognise various objects. After training, pairs of rainy images and their corresponding de-rained images will be fed to the classifier; the resulting accuracies will be tabulated and conclusions drawn from the results. Lastly, ways in which the system can be improved to achieve better results and operate more robustly will be discussed.
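The abstract describes a two-stage pipeline: a trained de-raining model restores a rain-free image, which is then passed to a classifier. A minimal sketch of that flow is below; the `derain` and `classify` functions are hypothetical stand-ins (a box blur and an intensity threshold), not the project's actual GAN generator or detector.

```python
import numpy as np

def derain(image):
    """Placeholder for the trained de-raining model (a GAN generator
    in the project). Here a 3x3 box blur stands in for the network."""
    padded = np.pad(image, 1, mode="edge")
    out = np.zeros_like(image, dtype=float)
    h, w = image.shape
    for i in range(h):
        for j in range(w):
            out[i, j] = padded[i:i + 3, j:j + 3].mean()
    return out

def classify(image):
    """Placeholder classifier: labels an image from mean intensity."""
    return "object" if image.mean() > 0.5 else "background"

def detect_under_rain(rainy_image):
    """The pipeline from the abstract: de-rain first, then classify."""
    return classify(derain(rainy_image))
```

In the project itself, both stages would be deep networks trained on the clear-weather and rain datasets; the point of the sketch is only the composition order, de-rain before detection.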
Schools: School of Electrical and Electronic Engineering 
Organisations: A*STAR Institute for Infocomm Research
Rights: Nanyang Technological University
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: GAN for Object Detection under Rainy Conditions.pdf (Restricted Access)
Description: FYP Report
Size: 1.88 MB
Format: Adobe PDF


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.