Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/162812
Title: VAE hyperparameter optimization in optical flow based OOD detection
Authors: Goh, Ting Qi
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Goh, T. Q. (2022). VAE hyperparameter optimization in optical flow based OOD detection. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/162812
Abstract: Autonomous Vehicles (AVs) have advanced greatly in terms of technology over the years. While AVs do not commit human errors, they can still misidentify images due to algorithmic errors or, worse, malicious attacks. This is because AVs employ multiple machine learning models that are not guarded against adversarial attacks. Hence, over the years, out-of-distribution (OOD) detection algorithms have been developed to combat these adversarial attacks. One of these is the Beta-Variational Optical Flow algorithm, which uses trained models to detect the motion of objects in the horizontal and vertical planes. However, deriving an optimal model requires training multiple candidate models. Hence, in this paper, we explore the hyperparameters of the Optical Flow algorithm to find patterns so that future use of the algorithm requires less training time. In addition, we explore edge cases in hyperparameter tuning to test assumptions made about the Optical Flow algorithm's performance. Lastly, Bayesian Optimization is used to corroborate our results and provide new insights into hyperparameter tuning.
URI: https://hdl.handle.net/10356/162812
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
FYP_Report_1.pdf (Restricted Access, 1.03 MB, Adobe PDF)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.