Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/148086
Title: FGSM attacks on traffic light recognition of the Apollo autonomous driving system
Authors: Samuel, Milla
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2021
Publisher: Nanyang Technological University
Source: Samuel, M. (2021). FGSM attacks on traffic light recognition of the Apollo autonomous driving system. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/148086
Project: SCSE20-0069
Abstract: Autonomous vehicles rely on Autonomous Driving Systems (ADS) to control the car without human intervention. The ADS uses multiple sensors, including cameras, to perceive the environment around the vehicle. These perception systems rely on machine learning models that are susceptible to adversarial attacks, in which a model's input is intercepted and perturbations are added, causing the model to make wrong predictions with very high confidence. We applied the Fast Gradient Sign Method (FGSM) adversarial attack to the traffic light recognition module of the Baidu Apollo ADS in normal, bright, rainy and foggy conditions to test the robustness of the system against white-box adversarial attacks. While the model performed well against attacks in normal conditions, multiple attacks were able to fool the model into predicting the wrong class with high confidence using almost imperceptible perturbations in bright and rainy conditions. This exposes a vulnerability of the Apollo system: the FGSM attack exploited the linearity of the traffic light recognition model and bypassed all the safeguards that Apollo had in place.
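
The FGSM attack named in the abstract perturbs an input x by a small step of size epsilon in the direction of the sign of the loss gradient: x_adv = x + epsilon * sign(grad_x J(theta, x, y)). Below is a minimal PyTorch sketch of this one-step, untargeted attack; the classifier, the [0, 1] pixel range, and the choice of epsilon are illustrative assumptions, not details taken from the report or from Apollo's code.

```python
import torch
import torch.nn.functional as F

def fgsm_attack(model, image, label, epsilon):
    """One-step untargeted FGSM: x_adv = x + eps * sign(grad_x J(theta, x, y)).

    Assumes `image` is a batch of pixels scaled to [0, 1] and `model` is any
    differentiable classifier (e.g. a traffic light recognition CNN).
    """
    image = image.clone().detach().requires_grad_(True)
    loss = F.cross_entropy(model(image), label)
    loss.backward()
    # Step in the gradient-sign direction to increase the loss, then clamp
    # so the adversarial image remains a valid pixel array.
    adv = image + epsilon * image.grad.sign()
    return adv.clamp(0.0, 1.0).detach()

# Illustrative usage with a hypothetical classifier and a small epsilon,
# chosen so the perturbation stays almost imperceptible:
#   adv_x = fgsm_attack(classifier, x, y, epsilon=0.01)
```

Because the perturbation uses only the sign of the gradient, a single forward and backward pass suffices, which is what makes FGSM a fast baseline for probing white-box robustness.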
URI: https://hdl.handle.net/10356/148086
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: FYP_Report.pdf (Restricted Access)
Size: 1.99 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.