Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/171944
Title: Adversarial example construction against autonomous vehicles
Authors: Loh, Zhi Heng
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2023
Publisher: Nanyang Technological University
Source: Loh, Z. H. (2023). Adversarial example construction against autonomous vehicles. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/171944
Abstract: With autonomous vehicles (AVs) approaching widespread adoption, safety must not be neglected. Although touted to be free from errors commonly made by human drivers, AVs are nevertheless not immune to malicious attacks. In general, AVs rely on a variety of machine-learning models and sensors to understand their environment. However, past research has shown that such models may be susceptible to adversarial attacks. In this paper, Daedalus, an attack algorithm that exploits a vulnerability in Non-Maximum Suppression (NMS), is used to generate adversarial examples via a surrogate model. The perturbations on the images are nearly imperceptible. The generated images are subsequently evaluated against SMOKE (Single-Stage Monocular 3D Object Detection via Keypoint Estimation) [1], the camera-based object detector used in Baidu Apollo's Autonomous Driving System. In addition, potential mitigations against Daedalus are examined.
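For context on the component the attack targets, the following is a minimal NumPy sketch of standard greedy NMS; it is an illustration added here (function and variable names are assumptions, not code from the report). Daedalus-style attacks are reported to perturb the input so that the detector emits many high-confidence boxes with low mutual overlap, so the IoU filter below no longer suppresses them and the output is flooded with redundant detections.

import numpy as np

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy Non-Maximum Suppression over axis-aligned boxes.

    boxes  : (N, 4) array of [x1, y1, x2, y2]
    scores : (N,) confidence scores
    Returns the indices of the boxes that are kept.
    """
    x1, y1, x2, y2 = boxes[:, 0], boxes[:, 1], boxes[:, 2], boxes[:, 3]
    areas = (x2 - x1) * (y2 - y1)
    order = scores.argsort()[::-1]          # highest-scoring box first
    keep = []
    while order.size > 0:
        i = order[0]
        keep.append(i)
        # Overlap of the kept box with all remaining candidates
        xx1 = np.maximum(x1[i], x1[order[1:]])
        yy1 = np.maximum(y1[i], y1[order[1:]])
        xx2 = np.minimum(x2[i], x2[order[1:]])
        yy2 = np.minimum(y2[i], y2[order[1:]])
        inter = np.clip(xx2 - xx1, 0, None) * np.clip(yy2 - yy1, 0, None)
        iou = inter / (areas[i] + areas[order[1:]] - inter)
        # Suppress candidates that overlap the kept box too strongly;
        # boxes crafted to have low mutual IoU all survive this test.
        order = order[1:][iou <= iou_threshold]
    return keep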
URI: https://hdl.handle.net/10356/171944
Schools: School of Computer Science and Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: ADVERSARIAL EXAMPLE CONSTRUCTION AGAINST AUTONOMOUS VEHICLES_3.pdf (Restricted Access)
Description: Undergraduate project report
Size: 1.8 MB
Format: Adobe PDF


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.