Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/184133
Title: Protecting deep learning algorithm from adversarial attacks
Authors: Oh, Ding Ang
Keywords: Computer and Information Science
Issue Date: 2025
Publisher: Nanyang Technological University
Source: Oh, D. A. (2025). Protecting deep learning algorithm from adversarial attacks. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/184133
Abstract: Deep Neural Networks (DNNs) are susceptible to adversarial examples, where small imperceptible changes, known as perturbations, are introduced to legitimate inputs, causing misclassifications. With the existence of side-channel attacks that can sniff out a model's architecture, stronger targeted adversarial attacks can be mounted, increasing their success rate and leaving DNNs particularly vulnerable. Current detection methods do not generalise well and may be model-agnostic, producing unreliable results under different conditions. In this work, we investigate the use of Hardware Performance Counters (HPC) as a lightweight, hardware-level indicator for distinguishing between adversarial and legitimate, benign inputs. Hardware-level metrics such as CPU cycles, cache misses, cache accesses, branches, branch misses, and instructions were collected via the perf tool during inference of a modified ResNet18 model trained on the Imagenette dataset, in the hope of identifying prominent features. Various binary classifiers, such as Random Forest and XGBoost, were trained on the collected features. Although our findings did not provide conclusive evidence of a reliable detection method, they offer insights into the challenges of using HPCs for adversarial defence. We discuss potential limitations, including noise in HPC measurements and in the proposed approach, and suggest directions for future work that uses HPCs for adversarial robustness.
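
Illustrative sketch (not from the report): the abstract describes collecting hardware performance counters with perf during model inference and training a binary classifier on the resulting features. The snippet below is a minimal, hedged outline of such a pipeline in Python; the inference script name, event list, feature file, and classifier settings are assumptions, not the author's actual setup.

    # Hypothetical sketch: gather HPC readings with `perf stat` around one
    # inference run, then train a binary classifier on collected features.
    import subprocess
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import classification_report

    # Standard perf event names corresponding to the metrics listed in the abstract.
    EVENTS = "cpu-cycles,cache-misses,cache-references,branches,branch-misses,instructions"

    def collect_counters(image_path):
        """Run one inference under `perf stat` (CSV mode) and parse counter values."""
        cmd = ["perf", "stat", "-x", ",", "-e", EVENTS,
               "python", "infer_resnet18.py", image_path]   # hypothetical inference script
        result = subprocess.run(cmd, capture_output=True, text=True)
        features = {}
        for line in result.stderr.splitlines():              # perf writes its stats to stderr
            parts = line.split(",")
            if len(parts) >= 3 and parts[0].strip():
                try:
                    features[parts[2]] = float(parts[0])      # event name -> counter value
                except ValueError:
                    pass                                      # skip <not counted> / header rows
        return features

    # Assume a CSV of previously collected runs: one row per inference,
    # counter columns plus a 0/1 label (benign vs. adversarial) -- hypothetical file.
    df = pd.read_csv("hpc_features.csv")
    X, y = df.drop(columns=["label"]), df["label"]
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, stratify=y)

    clf = RandomForestClassifier(n_estimators=200, random_state=0)
    clf.fit(X_train, y_train)
    print(classification_report(y_test, clf.predict(X_test)))

An XGBoost classifier could be substituted for the Random Forest with the same feature matrix; the report evaluates both.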
URI: https://hdl.handle.net/10356/184133
Schools: College of Computing and Data Science 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:CCDS Student Reports (FYP/IA/PA/PI)

Files in This Item:
  File: OhDingAng_FYP_Report.pdf (Restricted Access)
  Size: 1.24 MB
  Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.