Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/166036
Title: Adversarial attacks on deep learning
Authors: Yee, An Qi
Keywords: Engineering::Computer science and engineering; Science::Mathematics
Issue Date: 2023
Publisher: Nanyang Technological University
Source: Yee, A. Q. (2023). Adversarial attacks on deep learning. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/166036
Project: SCSE22-0150
Abstract: Deep learning models, especially convolutional neural networks (CNNs), have made significant progress in image recognition and classification. However, adversarial attacks have emerged as a significant vulnerability that threatens the robustness of these models. One notable example is the one-pixel attack, which induces incorrect predictions by changing a single pixel, with potentially serious consequences. This project investigates the efficiency and effectiveness of different search strategies for conducting one-pixel attacks on black-box networks. Several adversarial attacks are surveyed before the scope is narrowed to the one-pixel attack. The study then compares three search algorithms - Genetic Algorithm (GA), Simulated Annealing (SA) and Differential Evolution (DE) - in terms of computational cost, success rate and convergence speed, with the goal of identifying elements that improve the efficiency and efficacy of the one-pixel attack.
URI: https://hdl.handle.net/10356/166036
Schools: School of Computer Science and Engineering
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
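The abstract describes searching for a single adversarial pixel with Differential Evolution, among other algorithms. As an illustration only (not code from the report, and using a hypothetical toy scorer in place of a real CNN), a minimal DE-style one-pixel search might look like this: each candidate encodes a pixel position and replacement value, and the search greedily minimizes the black-box confidence for the true class.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for a black-box classifier: "confidence" for the
# true class is a weighted sum of pixels squashed through tanh.
W = rng.normal(size=(4, 4))

def true_class_confidence(img):
    return float(np.tanh((W * img).sum()))

def one_pixel_attack_de(img, pop_size=20, generations=30, F=0.5, CR=0.7):
    """DE/rand/1 search over candidates encoded as [row, col, pixel value]."""
    h, w = img.shape
    pop = np.column_stack([
        rng.integers(0, h, pop_size),
        rng.integers(0, w, pop_size),
        rng.random(pop_size),
    ]).astype(float)

    def fitness(cand):
        # Decode candidate: wrap coordinates, clamp the pixel value.
        r, c = int(cand[0]) % h, int(cand[1]) % w
        perturbed = img.copy()
        perturbed[r, c] = np.clip(cand[2], 0.0, 1.0)
        return true_class_confidence(perturbed)  # lower = stronger attack

    scores = np.array([fitness(p) for p in pop])
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = a + F * (b - c)                # DE/rand/1 mutation
            cross = rng.random(3) < CR              # binomial crossover
            trial = np.where(cross, mutant, pop[i])
            s = fitness(trial)
            if s < scores[i]:                       # greedy selection
                pop[i], scores[i] = trial, s
    best = pop[np.argmin(scores)]
    return best, float(scores.min())

img = np.full((4, 4), 0.5)        # toy "image", all mid-gray
best, score = one_pixel_attack_de(img)
print(best, score)
```

Because only one pixel is free to change, the decision variables stay three-dimensional regardless of image size, which is what makes population-based searches such as GA, SA and DE practical in the black-box setting the project studies.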
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File | Description | Size | Format
---|---|---|---
FYP_Report_YeeAnQi.pdf (Restricted Access) | | 2.48 MB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.