Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/156516
Title: Evaluation of adversarial attacks against deep learning models
Authors: Ta, Anh Duc
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Ta, A. D. (2022). Evaluation of adversarial attacks against deep learning models. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/156516
Project: SCSE21-0250
Abstract: The rapid development of deep learning techniques has made them useful in many applications. However, recent studies have shown that deep learning algorithms can be vulnerable to adversarial attacks, which is a serious concern when these algorithms are considered for safety-critical applications. To improve the defenses of deep learning algorithms, the threat posed by adversarial attacks must be studied. In this project, the effectiveness of adversarial attacks on deep learning models was evaluated under several criteria: different attack methods, different model structures, and different deep learning tasks. The experiments showed that an attack's effectiveness depends on the attack type, the source model structure, and the target model structure. The results also indicated that adversarial training is not the best defense against every type of attack. Finally, the report showed that adversarial examples are effective not only on computer vision tasks but also on audio classification.
URI: https://hdl.handle.net/10356/156516
Schools: School of Computer Science and Engineering
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
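Since the abstract summarizes a transfer-attack evaluation, a minimal sketch may help make the setup concrete. The following PyTorch snippet (written for this record, not taken from the report) crafts adversarial examples on a "source" model with the Fast Gradient Sign Method (FGSM) and checks whether they also fool a "target" model with a different structure; the toy models, epsilon value, and random stand-in data are all illustrative assumptions.

```python
# Sketch of a transfer-attack evaluation (illustrative; not the report's code).
import torch
import torch.nn as nn
import torch.nn.functional as F

def fgsm_attack(model, x, y, epsilon):
    """FGSM: perturb x by epsilon along the sign of the input gradient."""
    x = x.clone().detach().requires_grad_(True)
    F.cross_entropy(model(x), y).backward()
    return (x + epsilon * x.grad.sign()).clamp(0, 1).detach()

def make_cnn(width):
    """Toy CNN; two widths stand in for 'different model structures'."""
    return nn.Sequential(
        nn.Conv2d(3, width, 3, padding=1), nn.ReLU(),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        nn.Linear(width, 10),
    )

source_model, target_model = make_cnn(16), make_cnn(32)
source_model.eval()
target_model.eval()

x = torch.rand(8, 3, 32, 32)    # stand-in images in [0, 1]
y = torch.randint(0, 10, (8,))  # stand-in labels

x_adv = fgsm_attack(source_model, x, y, epsilon=0.03)

with torch.no_grad():
    # The attack "transfers" when examples crafted on the source model
    # also change the target model's predictions.
    fooled_src = (source_model(x_adv).argmax(1) != y).float().mean().item()
    fooled_tgt = (target_model(x_adv).argmax(1) != y).float().mean().item()
print(f"fooled source: {fooled_src:.2%}, transferred to target: {fooled_tgt:.2%}")
```

The abstract also finds that adversarial training is not the best defense against every attack type. A similarly minimal sketch of that defense, again under assumed names and hyperparameters, updates the model on FGSM-perturbed batches instead of clean ones:

```python
# Sketch of one adversarial-training step (illustrative; not the report's code).
import torch
import torch.nn.functional as F

def adversarial_training_step(model, optimizer, x, y, epsilon=0.03):
    # Craft adversarial versions of the current batch on the fly.
    x_req = x.clone().detach().requires_grad_(True)
    F.cross_entropy(model(x_req), y).backward()
    x_adv = (x_req + epsilon * x_req.grad.sign()).clamp(0, 1).detach()

    # Update the model on the perturbed batch.
    optimizer.zero_grad()  # clear gradients left over from crafting x_adv
    loss = F.cross_entropy(model(x_adv), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Because the inner perturbation here is single-step FGSM, a model hardened this way can remain vulnerable to stronger iterative attacks, which is consistent with the abstract's conclusion that adversarial training does not cover all attack methods.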
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
| File | Description | Size | Format |
|---|---|---|---|
| SCSE21-0250_U1820371K_FYP-Final-Report.pdf (Restricted Access) | | 729.03 kB | Adobe PDF |
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.