Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/172002
Title: Developing AI attacks/defenses
Authors: Lim, Noel Wee Tat
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2023
Publisher: Nanyang Technological University
Source: Lim, N. W. T. (2023). Developing AI attacks/defenses. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/172002
Project: SCSE22-0834
Abstract: Deep Neural Networks (DNNs) are a fundamental pillar of Artificial Intelligence (AI) and Machine Learning (ML), playing a pivotal role in advancing both fields. They are computational models inspired by the human brain, designed to process information and make decisions in a way that resembles human thinking. This has led to their remarkable success in applications ranging from image and speech recognition to natural language processing and autonomous systems. Alongside these capabilities, however, DNNs have also revealed vulnerabilities. One of them, adversarial attacks, has proven catastrophic against DNNs and has received broad attention in recent years, raising concerns over the robustness and security of these models. This project conducts a comprehensive study of DNNs and adversarial attacks, and implements specific techniques within DNNs aimed at bolstering their robustness.
URI: https://hdl.handle.net/10356/172002
Schools: School of Computer Science and Engineering
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
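Since the full report is under restricted access, the following is only an illustrative sketch of the kind of adversarial attack the abstract refers to: the Fast Gradient Sign Method (FGSM), a canonical attack from the literature. The model, data, and epsilon value below are placeholders, not the project's actual implementation.

```python
# Illustrative FGSM sketch (not from the restricted report).
# FGSM perturbs an input in the direction of the sign of the loss
# gradient, bounded by epsilon in the L-infinity norm, so that a
# visually near-identical input can flip the model's prediction.
import torch
import torch.nn as nn

def fgsm_attack(model, x, y, epsilon):
    """Return an adversarial version of x within an epsilon L-inf ball."""
    x_adv = x.clone().detach().requires_grad_(True)
    loss = nn.functional.cross_entropy(model(x_adv), y)
    loss.backward()
    # Step along the sign of the input gradient to maximize the loss.
    x_adv = x_adv + epsilon * x_adv.grad.sign()
    return x_adv.clamp(0.0, 1.0).detach()  # keep pixels in a valid range

# Hypothetical usage: a tiny classifier on 28x28 grayscale images.
model = nn.Sequential(nn.Flatten(), nn.Linear(28 * 28, 10))
x = torch.rand(1, 1, 28, 28)   # placeholder image batch
y = torch.tensor([3])          # placeholder ground-truth label
x_adv = fgsm_attack(model, x, y, epsilon=0.1)
```

A common defense of the kind the project's stated goal (bolstering robustness) suggests is adversarial training, i.e. generating such perturbed examples during training and fitting the model on them as well.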
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File | Description | Size | Format
---|---|---|---
FYP SCSE22-0834 Final Report.pdf (Restricted Access) | Undergraduate project report | 2.24 MB | Adobe PDF
Page view(s): 118 (updated on Mar 24, 2025)
Download(s): 24 (updated on Mar 24, 2025)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.