Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/137946
Full metadata record
DC Field                          Value                                                 Language
dc.contributor.author             Yip, Lionell En Zhi                                   en_US
dc.date.accessioned               2020-04-20T05:59:43Z                                  -
dc.date.available                 2020-04-20T05:59:43Z                                  -
dc.date.issued                    2020                                                  -
dc.identifier.uri                 https://hdl.handle.net/10356/137946                   -
dc.description.abstract           The prevalent use of neural networks for classification tasks has drawn attention to the security and integrity of the neural networks that industries rely on. Humans can still classify adversarial examples with ease, but neural networks struggle to correctly classify images containing adversarial perturbations. I introduce a framework for understanding how neural networks perceive inputs, and how that perception relates to adversarial attack methods. I demonstrate that there is no correlation between the region of importance and the region of attack, and that, across a class in a dataset, some regions of adversarial examples are perturbed more frequently than others. I attempt to improve classification performance by exploiting these differences between clean inputs and adversarial attacks, and I demonstrate a novel augmentation method that improves prediction performance on adversarial samples.  en_US
dc.language.iso                   en                                                    en_US
dc.publisher                      Nanyang Technological University                      en_US
dc.relation                       SCSE19-0306                                           en_US
dc.subject                        Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence  en_US
dc.title                          Demystifying adversarial attacks on neural networks   en_US
dc.type                           Final Year Project (FYP)                              en_US
dc.contributor.supervisor         Anupam Chattopadhyay                                  en_US
dc.contributor.school             School of Computer Science and Engineering            en_US
dc.description.degree             Bachelor of Engineering (Computer Science)            en_US
dc.contributor.research           Parallel and Distributed Computing Centre             en_US
dc.contributor.supervisoremail    anupam@ntu.edu.sg                                     en_US
item.grantfulltext                restricted                                            -
item.fulltext                     With Fulltext                                         -
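
The full report is under restricted access, and the abstract does not name its attack, its importance measure, or the statistic used to compare the two regions. The sketch below is one minimal way the no-correlation claim could be tested, assuming PGD as the attack, occlusion sensitivity as the region-of-importance measure, and a top-k pixel overlap as the comparison; every function and parameter here is illustrative, not the report's actual method.

    # Illustrative sketch only: PGD, occlusion sensitivity, and top-k overlap
    # are assumptions standing in for the report's unspecified methods.
    import torch
    import torch.nn.functional as F

    def pgd_attack(model, x, y, eps=0.03, alpha=0.01, steps=10):
        """Multi-step L-infinity attack: ascend the loss, stay inside an eps ball."""
        x_adv = x.clone().detach()
        for _ in range(steps):
            x_adv.requires_grad_(True)
            F.cross_entropy(model(x_adv), y).backward()
            with torch.no_grad():
                x_adv = x_adv + alpha * x_adv.grad.sign()
                x_adv = x + (x_adv - x).clamp(-eps, eps)  # project onto the eps ball
                x_adv = x_adv.clamp(0.0, 1.0)             # keep a valid image
        return x_adv.detach()

    def occlusion_importance(model, x, y, patch=8, stride=8):
        """Region of importance: true-class probability drop when a patch is hidden."""
        with torch.no_grad():
            base = F.softmax(model(x), dim=1)[0, y].item()
            heat = torch.zeros(x.shape[-2:])
            for i in range(0, x.shape[-2] - patch + 1, stride):
                for j in range(0, x.shape[-1] - patch + 1, stride):
                    occluded = x.clone()
                    occluded[..., i:i + patch, j:j + patch] = 0.5
                    prob = F.softmax(model(occluded), dim=1)[0, y].item()
                    heat[i:i + patch, j:j + patch] = base - prob
        return heat

    def topk_overlap(map_a, map_b, k=500):
        """Fraction of the k strongest pixels shared by two spatial maps."""
        top_a = set(map_a.flatten().topk(k).indices.tolist())
        top_b = set(map_b.flatten().topk(k).indices.tolist())
        return len(top_a & top_b) / k

    # Usage, assuming x is a (1, C, H, W) image in [0, 1] and y a (1,) label:
    # attack_region = (pgd_attack(model, x, y) - x).abs().sum(dim=1)[0]
    # importance = occlusion_importance(model, x, y)
    # print(topk_overlap(importance, attack_region))  # a low value would be
    #                                                 # consistent with the claim
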
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File                             Description          Size     Format
NTU_SCSE19-0306-U1721954J.pdf    Restricted Access    2.4 MB   Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.