Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/137926
Full metadata record
DC Field: Value (Language)
dc.contributor.author: Loi, Chii Lek (en_US)
dc.date.accessioned: 2020-04-20T00:31:49Z
dc.date.available: 2020-04-20T00:31:49Z
dc.date.issued: 2020
dc.identifier.uri: https://hdl.handle.net/10356/137926
dc.description.abstract: Automatic Speech Recognition (ASR) systems have grown in prevalence alongside advances in deep learning. Embedded in many Intelligent Voice Control (IVC) systems such as Alexa, Siri and Google Assistant, ASR has become an attractive target for adversarial attacks. The objective of this research project is to build a black-box over-the-air (OTA) attack system that mutates an audio sample into an adversarial version with imperceptible differences, so that the ASR transcribes it as the targeted word. In this paper, we demonstrate the feasibility and effectiveness of such an attack system in generating adversarial perturbations against the DeepSpeech ASR. (en_US)
dc.language.iso: en (en_US)
dc.publisher: Nanyang Technological University (en_US)
dc.relation: SCSE 19-0319 (en_US)
dc.subject: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence (en_US)
dc.subject: Engineering::Computer science and engineering::Computing methodologies::Simulation and modeling (en_US)
dc.title: Adversarial attacks on RNN-based deep learning systems (en_US)
dc.type: Final Year Project (FYP) (en_US)
dc.contributor.supervisor: Liu Yang (en_US)
dc.contributor.school: School of Computer Science and Engineering (en_US)
dc.description.degree: Bachelor of Engineering (Computer Science) (en_US)
dc.contributor.supervisoremail: yangliu@ntu.edu.sg (en_US)
item.grantfulltext: restricted
item.fulltext: With Fulltext
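
The abstract above describes a black-box attack that perturbs audio until the target ASR emits a chosen phrase. As a purely illustrative aid, the sketch below shows one generic way such a loop can be structured: a random-search perturbation bounded by a small amplitude budget and scored by the edit distance between the ASR transcript and the target phrase. This is not the method used in the report; the transcribe stub, the random-search strategy, the edit-distance objective, and the eps/step/iters parameters are all assumptions introduced here for illustration.

# Hypothetical sketch of a black-box targeted audio perturbation loop, in the
# spirit of the attack described in the abstract. Not the report's code.
import numpy as np

def edit_distance(a: str, b: str) -> int:
    """Levenshtein distance between two transcripts."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def transcribe(audio: np.ndarray) -> str:
    """Placeholder for a black-box ASR query (assumption); wire this to the
    target system, e.g. a DeepSpeech instance."""
    raise NotImplementedError("replace with a real black-box ASR query")

def black_box_attack(audio: np.ndarray, target: str,
                     eps: float = 0.02, step: float = 0.002,
                     iters: int = 2000, seed: int = 0) -> np.ndarray:
    """Random-search perturbation: keep noise that moves the transcript
    closer to `target`, while clipping the total perturbation to +/- eps."""
    rng = np.random.default_rng(seed)
    delta = np.zeros_like(audio)
    best = edit_distance(transcribe(audio), target)
    for _ in range(iters):
        candidate = np.clip(delta + rng.normal(0.0, step, size=audio.shape),
                            -eps, eps)                   # imperceptibility budget
        adv = np.clip(audio + candidate, -1.0, 1.0)      # keep a valid waveform range
        score = edit_distance(transcribe(adv), target)
        if score < best:                                 # accept only improvements
            best, delta = score, candidate
        if best == 0:                                    # ASR now outputs the target
            break
    return np.clip(audio + delta, -1.0, 1.0)

In practice, the stub would be replaced by an actual query to the black-box ASR (for example, feeding the waveform to DeepSpeech and reading back its transcript), and a psychoacoustic or over-the-air robustness constraint would typically replace the simple clipping budget used here.
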
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
Final Report.pdf (Restricted Access), 1.04 MB, Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.