Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/156095
Full metadata record
DC Field | Value | Language
dc.contributor.author | Hou, Xiaolu | en_US
dc.contributor.author | Breier, Jakub | en_US
dc.contributor.author | Jap, Dirmanto | en_US
dc.contributor.author | Ma, Lei | en_US
dc.contributor.author | Bhasin, Shivam | en_US
dc.contributor.author | Liu, Yang | en_US
dc.date.accessioned | 2022-04-05T08:45:27Z | -
dc.date.available | 2022-04-05T08:45:27Z | -
dc.date.issued | 2021 | -
dc.identifier.citation | Hou, X., Breier, J., Jap, D., Ma, L., Bhasin, S. & Liu, Y. (2021). Physical security of deep learning on edge devices: comprehensive evaluation of fault injection attack vectors. Microelectronics Reliability, 120, 114116. https://dx.doi.org/10.1016/j.microrel.2021.114116 | en_US
dc.identifier.issn | 0026-2714 | en_US
dc.identifier.uri | https://hdl.handle.net/10356/156095 | -
dc.description.abstract | Decision-making tasks carried out by deep neural networks are successfully taking over in many areas, including security-critical ones such as healthcare, transportation, and smart grids, where intentional and unintentional failures can be disastrous. Edge computing systems are becoming ubiquitous, often serving deep learning tasks that do not need to be sent to servers. There is therefore a need to evaluate the potential attacks that can target deep learning at the edge. In this work, we present an evaluation of the reliability of deep neural networks (DNNs) against fault injection attacks. We first experimentally evaluate DNNs implemented on an embedded device, using laser fault injection to gain insight into possible attack vectors. We show practical results for four activation functions: ReLU, softmax, sigmoid, and tanh. We then perform an in-depth study of DNNs based on the derived fault models, using several attack strategies based on random faults. We also investigate a powerful attacker who can find effective fault locations using a genetic algorithm, to show the most efficient attacks in terms of misclassification success rates. Finally, we show how a state-of-the-art countermeasure against model extraction attacks can be bypassed with a fault attack. Our results can serve as a basis for outlining the susceptibility of DNNs to physical attacks, which can be considered a viable attack vector whenever a device is deployed in a hostile environment. | en_US
dc.description.sponsorship | National Research Foundation (NRF) | en_US
dc.language.iso | en | en_US
dc.relation | NRF2018NCR-NCR009-0001 | en_US
dc.relation.ispartof | Microelectronics Reliability | en_US
dc.rights | © 2021 Elsevier Ltd. All rights reserved. This paper was published in Microelectronics Reliability and is made available with permission of Elsevier Ltd. | en_US
dc.subject | Engineering::Computer science and engineering::Hardware::Performance and reliability | en_US
dc.subject | Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence | en_US
dc.title | Physical security of deep learning on edge devices: comprehensive evaluation of fault injection attack vectors | en_US
dc.type | Journal Article | en
dc.contributor.research | Temasek Laboratories @ NTU | en_US
dc.identifier.doi | 10.1016/j.microrel.2021.114116 | -
dc.description.version | Submitted/Accepted version | en_US
dc.identifier.scopus | 2-s2.0-85104295214 | -
dc.identifier.volume | 120 | en_US
dc.identifier.spage | 114116 | en_US
dc.subject.keywords | Fault Attack | en_US
dc.subject.keywords | Neural Network | en_US
dc.description.acknowledgement | This research is supported in part by the National Research Foundation, Singapore, under its National Cybersecurity Research & Development Programme / Cyber-Hardware Forensic & Assurance Evaluation R&D Programme (Award: NRF2018NCR-NCR009-0001). | en_US
item.fulltext | With Fulltext | -
item.grantfulltext | open | -
Appears in Collections: TL Journal Articles

Files in This Item:
File | Description | Size | Format
docs.pdf | | 512.46 kB | Adobe PDF (View/Open)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.