Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/156095
Full metadata record
DC Field | Value | Language |
---|---|---|
dc.contributor.author | Hou, Xiaolu | en_US |
dc.contributor.author | Breier, Jakub | en_US |
dc.contributor.author | Jap, Dirmanto | en_US |
dc.contributor.author | Ma, Lei | en_US |
dc.contributor.author | Bhasin, Shivam | en_US |
dc.contributor.author | Liu, Yang | en_US |
dc.date.accessioned | 2022-04-05T08:45:27Z | - |
dc.date.available | 2022-04-05T08:45:27Z | - |
dc.date.issued | 2021 | - |
dc.identifier.citation | Hou, X., Breier, J., Jap, D., Ma, L., Bhasin, S. & Liu, Y. (2021). Physical security of deep learning on edge devices : comprehensive evaluation of fault injection attack vectors. Microelectronics Reliability, 120, 114116. https://dx.doi.org/10.1016/j.microrel.2021.114116 | en_US |
dc.identifier.issn | 0026-2714 | en_US |
dc.identifier.uri | https://hdl.handle.net/10356/156095 | - |
dc.description.abstract | Decision-making tasks carried out by deep neural networks are taking over in many areas, including security-critical ones such as healthcare, transportation, and smart grids, where both intentional and unintentional failures can be disastrous. Edge computing systems are becoming ubiquitous and often serve deep learning tasks whose data does not need to be sent to servers. It is therefore necessary to evaluate the potential attacks that can target deep learning at the edge. In this work, we present an evaluation of the reliability of deep neural networks (DNNs) against fault injection attacks. We first experimentally evaluate DNNs implemented on an embedded device by using laser fault injection to gain insight into possible attack vectors. We show practical results for four activation functions: ReLU, softmax, sigmoid, and tanh. We then perform an in-depth study of DNNs under the derived fault models, using several attack strategies based on random faults. We also investigate a powerful attacker who finds effective fault locations with a genetic algorithm, to show the most efficient attacks in terms of misclassification success rates. Finally, we show how a state-of-the-art countermeasure against model extraction attacks can be bypassed with a fault attack. Our results can serve as a basis for outlining the susceptibility of DNNs to physical attacks, which can be considered a viable attack vector whenever a device is deployed in a hostile environment. | en_US |
dc.description.sponsorship | National Research Foundation (NRF) | en_US |
dc.language.iso | en | en_US |
dc.relation | NRF2018NCR-NCR009-0001 | en_US |
dc.relation.ispartof | Microelectronics Reliability | en_US |
dc.rights | © 2021 Elsevier Ltd. All rights reserved. This paper was published in Microelectronics Reliability and is made available with permission of Elsevier Ltd. | en_US |
dc.subject | Engineering::Computer science and engineering::Hardware::Performance and reliability | en_US |
dc.subject | Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence | en_US |
dc.title | Physical security of deep learning on edge devices : comprehensive evaluation of fault injection attack vectors | en_US |
dc.type | Journal Article | en |
dc.contributor.research | Temasek Laboratories @ NTU | en_US |
dc.identifier.doi | 10.1016/j.microrel.2021.114116 | - |
dc.description.version | Submitted/Accepted version | en_US |
dc.identifier.scopus | 2-s2.0-85104295214 | - |
dc.identifier.volume | 120 | en_US |
dc.identifier.spage | 114116 | en_US |
dc.subject.keywords | Fault Attack | en_US |
dc.subject.keywords | Neural Network | en_US |
dc.description.acknowledgement | This research is supported in part by the National Research Foundation, Singapore, under its National Cybersecurity Research & Development Programme / Cyber-Hardware Forensic & Assurance Evaluation R&D Programme (Award: NRF2018NCR-NCR009-0001). | en_US |
item.fulltext | With Fulltext | - |
item.grantfulltext | open | - |
Appears in Collections: TL Journal Articles
SCOPUS™ Citations: 50 (updated on Mar 24, 2024)
Web of Science™ Citations: 20 (updated on Oct 26, 2023)
Page view(s): 197 (updated on Mar 28, 2024)
Download(s): 22 (updated on Mar 28, 2024)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.