Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/136863
Title: Vulnerability analysis on noise-injection based hardware attack on deep neural networks
Authors: Liu, Wenye
Wang, Si
Chang, Chip-Hong
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2019
Source: Liu, W., Wang, S. & Chang, C.-H. (2019). Vulnerability analysis on noise-injection based hardware attack on deep neural networks. 2019 Asian Hardware Oriented Security and Trust Symposium (AsianHOST).
Project: MOE-2015-T2-2-013 
Conference: 2019 Asian Hardware Oriented Security and Trust Symposium (AsianHOST)
Abstract: Despite superior accuracy on most vision recognition tasks, deep neural networks are susceptible to adversarial examples. Recent studies show that adding carefully crafted small perturbations to the input layer can mislead a classifier into arbitrary categories. However, most adversarial attack algorithms concentrate only on the inputs of the model; the effect of tampering with internal nodes is seldom studied. An adversarial attack, if extended to a deployed hardware system, can perturb or alter intermediate data during real-time processing. To investigate the vulnerability of deep neural network hardware under such potential adversarial attacks, we comprehensively evaluate 10 popular DNN models by injecting noise into each layer of these models. Our experimental results indicate that more accurate networks are more prone to disturbance of selected internal layers. For traditional convolutional network structures (AlexNet and the VGG family), the last convolution layer is the most assailable. For state-of-the-art architectures (the Inception, ResNet and DenseNet families), perturbations as small as 0.1% of a layer's values, or one element per channel, can subvert the original predictions, and over 65% of computational layers suffer from this vulnerability. Our findings reveal that optimizing for accuracy, model size and computational efficiency can inadvertently sacrifice the robustness of a deep learning system.
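The per-layer evaluation described in the abstract can be illustrated with a minimal sketch: inject noise into an internal (hidden) layer of a classifier, rather than the input, and check whether the predicted label flips. The network, dimensions, and noise scales below are illustrative stand-ins, not the paper's actual models or attack parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-layer classifier (illustrative stand-in for a DNN under attack).
W1 = rng.normal(size=(8, 4))   # input -> hidden weights
W2 = rng.normal(size=(4, 3))   # hidden -> logits weights

def forward(x, hidden_noise=None):
    """Forward pass; optionally add noise to the hidden layer,
    mimicking a noise-injection attack on intermediate data
    rather than on the model input."""
    h = np.maximum(x @ W1, 0.0)          # ReLU hidden activations
    if hidden_noise is not None:
        h = h + hidden_noise             # perturb internal nodes
    return h @ W2                        # class logits

x = rng.normal(size=(8,))
clean_pred = int(np.argmax(forward(x)))

# Sweep noise magnitudes on the hidden layer; record the first
# scale at which the predicted class changes (if any).
flip_scale = None
for scale in (0.1, 0.5, 1.0, 2.0, 5.0):
    noise = scale * rng.normal(size=(4,))
    if int(np.argmax(forward(x, noise))) != clean_pred:
        flip_scale = scale
        break

print("clean prediction:", clean_pred)
print("first flipping noise scale:", flip_scale)
```

In the paper's actual experiments this probing is repeated for every layer of each pretrained model, which is how the authors identify the most assailable layers (e.g., the last convolution layer in AlexNet/VGG-style networks).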
URI: https://hdl.handle.net/10356/136863
Schools: School of Electrical and Electronic Engineering 
Research Centres: Centre for Integrated Circuits and Systems 
Rights: © 2019 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works.
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:EEE Conference Papers

Files in This Item:
AsianHOST2019-LW.pdf (412.95 kB, Adobe PDF)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.