Title: A forward error compensation approach for fault resilient deep neural network accelerator design
Authors: Liu, Wenye
Chang, Chip Hong
Keywords: Engineering::Electrical and electronic engineering::Computer hardware, software and systems
Issue Date: 2021
Source: Liu, W. & Chang, C. H. (2021). A forward error compensation approach for fault resilient deep neural network accelerator design. 5th Workshop on Attacks and Solutions in Hardware Security (ASHES 2021), 41-50.
Project: CHFA-GC1-AW01
Abstract: Deep learning accelerators are key enablers of a variety of safety-critical applications such as self-driving cars and video surveillance. However, recently reported hardware-oriented attack vectors, e.g., fault injection attacks, have extended the threats on deployed deep neural network (DNN) systems beyond the software attack boundary of input data perturbation. Existing fault mitigation schemes, including data masking, zeroing-on-error and circuit-level time-borrowing techniques, exploit the noise tolerance of neural network models to resist random and sparse errors. Such noise-tolerance-based schemes are not sufficiently effective at suppressing intensive transient errors if a DNN accelerator is blasted with malicious and deliberate faults. In this paper, we conduct a comprehensive investigation of reported resilient designs and propose a more robust countermeasure against fault injection attacks. The proposed design utilizes shadow flip-flops for error detection and a lightweight circuit for timely error correction. Our forward error compensation scheme rectifies the incorrect partial sum of the multiply-accumulate operation by estimating the difference between the correct and error-inflicted computations. The difference is added back to the final accumulated result at a later cycle without stalling the execution pipeline. We implemented our proposed design and the existing fault mitigation schemes on the same Intel FPGA-based DNN accelerator to demonstrate its substantially enhanced resiliency against deliberate fault attacks on two popular DNN models, ResNet50 and VGG16, trained on ImageNet.
ISBN: 9781450386623
DOI: 10.1145/3474376.3487281
Rights: © 2021 The Owner/Author(s). Publication rights licensed to ACM. All rights reserved. This paper was published in Proceedings of the 5th Workshop on Attacks and Solutions in Hardware Security (ASHES 2021) and is made available with permission of The Owner/Author(s).
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections:EEE Conference Papers

Files in This Item:
sample-sigconf.pdf (773.04 kB, Adobe PDF)

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.