Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/174146
Full metadata record
DC Field / Value / Language
dc.contributor.author: Hu, Bowen (en_US)
dc.contributor.author: He, Weiyang (en_US)
dc.contributor.author: Wang, Si (en_US)
dc.contributor.author: Liu, Wenye (en_US)
dc.contributor.author: Chang, Chip Hong (en_US)
dc.date.accessioned: 2024-06-06T03:03:36Z
dc.date.available: 2024-06-06T03:03:36Z
dc.date.issued: 2024
dc.identifier.citation: Hu, B., He, W., Wang, S., Liu, W. & Chang, C. H. (2024). Live demonstration: man-in-the-middle attack on edge artificial intelligence. 2024 IEEE International Symposium on Circuits and Systems (ISCAS). https://dx.doi.org/10.1109/ISCAS58744.2024.10558371 (en_US)
dc.identifier.uri: https://hdl.handle.net/10356/174146
dc.description.abstract: Deep neural networks (DNNs) are susceptible to evasion attacks. However, digital adversarial examples are typically applied to pre-captured static images. The perturbations are generated by loss optimization with knowledge of the target model's hyperparameters and are added offline. Physical adversarial examples, on the other hand, tamper with the physical target or use a realistically fabricated target to fool the DNN. A sufficient number of pristine target samples captured under varying environmental conditions is required to create the physical adversarial perturbations. Neither digital nor physical input evasion attacks are robust against dynamic object scene variations, and their adversarial effects are often weakened by model reduction and quantization when the DNNs are implemented on edge artificial intelligence (AI) accelerator platforms. This demonstration presents a practical man-in-the-middle (MITM) attack on an edge DNN, first reported in [1]. A tiny MIPI FPGA chip with hardened CSI-2 and D-PHY blocks is attached between the camera and the edge AI accelerator to inject unobtrusive stripes into the RAW image data. The attack is less influenced by dynamic context variations such as changes in viewing angle, illumination, and distance of the target from the camera. (en_US)
dc.description.sponsorship: Cyber Security Agency (en_US)
dc.description.sponsorship: Ministry of Education (MOE) (en_US)
dc.description.sponsorship: National Research Foundation (NRF) (en_US)
dc.language.iso: en (en_US)
dc.relation: NRF2018NCRNCR009-0001 (en_US)
dc.relation: MOET2EP50220-0003 (en_US)
dc.rights: © 2024 IEEE. All rights reserved. This article may be downloaded for personal use only. Any other use requires prior permission of the copyright holder. The Version of Record is available online at http://doi.org/10.1109/ISCAS58744.2024.10558371. (en_US)
dc.subject: Engineering (en_US)
dc.title: Live demonstration: man-in-the-middle attack on edge artificial intelligence (en_US)
dc.type: Conference Paper (en)
dc.contributor.school: School of Electrical and Electronic Engineering (en_US)
dc.contributor.conference: 2024 IEEE International Symposium on Circuits and Systems (ISCAS) (en_US)
dc.contributor.research: Centre for Integrated Circuits and Systems (en_US)
dc.identifier.doi: 10.1109/ISCAS58744.2024.10558371
dc.description.version: Submitted/Accepted version (en_US)
dc.identifier.url: https://2024.ieee-iscas.org/
dc.subject.keywords: Deep neural networks (en_US)
dc.subject.keywords: Edge artificial intelligence (en_US)
dc.citation.conferencelocation: Singapore (en_US)
dc.description.acknowledgement: This research is supported in part by the National Research Foundation, Singapore, and Cyber Security Agency of Singapore under its National Cybersecurity Research & Development Programme (Cyber-Hardware Forensic & Assurance Evaluation R&D Programme NRF2018NCRNCR009-0001) and in part by the Ministry of Education, Singapore, under its AcRF Tier 2 Award No. MOET2EP50220-0003. (en_US)
item.fulltext: With Fulltext
item.grantfulltext: open
Appears in Collections:EEE Conference Papers
Files in This Item:
conference_101719.pdf (418.85 kB, Adobe PDF)
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.