Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/158375
Title: Model extraction attack on Deep Neural Networks
Authors: Lkhagvadorj, Dulguun
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Engineering::Computer science and engineering::Mathematics of computing::Probability and statistics
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Lkhagvadorj, D. (2022). Model extraction attack on Deep Neural Networks. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/158375
Project: A2034-211 
Abstract: Machine learning models based on Deep Neural Networks (DNNs) have gained popularity due to their promising performance and recent advancements in hardware. Developing a high-performing DNN model requires substantial time and resources; therefore, information about such models is kept undisclosed in commercial settings. Hence, for an attacker, obtaining the details of such hidden models at low cost would be beneficial both financially and in terms of time. In this project, we studied different methods of attacking black-box DNN models and experimented with two of them. The first method develops a substitute model with performance similar to the target model's by using the target model's outputs as training data for the substitute. The second method obtains structural information about the target through a timing side-channel attack. This report covers the theoretical basis of the methods, implementation details, experimental results, and a discussion of the advantages and shortcomings of each method.
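The report's own implementation is not reproduced on this record page, but the first method described in the abstract corresponds to the standard query-and-train extraction loop. The sketch below is a minimal PyTorch illustration of that idea, assuming the attacker has black-box query access to the target and a pool of unlabeled query inputs; all names (query_target, train_substitute, query_pool) are hypothetical and not taken from the report.

```python
# Illustrative sketch of substitute-model extraction (not the report's code).
# Assumption: the attacker can only observe the target's output probabilities;
# every identifier here is hypothetical.
import torch
import torch.nn as nn

def query_target(target_model, inputs):
    """Black-box query: only the target's output probabilities are observed."""
    with torch.no_grad():
        return target_model(inputs).softmax(dim=1)

def train_substitute(target_model, substitute, query_pool, epochs=10, batch_size=64):
    """Fit the substitute on (input, target-output) pairs harvested from queries."""
    optimizer = torch.optim.Adam(substitute.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()  # soft labels could instead be matched with KL-divergence
    for _ in range(epochs):
        for start in range(0, len(query_pool), batch_size):
            batch = query_pool[start:start + batch_size]
            # Use the target's predicted labels as supervision for the substitute.
            labels = query_target(target_model, batch).argmax(dim=1)
            optimizer.zero_grad()
            loss = loss_fn(substitute(batch), labels)
            loss.backward()
            optimizer.step()
    return substitute
```

In practice, query_pool could be built from random or publicly available inputs, and the substitute's agreement with the target on a held-out set would measure how well the extraction worked.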
URI: https://hdl.handle.net/10356/158375
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
File: Lkhagvadorj_Dulguun_FYP_Final_Report_revised_reduced.pdf (Restricted Access)
Size: 1 MB
Format: Adobe PDF


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.