Title: Model extraction attack on Deep Neural Networks
Authors: Lkhagvadorj, Dulguun
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Engineering::Computer science and engineering::Mathematics of computing::Probability and statistics
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Lkhagvadorj, D. (2022). Model extraction attack on Deep Neural Networks. Final Year Project (FYP), Nanyang Technological University, Singapore.
Project: A2034-211 
Abstract: Machine learning models based on Deep Neural Networks (DNNs) have gained popularity due to their strong performance and recent advances in hardware. Developing a high-performing DNN model requires a large amount of time and resources; therefore, information about such models is kept undisclosed in commercial settings. Hence, for an attacker, obtaining the details of such hidden models at low cost would be beneficial both financially and in terms of time. In this project, we studied different methods for attacking black-box DNN models and experimented with two of them. The first method develops a substitute model with performance similar to the target model's by using the target model's outputs as training data for the substitute. The second method obtains structural information about the target through a timing side-channel attack. This report covers the theoretical basis of the methods, implementation details, experimental results, and a discussion of the advantages and shortcomings of each method.
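The first method above can be illustrated with a minimal sketch: the attacker queries a black-box target, records the input/output pairs, and fits a substitute model to them. The target here is a stand-in linear function and all names (`query_target`, the slope and bias values, the query count) are illustrative assumptions, not the report's actual setup, where the target is an undisclosed DNN.

```python
import random

random.seed(0)

# Hidden target model: the attacker can only query it, not inspect it.
# (Simulated here as a simple linear map for illustration.)
_SECRET_SLOPE, _SECRET_BIAS = 2.5, -1.0

def query_target(x):
    """Black-box oracle: returns the target's output for input x."""
    return _SECRET_SLOPE * x + _SECRET_BIAS

# Step 1: query the black box and record (input, output) pairs.
xs = [random.uniform(-5.0, 5.0) for _ in range(100)]
ys = [query_target(x) for x in xs]

# Step 2: fit the substitute's parameters to the recorded pairs
# (closed-form least squares for a 1-D linear substitute).
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
bias = my - slope * mx

# Step 3: the substitute now mimics the target on unseen inputs.
print(f"substitute: y = {slope:.3f}x + {bias:.3f}")
```

In the report's actual setting the substitute is itself a neural network trained with gradient descent on the target's outputs, but the query-then-fit loop follows the same shape as this sketch.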
Schools: School of Electrical and Electronic Engineering 
Research Centres: VIRTUS, IC Design Centre of Excellence 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Student Reports (FYP/IA/PA/PI)

Files in This Item:
Restricted Access, 1 MB, Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.