Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/151686
Title: Privacy-preserving deep learning
Authors: Yik, Jia Ler
Tan, Zhen Yuan
Zaw, Maw Htun
Keywords: Engineering::Computer science and engineering
Issue Date: 2021
Publisher: Nanyang Technological University
Source: Yik, J. L., Tan, Z. Y. & Zaw, M. H. (2021). Privacy-preserving deep learning. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/151686
Abstract: Data has been called the new oil, reflecting growing awareness of its value across applications ranging from automated personalised services to artificial intelligence, all of which have machine learning (ML) at their core. Alongside this trend, consumers and government bodies are paying greater attention to privacy, motivating Federated Learning (FL) and Differential Privacy (DP) - an evolved form of ML in which models are trained while privacy is safeguarded - which forms the focus of our research. We reviewed existing research on privacy-preserving deep learning for structured and unstructured data and designed a proof-of-concept platform: a Convolutional Neural Network trained on the MNIST handwritten-digit dataset and hosted on the Cloud. Our experiments tested permutations of the degree of training in models, determined by the number of epochs per generation, and whether DP was implemented. In particular, our findings indicated the following: (1) adding noise to trained weights resulted in an overall decrease in trained accuracy but a greater epsilon value; (2) higher locally trained accuracy from a longer epoch run was accompanied by a larger accuracy drop; (3) DP models achieved lower final validation accuracy; (4) final validation accuracy showed low correlation with standard deviation, regardless of whether DP was applied. Further research could examine differing FL structures and degrees of centrality. Although FL is relatively new, there is strong evidence of growing interest in and attention towards it. We hold the opinion that FL has a place in collaborative ML-based applications while preserving the privacy of end-users.
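The full text of this project is under restricted access, so the sketch below is not the authors' implementation. It is a minimal Python illustration of the mechanism the abstract describes - perturbing locally trained weights with noise (a DP-style step) before federated averaging. The function names (add_gaussian_noise, federated_average), the parameter noise_std, and the use of NumPy arrays as stand-ins for CNN weights are assumptions made purely for illustration.

```python
import numpy as np


def add_gaussian_noise(weights, noise_std, rng):
    """Perturb each weight tensor with Gaussian noise (a simple DP-style mechanism)."""
    return [w + rng.normal(0.0, noise_std, size=w.shape) for w in weights]


def federated_average(client_weight_sets):
    """Average (possibly noised) weight tensors across clients, layer by layer."""
    return [np.mean(np.stack(layer), axis=0) for layer in zip(*client_weight_sets)]


# Hypothetical usage: three clients, each holding two weight tensors
# standing in for a locally trained CNN's parameters.
rng = np.random.default_rng(0)
clients = [[rng.normal(size=(5, 5)), rng.normal(size=(5,))] for _ in range(3)]
noised = [add_gaussian_noise(w, noise_std=0.1, rng=rng) for w in clients]
global_weights = federated_average(noised)
print([w.shape for w in global_weights])  # [(5, 5), (5,)]
```

In a sketch like this, the trade-off the abstract reports would be governed by noise_std: a larger value perturbs the averaged weights more strongly, which is consistent with the lower accuracy observed for DP models.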
URI: https://hdl.handle.net/10356/151686
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: Renaissance Capstone Project (RCP)

Files in This Item:
File: Privacy-Preserving Deep Learning paper.pdf (Restricted Access)
Size: 2.52 MB
Format: Adobe PDF

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.