Full metadata record

dc.contributor.author: Guo, Feiyan
dc.identifier.citation: Guo, F. (2022). Study on attacks against federated learning. Final Year Project (FYP), Nanyang Technological University, Singapore.
dc.description.abstract: With the rise of artificial intelligence, the need for data also increases. However, strict data privacy laws have been put in place to protect personal data from being leaked, which greatly limits the data available for artificial intelligence. Federated learning is a form of collaborative machine learning that trains models on decentralized data. This opens the possibility of receiving poisoned updates from malicious participants. In this project, the author explores different attack and defence methodologies to gain a better understanding of how federated learning works. The focus is on the coordinated backdoor attack with model-dependent triggers as the attack methodology and robust learning rates as the defence methodology. The defence is implemented on top of an open-sourced federated learning code base. Built-in defences of this kind make federated learning less likely to be compromised by malicious attackers and therefore more widely usable.
dc.publisher: Nanyang Technological University
dc.subject: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
dc.title: Study on attacks against federated learning
dc.type: Final Year Project (FYP)
dc.contributor.supervisor: Yeo Chai Kiat
dc.contributor.school: School of Computer Science and Engineering
dc.description.degree: Bachelor of Engineering (Computer Science)
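The robust learning rate defence named in the abstract (Ozdayi, Kantarcioglu & Gel, 2021) has the server flip the sign of its per-dimension learning rate wherever too few clients agree on an update's direction, which blunts coordinated backdoor updates. A minimal sketch of that idea is below; the function name, `threshold` parameter, and NumPy-based aggregation are illustrative assumptions, not the project's actual code:

```python
import numpy as np

def robust_lr_aggregate(global_w, client_updates, threshold, server_lr=1.0):
    """Aggregate client updates with a robust (sign-adjusted) learning rate."""
    updates = np.stack(client_updates)          # shape: (num_clients, dim)
    # Per-dimension agreement: |sum of update signs| across clients
    agreement = np.abs(np.sign(updates).sum(axis=0))
    # Keep the learning rate where enough clients agree; flip it elsewhere
    lr = np.where(agreement >= threshold, server_lr, -server_lr)
    return global_w + lr * updates.mean(axis=0)
```

Dimensions where honest clients dominate move as in plain federated averaging, while dimensions with low sign agreement (a typical signature of a backdoor trigger pushed by a minority of clients) are moved in the opposite direction, degrading the attacker's objective.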
item.fulltext: With Fulltext
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item: Restricted Access, 1.09 MB, Adobe PDF



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.