Full metadata record
DC Field: Value [Language]
dc.contributor.author: Tan, Ezekiel Wei Ren [en_US]
dc.identifier.citation: Tan, E. W. R. (2022). Study on attacks against federated learning. Final Year Project (FYP), Nanyang Technological University, Singapore. [en_US]
dc.description.abstract: Federated learning is a decentralised form of machine learning that offers the benefits of training on large amounts of user data held by multiple entities, without the user data ever having to change hands. As data privacy concerns become more prevalent and data protection laws more widespread, federated learning is expected to be more widely adopted as an effective form of artificial intelligence for technological solutions. The increased incentive to attack federated networks, combined with the inherent security risks of decentralised technologies, means that attacks on federated networks will become more commonplace in the future. This project studies attacks on federated learning networks by finding the most effective attack vectors against such models, to understand where and how they are vulnerable, with the intent of providing insights into how to build defences against those attacks. Open-source libraries were used to explore pixel and semantic attacks, centralised and distributed attacks, as well as single-shot and multi-shot attacks. [en_US]
dc.publisher: Nanyang Technological University [en_US]
dc.subject: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence [en_US]
dc.title: Study on attacks against federated learning [en_US]
dc.type: Final Year Project (FYP) [en_US]
dc.contributor.supervisor: Yeo Chai Kiat [en_US]
dc.contributor.school: School of Computer Science and Engineering [en_US]
dc.description.degree: Bachelor of Engineering (Computer Science) [en_US]
item.fulltext: With Fulltext
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File: Restricted Access, 2.88 MB, Adobe PDF
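The abstract above mentions single-shot (model-replacement) attacks on federated learning. As a rough illustration of why a single round can suffice, the sketch below shows one federated-averaging round in which a lone malicious client scales its update to override the honest clients. This is not the project's actual code; all function names and values are hypothetical.

```python
# Minimal sketch of federated averaging (FedAvg) with one malicious client
# performing a single-shot model-replacement attack. Weights are plain
# Python lists; the numbers are illustrative assumptions only.

def fedavg(updates):
    """Average client weight vectors coordinate-wise."""
    n = len(updates)
    return [sum(w[i] for w in updates) / n for i in range(len(updates[0]))]

def malicious_update(global_weights, target_weights, n_clients):
    """Scale the poisoned update so that, after averaging with the benign
    updates, the global model lands on target_weights."""
    return [g + n_clients * (t - g)
            for g, t in zip(global_weights, target_weights)]

# One round with 4 honest clients and 1 attacker.
global_w = [0.0, 0.0]
honest = [global_w[:] for _ in range(4)]   # honest clients send no change
target = [1.0, -1.0]                       # attacker's desired model
attacker = malicious_update(global_w, target, n_clients=5)
new_global = fedavg(honest + [attacker])
print(new_global)  # the averaged model equals the attacker's target
```

Because the attacker knows its update is diluted by averaging over all clients, multiplying the desired shift by the number of clients cancels that dilution in a single round; defences such as norm clipping are aimed at exactly this scaling.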

Updated on Dec 2, 2023
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.