Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/162847
Title: Study on attacks against federated learning
Authors: Tan, Ezekiel Wei Ren
Keywords: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Tan, E. W. R. (2022). Study on attacks against federated learning. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/162847
Project: SCSE21-0897
Abstract: Federated learning is a decentralised form of machine learning that offers the benefits of large amounts of user data held across multiple entities, without that data ever having to change hands. As data privacy concerns become more prevalent and privacy laws more widespread, federated learning is expected to see wider adoption as an effective form of artificial intelligence for technological solutions. The increased incentive to attack federated networks, combined with the inherent security risks of decentralised technologies, means that attacks on federated networks will become more commonplace in the future. This project studies attacks on federated learning networks by finding the most effective attack vectors against such models, in order to understand where and how they are vulnerable, with the intent of providing insights on how to build defences against those attacks. Open-source libraries were used to explore pixel and semantic attacks, centralised and distributed attacks, as well as single-shot and multi-shot attacks.
URI: https://hdl.handle.net/10356/162847
Schools: School of Computer Science and Engineering
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File | Description | Size | Format
---|---|---|---
FYP_Final_Report.pdf (Restricted Access) | | 2.88 MB | Adobe PDF
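The abstract above mentions single-shot attacks on federated learning. One well-known instance of this class is model replacement against FedAvg aggregation, which can be sketched as follows. This is a minimal illustration, not code from the report; the variable names, the unweighted-mean aggregator, and the toy dimensions are all assumptions made for the example.

```python
import numpy as np

np.random.seed(0)

def fedavg(updates):
    """Plain FedAvg aggregation: unweighted mean of client model vectors."""
    return np.mean(updates, axis=0)

n_clients = 10
global_model = np.zeros(4)

# Nine honest clients return the global model plus small local noise.
honest = [global_model + np.random.normal(0.0, 0.1, size=4) for _ in range(9)]

# Single-shot model replacement: the attacker submits a scaled update so
# that, after averaging, the aggregated model lands near its chosen target.
target = np.ones(4)
malicious = n_clients * target - (n_clients - 1) * global_model

new_global = fedavg(honest + [malicious])
# new_global ends up close to `target` even though 9 of 10 clients are honest.
```

The scaling trick works because the attacker can anticipate that the honest contributions average out to roughly the current global model, so a single boosted update dominates one aggregation round.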
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.