Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/184668
Title: Is causality the solution to machine unlearning?
Authors: Wu, Yufei
Keywords: Computer and Information Science
Issue Date: 2025
Publisher: Nanyang Technological University
Source: Wu, Y. (2025). Is causality the solution to machine unlearning? Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/184668
Abstract: As AI models become increasingly prevalent in everyday applications, data privacy and protection become paramount. Legislative frameworks such as the General Data Protection Regulation (GDPR) underscore the need for models to accommodate data unlearning requests. However, unlearning poses considerable challenges, exacerbated by the time-consuming nature of retraining on a large dataset. Existing unlearning methods often struggle to handle normally distributed data effectively, as they are primarily designed for polluted data or single-class instances. This thesis introduces a novel unlearning approach that incorporates causality to identify key forget samples and neurons, enabling direct model weight adjustments without extensive reconfiguration. The proposed method is evaluated on runtime, accuracy, and privacy risk reduction, and is shown to unlearn large data subsets efficiently and effectively while preserving the model's overall performance.
URI: https://hdl.handle.net/10356/184668
Schools: College of Computing and Data Science
Rights: This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).
Fulltext Permission: open
Fulltext Availability: With Fulltext
Appears in Collections: CCDS Theses
Files in This Item:
File | Description | Size | Format
---|---|---|---
NTU_thesis_revised.pdf | | 1.1 MB | Adobe PDF
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.
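The abstract describes the approach only at a high level. Purely for illustration, the sketch below (Python, not taken from the thesis) shows one possible reading of the general idea of causality-guided unlearning: score hidden neurons by how differently they respond to the forget set versus the retain set, then directly dampen the outgoing weights of the most forget-specific neurons instead of retraining. All names, shapes, and the influence proxy are assumptions, not the author's actual method.

```python
# Hypothetical illustration only -- NOT the method proposed in the thesis.
# Sketch: pick hidden neurons whose activations are most tied to the forget
# set, then adjust their weights directly, avoiding full retraining.

import numpy as np

rng = np.random.default_rng(0)

# Toy one-hidden-layer network (shapes are arbitrary assumptions).
W1 = rng.normal(size=(20, 64))   # input -> hidden
W2 = rng.normal(size=(64, 3))    # hidden -> output


def hidden_activations(X, W1):
    """ReLU activations of the hidden layer."""
    return np.maximum(X @ W1, 0.0)


def neuron_influence(X_forget, X_retain, W1):
    """Crude influence proxy: mean activation gap per hidden neuron."""
    a_forget = hidden_activations(X_forget, W1).mean(axis=0)
    a_retain = hidden_activations(X_retain, W1).mean(axis=0)
    return np.abs(a_forget - a_retain)


def unlearn(W1, W2, X_forget, X_retain, top_k=8, damping=0.1):
    """Dampen outgoing weights of the top-k most forget-specific neurons."""
    scores = neuron_influence(X_forget, X_retain, W1)
    targets = np.argsort(scores)[-top_k:]   # neurons most tied to the forget set
    W2_new = W2.copy()
    W2_new[targets, :] *= damping            # direct weight adjustment, no retraining
    return W2_new


# Usage with random toy data.
X_forget = rng.normal(size=(100, 20))
X_retain = rng.normal(size=(500, 20))
W2_unlearned = unlearn(W1, W2, X_forget, X_retain)
```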