Please use this identifier to cite or link to this item:
https://hdl.handle.net/10356/162924
Title: Preventing catastrophic forgetting in continual learning
Authors: Ong, Yi Shen
Keywords: Engineering::Computer science and engineering
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Ong, Y. S. (2022). Preventing catastrophic forgetting in continual learning. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/162924
Project: SCSE21-0626
Abstract: Continual learning in neural networks has received increasing interest because machine learning is now used across a growing number of industries. Catastrophic forgetting, in which a model forgets old tasks upon learning new ones, remains a major roadblock to making neural networks true lifelong learners. A series of tests was conducted on the effectiveness of buffers filled with old training data as a way of mitigating forgetting, by training on the buffered examples alongside new data. The results show that increasing the buffer size does help mitigate forgetting, at the cost of the additional memory used.
URI: https://hdl.handle.net/10356/162924
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File | Description | Size | Format
---|---|---|---
FYP Report.pdf (Restricted Access) | | 4.37 MB | Adobe PDF
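
The abstract above describes mitigating catastrophic forgetting by keeping a buffer of old training examples and training on them alongside new data, an approach commonly called rehearsal or experience replay. The snippet below is a minimal sketch of that idea, assuming a PyTorch classification setting; the names (`ReplayBuffer`, `train_task`, `replay_batch`) are illustrative and not taken from the report, and the report's actual experimental setup may differ.

```python
# Hedged sketch of replay-buffer ("rehearsal") training; illustrative only.
import random
import torch
import torch.nn.functional as F


class ReplayBuffer:
    """Fixed-size store of old (input, label) pairs, filled by reservoir sampling."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.seen = 0

    def add(self, x, y):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            # Reservoir sampling keeps a uniform sample of everything seen so far.
            idx = random.randrange(self.seen)
            if idx < self.capacity:
                self.data[idx] = (x, y)

    def sample(self, batch_size):
        batch = random.sample(self.data, min(batch_size, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)


def train_task(model, optimizer, task_loader, buffer, replay_batch=32):
    """Train on the new task while replaying a small batch of old examples."""
    model.train()
    for x, y in task_loader:
        optimizer.zero_grad()
        loss = F.cross_entropy(model(x), y)
        if buffer.data:
            # Mix in old data so gradients also preserve earlier tasks.
            bx, by = buffer.sample(replay_batch)
            loss = loss + F.cross_entropy(model(bx), by)
        loss.backward()
        optimizer.step()
        # Store the new examples for replay during later tasks.
        for xi, yi in zip(x, y):
            buffer.add(xi.detach(), yi.detach())
```

Reservoir sampling is one common way to keep the buffer representative of all data seen so far under a fixed memory budget; the report's finding that larger buffers reduce forgetting at the cost of extra space corresponds to raising `capacity` in this sketch.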
Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.