Full metadata record
DC Field: Value [Language]
dc.contributor.author: Ong, Yi Shen [en_US]
dc.identifier.citation: Ong, Y. S. (2022). Preventing catastrophic forgetting in continual learning. Final Year Project (FYP), Nanyang Technological University, Singapore. [en_US]
dc.description.abstract: Continual learning in neural networks has received increasing interest as machine learning becomes prevalent across more industries. Catastrophic forgetting, in which a model forgets old tasks upon learning new ones, remains a major roadblock to making neural networks truly lifelong learners. A series of tests evaluated the effectiveness of replay buffers filled with old training data, trained alongside new data, as a way of mitigating forgetting. The results show that increasing the buffer size does help mitigate forgetting, at the cost of the additional storage used. [en_US]
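The rehearsal strategy described in the abstract can be sketched as a fixed-size replay buffer whose stored examples are mixed into each new-task batch. This is an illustrative assumption, not the report's actual implementation: the class name, the reservoir-sampling insertion policy, and the batch sizes below are all hypothetical.

```python
import random

class ReplayBuffer:
    """Fixed-size buffer of past (x, y) examples, filled by reservoir sampling.

    Illustrative sketch only; the report's buffer policy may differ.
    """
    def __init__(self, capacity, seed=0):
        self.capacity = capacity
        self.data = []
        self.seen = 0                      # total examples offered so far
        self.rng = random.Random(seed)

    def add(self, example):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append(example)
        else:
            # Reservoir sampling: each of the `seen` examples ends up
            # stored with equal probability capacity / seen.
            j = self.rng.randrange(self.seen)
            if j < self.capacity:
                self.data[j] = example

    def sample(self, k):
        # Draw up to k stored old-task examples to mix into the current batch.
        return self.rng.sample(self.data, min(k, len(self.data)))

# Fill the buffer from a stream of old-task data, then train each
# new-task step on (new batch + replayed batch) so gradients also
# cover old tasks, mitigating forgetting.
buf = ReplayBuffer(capacity=200)
for step in range(1000):                   # hypothetical old-task stream
    buf.add((step, step % 10))

new_batch = [(-1, 0)] * 32                 # hypothetical new-task batch
combined = new_batch + buf.sample(32)      # batch actually trained on
```

A larger `capacity` keeps a broader snapshot of old tasks (less forgetting) at the cost of proportionally more storage, which is the trade-off the abstract reports.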
dc.publisher: Nanyang Technological University [en_US]
dc.subject: Engineering::Computer science and engineering [en_US]
dc.title: Preventing catastrophic forgetting in continual learning [en_US]
dc.type: Final Year Project (FYP) [en_US]
dc.contributor.supervisor: Lin Guosheng [en_US]
dc.contributor.school: School of Computer Science and Engineering [en_US]
dc.description.degree: Bachelor of Engineering (Computer Science) [en_US]
item.fulltext: With Fulltext
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File: FYP Report.pdf (Restricted Access, 4.37 MB, Adobe PDF)

Updated on Jan 31, 2023

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.