Please use this identifier to cite or link to this item:
Full metadata record
DC Field | Value | Language
dc.contributor.author | Wong, Yung Shen | en_US
dc.identifier.citation | Wong, Y. S. (2022). Generalized AutoNLP model for name entity recognition task. Final Year Project (FYP), Nanyang Technological University, Singapore. | -
dc.description.abstract | Unsupervised pre-trained word embeddings have been widely used in recent studies in the field of Natural Language Processing. Since the remarkable results achieved by BERT on various NLP tasks, studies have focused increasingly on deep-learning-based approaches for representing raw input sequences of words. However, it remains uncertain whether these deep-learning-based approaches can convey the full semantic meaning of words and generalize well in AutoNLP for named entity recognition tasks. In this project, we propose an architecture that combines deep-learning-based word embeddings, BERT, with static word embeddings, GloVe. Experiments are conducted to compare the performance of our proposed architecture against BERT word embeddings alone on AutoNLP named entity recognition tasks. | en_US
dc.publisher | Nanyang Technological University | en_US
dc.subject | Engineering::Computer science and engineering::Computing methodologies::Document and text processing | en_US
dc.title | Generalized AutoNLP model for name entity recognition task | en_US
dc.type | Final Year Project (FYP) | en_US
dc.contributor.supervisor | Sinno Jialin Pan | en_US
dc.contributor.school | School of Computer Science and Engineering | en_US
dc.description.degree | Bachelor of Engineering (Computer Science) | en_US
item.fulltext | With Fulltext | -
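The architecture described in the abstract concatenates contextual (BERT) word embeddings with static (GloVe) word embeddings. A minimal sketch of that concatenation step is given below; the dimensions, the toy GloVe lookup table, and the `contextual_embedding` stand-in are illustrative assumptions, not the project's actual configuration.

```python
# Hypothetical sketch: per-token concatenation of a contextual (BERT-style)
# vector with a static (GloVe-style) vector. Toy dimensions and lookups
# stand in for the real pre-trained models.
import numpy as np

BERT_DIM, GLOVE_DIM = 8, 4  # toy sizes; real models typically use 768 and 300

def contextual_embedding(tokens):
    # Stand-in for a BERT encoder: returns one vector per token.
    rng = np.random.default_rng(0)
    return rng.normal(size=(len(tokens), BERT_DIM))

# Toy static-embedding lookup; a real system would load pre-trained GloVe vectors.
glove = {"singapore": np.ones(GLOVE_DIM)}

def combined_embeddings(tokens):
    ctx = contextual_embedding(tokens)
    # Unknown words fall back to a zero static vector.
    static = np.stack([glove.get(t.lower(), np.zeros(GLOVE_DIM)) for t in tokens])
    # Concatenate along the feature axis: (n_tokens, BERT_DIM + GLOVE_DIM).
    return np.concatenate([ctx, static], axis=1)

vecs = combined_embeddings(["NTU", "Singapore"])
print(vecs.shape)  # (2, 12)
```

The combined vectors would then feed a downstream NER tagger in place of the BERT embeddings alone.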
Appears in Collections:SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File | Description | Size | Format
FYP_Report-Wong Yung Shen_Amended.pdf | Restricted Access | 1.34 MB | Adobe PDF

Download(s): 50 (updated on Sep 30, 2023)


Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.