Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/78688
Title: Pre-training model based on the transfer learning in natural language processing
Authors: Tang, Jiayi
Keywords: Engineering::Electrical and electronic engineering
Issue Date: 2019
Abstract: Transfer learning applies knowledge or patterns learned in one field or task to different but related areas or problems. It is especially valuable when data are scarce or domain distributions are heterogeneous. In natural language processing, transfer learning is embodied in pre-training models. There are two existing strategies for applying pre-trained language representations to downstream tasks: feature-based (ELMo) and fine-tuning (GPT, BERT). In 2018, Google released a large-scale pre-trained language model, BERT, which stands for Bidirectional Encoder Representations from Transformers. Compared with the other pre-training models ELMo and GPT, and with the classical CNN model, BERT is the most recent and best-performing model to date. Its highlights are (1) the bidirectional Transformer, (2) the masked language model, (3) next sentence prediction, and (4) more general input and output layers. The BERT model can efficiently learn text information and apply it to various NLP tasks. In this report, we use the BERT model in two ways. The first is to take the pre-trained model released by Google and fine-tune it directly on the downstream task. The second is to use bert-as-service to employ BERT as a sentence encoder, followed by a DNN classifier. We then compare BERT horizontally against ELMo and GPT, and compare BERT vertically across different parameter settings.
URI: http://hdl.handle.net/10356/78688
Schools: School of Electrical and Electronic Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: PRE-TRAINING MODEL BASED ON THE TRANSFER LEARNING IN NATURAL LANGUAGE PROCESSING.pdf (Restricted Access)
Description: Main article | Size: 4.85 MB | Format: Adobe PDF



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.