Please use this identifier to cite or link to this item: https://hdl.handle.net/10356/165534
Full metadata record
dc.contributor.author: Yi, Chenqi (en_US)
dc.date.accessioned: 2023-03-29T01:00:07Z
dc.date.available: 2023-03-29T01:00:07Z
dc.date.issued: 2023
dc.identifier.citation: Yi, C. (2023). Abstractive summarization framework based on pre-training and contrastive learning. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/165534 (en_US)
dc.identifier.uri: https://hdl.handle.net/10356/165534
dc.description.abstract: Abstractive summarization aims to generate sentences that cover the key information of a document. In this dissertation, we verify the effectiveness of a generation-evaluation model trained with contrastive learning, which first generates a set of candidate summaries and then evaluates the candidates to select the best one. Conventional methods adopt pre-trained models by default as the backbone of the summary evaluation model. However, which pre-training task helps improve the performance of pre-trained models on the downstream summary evaluation task remains an open question. We conduct a study on the Inverse Cloze Task (ICT) to answer this question. For the backbone of the evaluation model, we compare the results of different pre-trained models. We further adopt ICT as an additional pre-training task to pre-train the model and use it as the backbone of the evaluation model. We also verify and analyze how the masking rate in ICT affects the downstream evaluation task. Experiments on XSum and CNN/Daily Mail show that the model with additional ICT pre-training outperforms other pre-training baselines. (en_US)
dc.language.iso: en (en_US)
dc.publisher: Nanyang Technological University (en_US)
dc.subject: Engineering::Electrical and electronic engineering (en_US)
dc.title: Abstractive summarization framework based on pre-training and contrastive learning (en_US)
dc.type: Thesis-Master by Coursework (en_US)
dc.contributor.supervisor: Lihui Chen (en_US)
dc.contributor.school: School of Electrical and Electronic Engineering (en_US)
dc.description.degree: Master of Science (Signal Processing) (en_US)
dc.contributor.supervisoremail: ELHCHEN@ntu.edu.sg (en_US)
item.fulltext: With Fulltext
item.grantfulltext: restricted
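
The abstract above describes a generate-then-evaluate framework: a generator first produces several candidate summaries, and an evaluation model trained with a contrastive objective scores each candidate against the source document so that the highest-scoring candidate is selected. Below is a minimal sketch of that selection step only, not the thesis code: the function names, the example inputs, and the trivial lexical-overlap scorer are hypothetical placeholders standing in for the pre-trained, contrastively fine-tuned evaluation model.

```python
from typing import Callable, List

def select_best_summary(
    document: str,
    candidates: List[str],
    score_fn: Callable[[str, str], float],
) -> str:
    """Return the candidate summary with the highest evaluation score for the document."""
    scored = [(score_fn(document, cand), cand) for cand in candidates]
    best_score, best_candidate = max(scored, key=lambda pair: pair[0])
    return best_candidate

def overlap_score(document: str, candidate: str) -> float:
    """Toy stand-in scorer: fraction of candidate tokens that appear in the document."""
    doc_tokens = set(document.lower().split())
    cand_tokens = set(candidate.lower().split())
    return len(doc_tokens & cand_tokens) / max(len(cand_tokens), 1)

if __name__ == "__main__":
    doc = "The committee approved the new budget after a long debate on spending priorities."
    cands = [
        "The committee approved the new budget.",
        "A debate took place.",
        "Spending priorities were discussed by a committee that approved a budget.",
    ]
    print(select_best_summary(doc, cands, overlap_score))
```

In the dissertation, the scoring function is a pre-trained encoder (optionally further pre-trained with ICT) fine-tuned with a contrastive ranking objective, rather than the lexical-overlap placeholder used here.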
Appears in Collections: EEE Theses
Files in This Item:
File: Abstractive summarization framework based on pre-training and contrastive learning.pdf (Restricted Access)
Size: 1.13 MB
Format: Adobe PDF

Page view(s): 191 (updated on Jun 14, 2024)
Download(s): 15 (updated on Jun 14, 2024)
