Full metadata record
DC Field | Value | Language
dc.contributor.author | Chng, Charlotte | en_US
dc.identifier.citation | Chng, C. (2021). Product review summarization. Final Year Project (FYP), Nanyang Technological University, Singapore. | -
dc.description.abstract | Automatic summarization systems that can condense a product review into a digestible summary help consumers arrive at a purchasing decision quickly. In this paper, we focus on abstractive summarization, a summarization technique that paraphrases, rather than copies, important information in a text to create a summary. Since abstractive summarization is a relatively nascent field, pre-trained transformer models have not yet been widely applied to producing product review summaries. This paper therefore applies pre-trained transformer models to address this gap and generate more effective abstractive summaries for product reviews. In our experiment, we used the publicly available Amazon Fine Food Reviews dataset to fine-tune a Bidirectional Encoder Representations from Transformers (BERT) model that had been pre-trained on Yelp and a separate Amazon review dataset, as well as a Robustly Optimized BERT Approach (RoBERTa) model. We then compared their Recall-Oriented Understudy for Gisting Evaluation (ROUGE) scores against those of a transformer model trained from scratch. The final results show that the pre-trained transformers, especially the RoBERTa model, outperform the transformer trained from scratch and manage to generate fairly effective abstractive product review summaries. | en_US
dc.publisher | Nanyang Technological University | en_US
dc.subject | Engineering::Computer science and engineering | en_US
dc.title | Product review summarization | en_US
dc.type | Final Year Project (FYP) | en_US
dc.contributor.supervisor | Sun Aixin | en_US
dc.contributor.school | School of Computer Science and Engineering | en_US
dc.description.degree | Bachelor of Engineering (Computer Science) | en_US
item.fulltext | With Fulltext | -
Appears in Collections: SCSE Student Reports (FYP/IA/PA/PI)
Files in This Item:
File | Description | Size | Format
 | Restricted Access | 1.18 MB | Adobe PDF
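The abstract above evaluates generated summaries with ROUGE. As a rough, self-contained illustration only (not the official ROUGE implementation, which adds stemming, ROUGE-2/ROUGE-L, and precision/F-measure variants), ROUGE-1 recall under simple whitespace tokenization can be sketched as:

```python
from collections import Counter

def rouge_n_recall(reference: str, candidate: str, n: int = 1) -> float:
    """Fraction of the reference's n-grams that also appear in the candidate."""
    def ngrams(text: str, n: int) -> Counter:
        tokens = text.lower().split()
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

    ref, cand = ngrams(reference, n), ngrams(candidate, n)
    if not ref:
        return 0.0
    # Clipped overlap: each reference n-gram counts at most as often as it occurs
    overlap = sum(min(count, cand[gram]) for gram, count in ref.items())
    return overlap / sum(ref.values())

# Hypothetical reference summary vs. a generated candidate:
# 3 of the 4 reference unigrams appear in the candidate.
print(rouge_n_recall("great tasting dog food", "the dog food tastes great"))  # → 0.75
```

Being recall-oriented, this score rewards a candidate for covering the reference's content words, which is why ROUGE is a common automatic proxy for summary quality.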

Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.