Product review summarization

Automatic summarization systems that can condense a product review into a digestible summary help consumers arrive at a purchasing decision quickly. In this paper, we focus on abstractive summarization, a technique that paraphrases, rather than copies, the important information in a text to create a summary. Because abstractive summarization is a relatively nascent field, pre-trained transformer models have not yet been widely used to produce product review summaries. This paper therefore applies pre-trained transformer models to address that gap and to generate more effective abstractive summaries for product reviews. In our experiment, we used the publicly available Amazon fine food reviews dataset to fine-tune a Bidirectional Encoder Representations from Transformers (BERT) model pre-trained on Yelp and a separate Amazon review dataset, as well as a Robustly Optimized BERT Pretraining Approach (RoBERTa) model. We then compared their Recall-Oriented Understudy for Gisting Evaluation (ROUGE) scores against those of a transformer model trained from scratch. The final results show that the pre-trained transformers, especially the RoBERTa model, outperform the transformer trained from scratch and generate reasonably effective abstractive product review summaries.
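The abstract describes fine-tuning pre-trained BERT and RoBERTa models on the Amazon fine food reviews dataset. The sketch below shows one plausible way to set this up with the Hugging Face transformers and datasets libraries; the checkpoint names, column names, and hyperparameters are illustrative assumptions, not the project's actual configuration (which is not included in this record).

from datasets import load_dataset
from transformers import (
    BertTokenizerFast,
    EncoderDecoderModel,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

# Warm-start a BERT2BERT encoder-decoder from pre-trained BERT checkpoints.
# BERT is encoder-only, so summarization needs an encoder-decoder wrapper;
# substituting "roberta-base" (with RobertaTokenizerFast) gives a RoBERTa variant.
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# The Amazon fine food reviews CSV has a "Text" column (review body) and a
# "Summary" column (short review title) that can serve as the target summary.
reviews = load_dataset("csv", data_files="Reviews.csv")["train"]

def preprocess(batch):
    model_inputs = tokenizer(
        batch["Text"], max_length=512, truncation=True, padding="max_length"
    )
    labels = tokenizer(
        batch["Summary"], max_length=32, truncation=True, padding="max_length"
    )
    # Replace padding token ids with -100 so they are ignored by the loss.
    model_inputs["labels"] = [
        [(tok if tok != tokenizer.pad_token_id else -100) for tok in seq]
        for seq in labels["input_ids"]
    ]
    return model_inputs

tokenized = reviews.map(preprocess, batched=True, remove_columns=reviews.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="bert2bert-reviews",
        per_device_train_batch_size=8,
        num_train_epochs=3,
    ),
    train_dataset=tokenized,
)
trainer.train()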

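The models are compared by ROUGE score. Below is a minimal scoring sketch using the rouge_score package, a common ROUGE implementation; the reference and generated strings are placeholders, not outputs from the project.

from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"], use_stemmer=True)
reference = "great coffee at a fair price"  # gold summary (placeholder)
generated = "good coffee for the price"     # model output (placeholder)
scores = scorer.score(reference, generated)
for name, score in scores.items():
    print(f"{name}: P={score.precision:.3f} R={score.recall:.3f} F1={score.fmeasure:.3f}")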

Bibliographic Details
Main Author: Chng, Charlotte
Other Authors: Sun Aixin (AXSun@ntu.edu.sg)
School: School of Computer Science and Engineering
Format: Final Year Project (FYP)
Degree: Bachelor of Engineering (Computer Science)
Language: English
Published: Nanyang Technological University, 2021
Subjects: Engineering::Computer science and engineering
Project code: SCSE20-0951
Citation: Chng, C. (2021). Product review summarization. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/153434
Online Access: https://hdl.handle.net/10356/153434