Product review summarization
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2021
Online Access: https://hdl.handle.net/10356/153434
Institution: Nanyang Technological University
Summary: Automatic summarization systems that condense product reviews into digestible summaries help consumers arrive at purchasing decisions quickly. This paper focuses on abstractive summarization, a technique that paraphrases, rather than copies, important information in a text to create a summary. Because abstractive summarization is a relatively nascent field, pre-trained transformer models have not yet been widely applied to product review summarization. This paper therefore applies pre-trained transformer models to address this gap and generate more effective abstractive summaries for product reviews. In our experiment, we used the publicly available Amazon fine food reviews dataset to fine-tune a Bidirectional Encoder Representations from Transformers (BERT) model pre-trained on Yelp and a separate Amazon review dataset, as well as a Robustly Optimized BERT Pretraining Approach (RoBERTa) model. We then compared their Recall-Oriented Understudy for Gisting Evaluation (ROUGE) scores with those of a transformer model trained from scratch. Final results show that the pre-trained transformers, especially the RoBERTa model, outperform the transformer trained from scratch and generate fairly effective abstractive product review summaries.
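The abstract describes the pipeline only at a high level. The sketch below illustrates the two steps it names, fine-tuning a pre-trained BERT-based model on review/summary pairs and scoring a generated summary with ROUGE, assuming the Hugging Face `transformers` library with a BERT2BERT encoder-decoder (BERT itself is encoder-only, so generation requires pairing it with a decoder) and Google's `rouge_score` package. The checkpoint names, the sample review, and the hyperparameters are illustrative assumptions, not the setup used in the project.

```python
# A minimal sketch (not the authors' code) of fine-tuning a pre-trained
# transformer for abstractive review summarization and evaluating with ROUGE.
import torch
from transformers import BertTokenizerFast, EncoderDecoderModel
from rouge_score import rouge_scorer

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")

# Tie a pre-trained BERT encoder to a BERT decoder (a "BERT2BERT" setup),
# since an encoder-only BERT cannot generate summaries by itself.
model = EncoderDecoderModel.from_encoder_decoder_pretrained(
    "bert-base-uncased", "bert-base-uncased"
)
model.config.decoder_start_token_id = tokenizer.cls_token_id
model.config.eos_token_id = tokenizer.sep_token_id
model.config.pad_token_id = tokenizer.pad_token_id

# One illustrative (review, summary) pair standing in for the
# Amazon fine food reviews dataset used in the paper.
review = ("These cookies arrived fresh and taste wonderful, "
          "though the packaging was slightly dented.")
summary = "Fresh, tasty cookies; minor packaging damage."

inputs = tokenizer(review, return_tensors="pt", truncation=True, max_length=512)
labels = tokenizer(summary, return_tensors="pt", truncation=True, max_length=64)

# A single fine-tuning step: the model learns to reproduce the
# reference summary from the review text.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
outputs = model(input_ids=inputs.input_ids,
                attention_mask=inputs.attention_mask,
                labels=labels.input_ids)
outputs.loss.backward()
optimizer.step()

# Generate a summary and score it with ROUGE, mirroring the evaluation
# described in the abstract.
model.eval()
with torch.no_grad():
    generated = model.generate(inputs.input_ids, max_length=64, num_beams=4)
prediction = tokenizer.decode(generated[0], skip_special_tokens=True)

scorer = rouge_scorer.RougeScorer(["rouge1", "rouge2", "rougeL"],
                                  use_stemmer=True)
print(scorer.score(summary, prediction))
```

In practice, the RoBERTa variant would presumably swap in `roberta-base` checkpoints for the encoder and decoder, and training would loop over the full dataset for several epochs rather than taking a single step on one pair.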