Aspect-based sentiment analysis using BERT
| Main Author: | |
|---|---|
| Other Authors: | |
| Format: | Final Year Project |
| Language: | English |
| Published: | Nanyang Technological University, 2022 |
| Subjects: | |
| Online Access: | https://hdl.handle.net/10356/156494 |
| Institution: | Nanyang Technological University |
Summary: | Sentiment analysis is a widely adopted approach for extracting sentiment from opinion text. Sentiment analysis tasks usually assume that an entire text carries a single overall polarity, and do not consider a text in which different targets express different sentiments. Aspect-based sentiment analysis, a subtask of sentiment analysis, has therefore become increasingly popular as a way to address this issue.
Aspect-based sentiment analysis extracts and identifies fine-grained sentiment polarities for specific aspects. This experimental study implements and evaluates novel architectures for the aspect-based sentiment analysis problem. A combination of two datasets, SemEval 2014 and Sentihood, is used for the experiments. Evaluations measure the model's performance at the respective aspect detection and aspect sentiment classification stages. Previously used supervised and unsupervised deep learning techniques, as well as word embedding techniques, are studied and discussed. The state-of-the-art Bidirectional Encoder Representations from Transformers (BERT) pre-trained transformer model is a popular choice in Natural Language Processing (NLP) and gives reliable performance on tasks such as question answering and Natural Language Inference (NLI). This project implements a method of using the pre-trained BERT model for this experiment, and concludes with further evaluations, error analysis, and discussion. |
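A common way to adapt a pre-trained BERT model to aspect-based sentiment analysis, as described in the summary, is to recast the problem as sentence-pair classification: each review sentence is paired with an auxiliary sentence about one aspect, and the pair is fed to BERT in the usual `[CLS] ... [SEP] ... [SEP]` format. The sketch below illustrates only the input-construction step; it is not the project's actual code, and the aspect list, function names, and auxiliary-sentence template are hypothetical.

```python
# Hypothetical sketch of sentence-pair input construction for BERT-based
# aspect-based sentiment analysis. Not taken from the project; the aspect
# categories below merely resemble the SemEval 2014 restaurant domain.

ASPECTS = ["food", "service", "price", "ambience"]

def build_sentence_pairs(review: str, aspects=ASPECTS):
    """Pair a review sentence with one auxiliary sentence per aspect.

    Each (review, auxiliary) pair would then be tokenized for BERT as
    '[CLS] review [SEP] auxiliary [SEP]', and a classification head over
    the [CLS] token predicts a polarity such as positive/negative/neutral/none.
    """
    return [(review, f"what do you think of the {aspect}?") for aspect in aspects]

if __name__ == "__main__":
    pairs = build_sentence_pairs("The pasta was great but the staff were rude.")
    for review, auxiliary in pairs:
        # This is the textual form of the input BERT would receive.
        print(f"[CLS] {review} [SEP] {auxiliary} [SEP]")
```

Framing the task this way lets a single binary or multi-class classifier handle both aspect detection (the "none" polarity) and aspect sentiment classification, the two stages evaluated in this project.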