INTEGRATING FIRST-ORDER LOGIC FOR READING COMPREHENSION MODEL ON RECLOR DATASET
ReClor is a Machine Reading Comprehension dataset that requires logical reasoning. Studies have shown that pre-trained language models are still weak in reasoning tasks. Several studies have been conducted to address this shortcoming. However, the existing approaches tend to improve the lan...
Saved in:
Main Author: | Nuraeni, Wulantika |
Format: | Theses |
Language: | Indonesia |
Online Access: | https://digilib.itb.ac.id/gdl/view/85304 |
Institution: | Institut Teknologi Bandung |
id |
id-itb.:85304 |
spelling |
id-itb.:85304 2024-08-20T10:05:55Z INTEGRATING FIRST-ORDER LOGIC FOR READING COMPREHENSION MODEL ON RECLOR DATASET Nuraeni, Wulantika Indonesia Theses machine reading comprehension, first-order logic, graph neural network, joint loss training INSTITUT TEKNOLOGI BANDUNG https://digilib.itb.ac.id/gdl/view/85304 ReClor is a Machine Reading Comprehension dataset that requires logical reasoning. Studies have shown that pre-trained language models are still weak in reasoning tasks. Several studies have been conducted to address this shortcoming. However, the existing approaches tend to improve the language model’s capabilities through statistical features rather than performing actual reasoning. Logical reasoning systems typically involve logical symbols and strict inference rules. Some studies have attempted to combine the strengths of statistical learning and logical reasoning using a hybrid approach between pre-trained language models and logic programming. Given the complexity of sentences and the diversity of question types in the ReClor data, this hybrid approach might be inadequate. This study proposes an approach using Graph Neural Networks, which can learn from structured data and perform reasoning using First-Order Logic, which is more expressive in representing relationships between objects. The Graph Neural Network model is then combined with a language model using joint loss training to enhance the language model’s performance on the Machine Reading Comprehension task. Study results show that the Graph Neural Network model trained on First-Order Logic data can process logical symbols and learn resolution paths, but its performance heavily depends on the accuracy of the semantic parser that converts ReClor’s text data into first-order logic. Additionally, the combination of loss values from the graph model and language model effectively improved RoBERTa’s performance but did not significantly enhance BERT’s performance.
This outcome is due to the next-sentence prediction task during BERT’s pre-training, which may limit performance in downstream tasks, unlike RoBERTa, which removes this task during pre-training and achieves accuracy improvements using the proposed method. text |
institution |
Institut Teknologi Bandung |
building |
Institut Teknologi Bandung Library |
continent |
Asia |
country |
Indonesia |
content_provider |
Institut Teknologi Bandung |
collection |
Digital ITB |
language |
Indonesia |
description |
ReClor is a Machine Reading Comprehension dataset that requires logical
reasoning. Studies have shown that pre-trained language models are still weak in
reasoning tasks. Several studies have been conducted to address this shortcoming.
However, the existing approaches tend to improve the language model’s
capabilities through statistical features rather than performing actual reasoning.
Logical reasoning systems typically involve logical symbols and strict inference
rules. Some studies have attempted to combine the strengths of statistical learning
and logical reasoning using a hybrid approach between pre-trained language
models and logic programming. Given the complexity of sentences and the diversity
of question types in the ReClor data, this hybrid approach might be inadequate.
This study proposes an approach using Graph Neural Networks, which can learn
from structured data and perform reasoning using First-Order Logic, which is
more expressive in representing relationships between objects. The Graph
Neural Network model is then combined with a language model using joint loss
training to enhance the language model’s performance on the Machine Reading
Comprehension task.
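The joint loss training described above can be sketched as a weighted combination of the two models’ cross-entropy losses over ReClor’s four answer options. The mixing weight `alpha` and the function names here are illustrative assumptions, not the thesis’s actual formulation.

```python
import math

def cross_entropy(logits, gold):
    """Cross-entropy for one example: -log softmax(logits)[gold]."""
    m = max(logits)
    log_z = m + math.log(sum(math.exp(x - m) for x in logits))
    return log_z - logits[gold]

def joint_loss(lm_logits, gnn_logits, gold, alpha=0.5):
    """Weighted sum of the language-model and graph-model losses.

    alpha is a hypothetical mixing weight; the record does not say
    how the two loss values are actually combined.
    """
    return (alpha * cross_entropy(lm_logits, gold)
            + (1 - alpha) * cross_entropy(gnn_logits, gold))

# Four answer options, matching ReClor's multiple-choice format.
lm_logits = [2.0, 0.5, -1.0, 0.1]   # e.g. RoBERTa answer scores
gnn_logits = [1.2, 0.3, -0.5, 0.0]  # graph-model answer scores
loss = joint_loss(lm_logits, gnn_logits, gold=0)
```

In training, this combined loss would be backpropagated through both networks; with `alpha=1.0` the objective reduces to ordinary language-model fine-tuning.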
Study results show that the Graph Neural Network model trained on First-Order
Logic data can process logical symbols and learn resolution paths, but its
performance heavily depends on the accuracy of the semantic parser that converts
ReClor’s text data into first-order logic. Additionally, the combination of loss
values from the graph model and language model effectively improved RoBERTa’s
performance but did not significantly enhance BERT’s performance. This outcome
is due to the next-sentence prediction task during BERT’s pre-training, which may
limit performance in downstream tasks, unlike RoBERTa, which removes this
task during pre-training and achieves accuracy improvements using the proposed
method. |
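As a concrete illustration of the resolution paths mentioned in the abstract, the sketch below performs a single resolution step on clauses encoded as sets of literals. This is a propositional simplification (full first-order resolution also requires variable unification), and the clause encoding is an assumption for illustration; the thesis’s actual graph inputs come from its semantic parser.

```python
# A clause is a frozenset of literals; a literal is (predicate, polarity).
# "Every student studies" contributes the clause ¬student ∨ studies.
def resolve(c1, c2):
    """Return all resolvents of two clauses, one per complementary literal pair."""
    resolvents = []
    for (pred, pol) in c1:
        if (pred, not pol) in c2:
            # Cancel the complementary pair, union the remaining literals.
            rest = (c1 - {(pred, pol)}) | (c2 - {(pred, not pol)})
            resolvents.append(frozenset(rest))
    return resolvents

kb = [
    frozenset({("student", False), ("studies", True)}),  # student -> studies
    frozenset({("student", True)}),                      # fact: student
]
# The complementary 'student' literals cancel, deriving 'studies'.
step = resolve(kb[0], kb[1])
# step == [frozenset({("studies", True)})]
```

A chain of such steps, ending in the empty clause, is the kind of resolution path a prover follows, and, per the abstract, what the graph model is trained to learn.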
format |
Theses |
author |
Nuraeni, Wulantika |
spellingShingle |
Nuraeni, Wulantika INTEGRATING FIRST-ORDER LOGIC FOR READING COMPREHENSION MODEL ON RECLOR DATASET |
author_facet |
Nuraeni, Wulantika |
author_sort |
Nuraeni, Wulantika |
title |
INTEGRATING FIRST-ORDER LOGIC FOR READING COMPREHENSION MODEL ON RECLOR DATASET |
title_short |
INTEGRATING FIRST-ORDER LOGIC FOR READING COMPREHENSION MODEL ON RECLOR DATASET |
title_full |
INTEGRATING FIRST-ORDER LOGIC FOR READING COMPREHENSION MODEL ON RECLOR DATASET |
title_fullStr |
INTEGRATING FIRST-ORDER LOGIC FOR READING COMPREHENSION MODEL ON RECLOR DATASET |
title_full_unstemmed |
INTEGRATING FIRST-ORDER LOGIC FOR READING COMPREHENSION MODEL ON RECLOR DATASET |
title_sort |
integrating first-order logic for reading comprehension model on reclor dataset |
url |
https://digilib.itb.ac.id/gdl/view/85304 |
_version_ |
1822283087256485888 |