Improving the numerical reasoning skills of question answer systems in finance
The field of natural language processing (NLP) has witnessed remarkable progress in recent years. Among its many applications, question answering (Q/A) systems have emerged as a crucial tool for accessing information across domains. In the financial sector, where complex calculations and in-depth analysis are required, accuracy and correctness are of the utmost importance for a Q/A system.
Main Author: Bhatia Nipun
Other Authors: Shen Zhiqi
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2024
Subjects: Computer and Information Science; NLP; AI; Question answer systems; Finance
Online Access: https://hdl.handle.net/10356/175338
Institution: Nanyang Technological University
id: sg-ntu-dr.10356-175338
record_format: dspace
spelling: Improving the numerical reasoning skills of question answer systems in finance. Bhatia Nipun; Shen Zhiqi (ZQShen@ntu.edu.sg), School of Computer Science and Engineering. Subjects: Computer and Information Science; NLP; AI; Question answer systems; Finance. Bachelor's degree, Final Year Project (FYP). Citation: Bhatia Nipun (2024). Improving the numerical reasoning skills of question answer systems in finance. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/175338. Deposited 2024-04-23. Project code SCSE23-0441. English. application/pdf. Nanyang Technological University.
institution: Nanyang Technological University
building: NTU Library
continent: Asia
country: Singapore
content_provider: NTU Library
collection: DR-NTU
language: English
topic: Computer and Information Science; NLP; AI; Question answer systems; Finance
description: The field of natural language processing (NLP) has witnessed remarkable progress in recent years. Among its many applications, question answering (Q/A) systems have emerged as a crucial tool for accessing information across domains. In the financial sector, where complex calculations and in-depth analysis are required, accuracy and correctness are of the utmost importance for a Q/A system, so such a system holds immense potential and value in this domain. In this paper, we first introduce a novel method for improving the performance of the Text-to-Text Transfer Transformer (T5) model on question answering tasks based on the Discrete Reasoning Over Paragraphs (DROP) dataset. The key innovation proposed in this study is a custom numerical attention layer, intended to improve the model's performance on numerical queries, an area in which it has traditionally had considerable difficulty. This layer updates the model's representations to increase focus on the portions of the input text that contain numerical values and related references. In addition, explicit question types are provided as input to the model, guiding it to better understand the task at hand. We then fine-tune this enhanced T5 model, with its numerical attention layer, on a wide variety of Q/A datasets, including DROP, to improve its general task knowledge and capabilities, and compare its performance against other state-of-the-art models, including those employing traditional fine-tuning strategies as well as techniques aimed at improving numerical reasoning.

The results demonstrate a notable improvement in T5's numerical Q/A proficiency when the proposed numerical attention layer is combined with explicit question-type information. This work aims to contribute to the field of NLP by demonstrating how tailored model-architecture enhancements and dataset augmentation can improve state-of-the-art models on complex, domain-specific tasks, enabling them to carry out accurate financial question answering on par with human analysts.
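The description above names two mechanisms without spelling out their form: a numerical attention layer that shifts the model's focus toward number-bearing tokens, and explicit question-type tags supplied with the input. The sketch below, in PyTorch with Hugging Face Transformers, shows one plausible way to realise both ideas. Everything in it is an assumption for illustration: the class NumericalAttentionLayer, the digit-regex masking heuristic, the boost scaling, the prefix format, and the public t5-small checkpoint are not taken from the project itself.

```python
import re

import torch
import torch.nn as nn
from transformers import AutoTokenizer, T5EncoderModel


class NumericalAttentionLayer(nn.Module):
    """Self-attention block that up-weights number-bearing token positions."""

    def __init__(self, hidden_size: int, num_heads: int = 8, boost: float = 1.0):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_size, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(hidden_size)
        self.boost = boost  # strength of the emphasis on numeric positions

    def forward(self, hidden_states: torch.Tensor, numeric_mask: torch.Tensor):
        # numeric_mask: (batch, seq_len), 1.0 where the token contains a digit.
        # Amplify numeric positions before self-attention so they contribute
        # more to the attended summary, then add the result back residually.
        scaled = hidden_states * (1.0 + self.boost * numeric_mask.unsqueeze(-1))
        attended, _ = self.attn(scaled, scaled, scaled)
        return self.norm(hidden_states + attended)


def add_question_type_prefix(question: str, question_type: str) -> str:
    """Make the question type (e.g. 'number', 'span', 'date') explicit in the input."""
    return f"question type: {question_type} question: {question}"


def encode_with_numeric_mask(texts, tokenizer, max_length=256):
    """Tokenize the inputs and mark which token positions carry digits."""
    enc = tokenizer(texts, padding=True, truncation=True,
                    max_length=max_length, return_tensors="pt")
    mask = torch.zeros_like(enc["input_ids"], dtype=torch.float)
    for i, ids in enumerate(enc["input_ids"].tolist()):
        for j, tok in enumerate(tokenizer.convert_ids_to_tokens(ids)):
            if re.search(r"\d", tok):
                mask[i, j] = 1.0
    return enc, mask


if __name__ == "__main__":
    tokenizer = AutoTokenizer.from_pretrained("t5-small")
    encoder = T5EncoderModel.from_pretrained("t5-small")
    layer = NumericalAttentionLayer(hidden_size=encoder.config.d_model)

    text = add_question_type_prefix(
        "How many yards longer was the second field goal than the first?", "number"
    ) + " passage: The first field goal was 32 yards and the second was 45 yards."
    enc, numeric_mask = encode_with_numeric_mask([text], tokenizer)

    hidden = encoder(input_ids=enc["input_ids"],
                     attention_mask=enc["attention_mask"]).last_hidden_state
    enhanced = layer(hidden, numeric_mask)  # numerically re-weighted representations
    print(enhanced.shape)  # (1, seq_len, d_model)
```

The residual connection plus layer norm used here is only one conventional way to fold a re-weighted attention pass back into the encoder states; the thesis may wire its layer into T5 differently before fine-tuning on DROP and the other Q/A datasets.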
author2: Shen Zhiqi
format: Final Year Project
author: Bhatia Nipun
title: Improving the numerical reasoning skills of question answer systems in finance
publisher: Nanyang Technological University
publishDate: 2024
url: https://hdl.handle.net/10356/175338
_version_: 1800916379405123584