Question answering model using deep learning with attention mechanism and its applications
This project is about the experimental study and implementation of Question & Answer (Q&A) systems for possible applications. Q&A is a sub-domain of Natural Language Processing (NLP) that focuses on machine understanding and answering of human questions based on a given context. Bac...
Main Author: | Chew, Aaron Weng Kit |
---|---|
Other Authors: | Chen Lihui |
Format: | Final Year Project |
Language: | English |
Published: | 2019 |
Subjects: | DRNTU::Engineering::Electrical and electronic engineering |
Online Access: | http://hdl.handle.net/10356/77955 |
Institution: | Nanyang Technological University |
id |
sg-ntu-dr.10356-77955 |
---|---|
record_format |
dspace |
spelling |
sg-ntu-dr.10356-77955 2023-07-07T15:54:04Z Question answering model using deep learning with attention mechanism and its applications Chew, Aaron Weng Kit Chen Lihui School of Electrical and Electronic Engineering DRNTU::Engineering::Electrical and electronic engineering This project is about the experimental study and implementation of Question & Answer (Q&A) systems for possible applications. Q&A is a sub-domain of Natural Language Processing (NLP) that focuses on machine understanding and answering of human questions based on a given context. Background knowledge on traditional NLP methods and components is reviewed and discussed. A comprehensive study was done on current state-of-the-art Q&A models and their useful components, such as the attention mechanism, the transformer, and pointer networks. Available online implementations of these models were compared against their published scores before being adopted as implementation benchmarks on SQuAD. Three revised models were subsequently proposed to explore whether any improvements could be made on Q&A-related tasks. BERT was chosen as the basis for the three models, namely Singlish BERT, Multilingual BERT, and Q&A in a System, as BERT offered the most generalised language model that could be tweaked for specific tasks. Although no improvements were achieved, the three models served as a platform for understanding the concepts and implementations of the individual components in each model, and for integrating them with others in potential Q&A-related applications. Prototypes of several applications were developed to demonstrate the ideas and potential of these systems. This report presents the reviews, methodologies, and implementation details used in the experimental studies, followed by discussion and analysis of the results obtained. These findings form important groundwork and a compilation of the key concepts necessary for understanding the NLP/Q&A domain. The report also highlights a clear direction for further improvement, backed by comprehensive tests and results. Bachelor of Engineering (Information Engineering and Media) 2019-06-10T06:37:51Z 2019-06-10T06:37:51Z 2019 Final Year Project (FYP) http://hdl.handle.net/10356/77955 en Nanyang Technological University 58 p. application/pdf |
institution |
Nanyang Technological University |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
language |
English |
topic |
DRNTU::Engineering::Electrical and electronic engineering |
description |
This project is about the experimental study and implementation of Question & Answer (Q&A) systems for possible applications. Q&A is a sub-domain of Natural Language Processing (NLP) that focuses on machine understanding and answering of human questions based on a given context. Background knowledge on traditional NLP methods and components is reviewed and discussed. A comprehensive study was done on current state-of-the-art Q&A models and their useful components, such as the attention mechanism, the transformer, and pointer networks. Available online implementations of these models were compared against their published scores before being adopted as implementation benchmarks on SQuAD. Three revised models were subsequently proposed to explore whether any improvements could be made on Q&A-related tasks. BERT was chosen as the basis for the three models, namely Singlish BERT, Multilingual BERT, and Q&A in a System, as BERT offered the most generalised language model that could be tweaked for specific tasks. Although no improvements were achieved, the three models served as a platform for understanding the concepts and implementations of the individual components in each model, and for integrating them with others in potential Q&A-related applications. Prototypes of several applications were developed to demonstrate the ideas and potential of these systems.

This report presents the reviews, methodologies, and implementation details used in the experimental studies, followed by discussion and analysis of the results obtained. These findings form important groundwork and a compilation of the key concepts necessary for understanding the NLP/Q&A domain. The report also highlights a clear direction for further improvement, backed by comprehensive tests and results. |
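The abstract above names the key building blocks of modern Q&A models. For context, here is a minimal sketch of scaled dot-product attention, the mechanism referenced in the title; the function, variable names, and toy shapes are illustrative assumptions, not code from the thesis.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Compute softmax(Q K^T / sqrt(d_k)) V, the core of transformer attention."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # similarity of each query to each key
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability for softmax
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the key positions
    return weights @ V                               # weighted sum of the values

# Toy example: 2 query positions attending over 3 context positions.
rng = np.random.default_rng(0)
Q = rng.standard_normal((2, 4))
K = rng.standard_normal((3, 4))
V = rng.standard_normal((3, 4))
print(scaled_dot_product_attention(Q, K, V).shape)  # (2, 4)
```

The SQuAD-style extractive Q&A that the project builds on BERT can likewise be tried with the Hugging Face transformers library. This is a generic usage sketch assuming a publicly available SQuAD-fine-tuned checkpoint, not the models trained in this project.

```python
from transformers import pipeline

# Generic extractive Q&A with a BERT-family checkpoint fine-tuned on SQuAD.
# The model name is an assumption, not the Singlish/Multilingual BERT from the report.
qa = pipeline("question-answering", model="distilbert-base-cased-distilled-squad")
result = qa(
    question="What does Q&A focus on?",
    context="Q&A is a sub-domain of NLP that focuses on machine understanding "
            "and answering of human questions based on a given context.",
)
print(result["answer"])
```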
author2 |
Chen Lihui |
format |
Final Year Project |
author |
Chew, Aaron Weng Kit |
title |
Question answering model using deep learning with attention mechanism and its applications |
publishDate |
2019 |
url |
http://hdl.handle.net/10356/77955 |
_version_ |
1772827300893556736 |