Triple-attention computation model for question answering
Main Author:
Other Authors:
Format: Theses and Dissertations
Language: English
Published: 2018
Subjects:
Online Access: http://hdl.handle.net/10356/75953
Institution: Nanyang Technological University
Summary: In order to assess a machine's degree of intelligence, its understanding of language is an indispensable aspect. Question answering is an important task for machine understanding of human language.
This thesis proposes a question answering system model based on three kinds of attention calculations. Compared with other existing models, the three attention calculations fully extract the information between the context and the question from different aspects, allowing the neural network to better learn a context-based representation of the question. The model consists of three layers: an embedding layer, an attention layer, and a prediction layer. The embedding layer vectorizes the words in the context and the question. The attention layer first calculates the mutual attention between the context and the question, and then calculates self-attention. Finally, the prediction layer predicts the start and end positions of the answer.
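The record does not spell out the thesis's attention formulas, so the sketch below is only an illustration of the described three-layer layout (embedding, attention, prediction) in PyTorch. It uses standard dot-product context-to-question attention, question-to-context attention, and self-attention as stand-ins for the three attention calculations; the class name `TripleAttentionQA` and all dimensions are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TripleAttentionQA(nn.Module):
    """Sketch of the three-layer model: embedding, attention, prediction."""

    def __init__(self, vocab_size=10000, emb_dim=100, hidden=128):
        super().__init__()
        # Embedding layer: vectorize the words in the context and question.
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hidden, batch_first=True,
                              bidirectional=True)
        d = 2 * hidden
        # Fuse the encoding with the three attention summaries.
        self.fuse = nn.Linear(4 * d, d)
        # Prediction layer: score each context position as answer start/end.
        self.start = nn.Linear(d, 1)
        self.end = nn.Linear(d, 1)

    def forward(self, context_ids, question_ids):
        c, _ = self.encoder(self.embed(context_ids))    # (B, Lc, d)
        q, _ = self.encoder(self.embed(question_ids))   # (B, Lq, d)
        sim = torch.bmm(c, q.transpose(1, 2))           # (B, Lc, Lq)
        # 1) Context-to-question attention: question-aware context words.
        c2q = torch.bmm(F.softmax(sim, dim=-1), q)
        # 2) Question-to-context attention: context summary, tiled per word.
        w = F.softmax(sim.max(dim=-1).values, dim=-1)   # (B, Lc)
        q2c = torch.bmm(w.unsqueeze(1), c).expand(-1, c.size(1), -1)
        # 3) Self-attention over the context.
        c2c = torch.bmm(F.softmax(torch.bmm(c, c.transpose(1, 2)),
                                  dim=-1), c)
        h = torch.tanh(self.fuse(torch.cat([c, c2q, q2c, c2c], dim=-1)))
        # Start/end logits over context positions.
        return self.start(h).squeeze(-1), self.end(h).squeeze(-1)

model = TripleAttentionQA()
ctx = torch.randint(0, 10000, (2, 50))      # batch of 2 contexts, 50 tokens
qst = torch.randint(0, 10000, (2, 12))      # matching questions, 12 tokens
start_logits, end_logits = model(ctx, qst)  # each of shape (2, 50)
```

In this sketch the GRU encoder is shared between the context and the question; the thesis reports results with different RNN architectures, so the recurrent cell is one of the interchangeable parts.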
In experiments on the SQuAD dataset, the model outperforms the main reference model in both EM and F1 scores across the different RNN architectures tried. In addition, the model performs well among the question answering models proposed in recent years, surpassing many classical models and showing strong competitiveness.
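Since the summary reports results as EM and F1, a minimal sketch of how these two SQuAD metrics are computed may help. It assumes simple whitespace tokenization; the official SQuAD evaluation script additionally lowercases and strips punctuation and articles before comparing.

```python
from collections import Counter

def exact_match(prediction: str, truth: str) -> float:
    # EM: 1.0 only if the predicted span equals the gold answer exactly.
    return float(prediction.strip() == truth.strip())

def f1_score(prediction: str, truth: str) -> float:
    # F1: harmonic mean of token-level precision and recall.
    pred_tokens = prediction.split()
    truth_tokens = truth.split()
    common = Counter(pred_tokens) & Counter(truth_tokens)
    overlap = sum(common.values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(truth_tokens)
    return 2 * precision * recall / (precision + recall)

print(exact_match("attention layer", "attention layer"))  # 1.0
print(f1_score("the attention layer", "attention layer"))  # 0.8
```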
Keywords: Natural language processing, Recurrent neural network, Question answering, Attention mechanism