Enhance QANet by BERT for machine reading comprehension

Bibliographic Details
Main Author: Yin, Bo
Other Authors: Chen Lihui
Format: Theses and Dissertations
Language: English
Published: 2019
Online Access:http://hdl.handle.net/10356/78809
Institution: Nanyang Technological University
Description
Summary: The aim of this dissertation is to study the implementation of QANet on SQuAD for the machine reading comprehension task, and to enhance QANet by introducing the contextual representation architecture BERT. First, several techniques employed to capture dependencies among words in question answering are reviewed, including recurrent neural networks and the attention mechanism. This is followed by a review of the popular NLP downstream task of question answering and its subclasses, among which machine reading comprehension is the most important, and then by a review of two categories of reading comprehension models. Next, word representation methods are introduced in detail, together with an account of how they have evolved over time. The architecture of one reading comprehension model, QANet, is reviewed with an introduction to each layer. The core of this dissertation lies in understanding the implementation of QANet, finding the best performance of the original QANet by tuning hyperparameters, and then applying the state-of-the-art contextual word representation method BERT to enhance QANet. The experimental results show that BERT improves the F1 score on the validation dataset by 7.2 percent. An ablation study has been conducted to verify how the model components of QANet, such as the attention mechanism used to capture global interactions, contribute to the overall performance of the model. The QANet-with-BERT model is also compared to a BiDAF-with-BERT model to demonstrate that QANet with BERT may outperform an RNN-based model with BERT on some datasets.
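The 7.2 percent gain reported above refers to the standard SQuAD F1 metric, which measures token-level overlap between a predicted answer span and the gold answer. As background, a minimal sketch of that computation, modelled on the official SQuAD evaluation script (function names here are illustrative, not from the dissertation), might look like this:

```python
import re
import string
from collections import Counter


def normalize_answer(s: str) -> str:
    """Lowercase, strip punctuation and articles, and collapse whitespace."""
    s = s.lower()
    s = "".join(ch for ch in s if ch not in set(string.punctuation))
    s = re.sub(r"\b(a|an|the)\b", " ", s)
    return " ".join(s.split())


def f1_score(prediction: str, ground_truth: str) -> float:
    """Token-level F1 between a predicted and a gold answer span."""
    pred_tokens = normalize_answer(prediction).split()
    gold_tokens = normalize_answer(ground_truth).split()
    # Multiset intersection counts each shared token at most min(count) times.
    common = Counter(pred_tokens) & Counter(gold_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(gold_tokens)
    return 2 * precision * recall / (precision + recall)


# Example: 2 shared tokens, precision 1.0, recall 0.5 -> F1 ~ 0.667
print(f1_score("the Eiffel Tower", "Eiffel Tower in Paris"))
```

In the SQuAD setting, this per-example score is taken as the maximum over all provided gold answers and averaged over the dataset, so a 7.2-point change reflects an aggregate improvement across the validation set.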