Simple and effective curriculum pointer-generator networks for reading comprehension over long narratives

This paper tackles the problem of reading comprehension over long narratives, where documents easily span thousands of tokens. We propose a curriculum learning (CL) based Pointer-Generator framework for reading and sampling over large documents, enabling diverse training of the neural model based on the notion of alternating contextual difficulty. This can be interpreted as a form of domain randomization and/or generative pretraining during training. To this end, the Pointer-Generator softens the requirement that the answer appear within the context, enabling us to construct diverse training samples for learning. Additionally, we propose a new Introspective Alignment Layer (IAL), which reasons over decomposed alignments using block-based self-attention. We evaluate our proposed method on the NarrativeQA reading comprehension benchmark, achieving state-of-the-art performance with a 51% relative improvement in BLEU-4 and a 17% relative improvement in Rouge-L over existing baselines. Extensive ablations confirm the effectiveness of the proposed IAL and CL components.
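A Pointer-Generator, as mentioned in the abstract, mixes generating a token from a fixed vocabulary with copying a token from the source text, which is what lets training proceed even when the answer string is absent from the sampled context. Below is a minimal NumPy sketch of such an output distribution; all names, shapes, and the gate value are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def pointer_generator_dist(vocab_dist, attn_dist, src_ids, p_gen):
    """Mix a generation distribution with a copy distribution.

    vocab_dist: (V,) probabilities over the vocabulary
    attn_dist:  (n,) attention weights over the n source tokens
    src_ids:    (n,) vocabulary ids of the source tokens
    p_gen:      scalar gate in [0, 1]; 1 = pure generation, 0 = pure copy
    """
    final = p_gen * vocab_dist
    # Scatter-add copy probabilities onto the source tokens' vocab entries;
    # np.add.at accumulates correctly when a token id repeats in the source.
    np.add.at(final, src_ids, (1.0 - p_gen) * attn_dist)
    return final

vocab = np.full(10, 0.1)            # uniform generation distribution over V=10
attn = np.array([0.5, 0.3, 0.2])    # attention over 3 source tokens
final = pointer_generator_dist(vocab, attn, np.array([2, 5, 2]), p_gen=0.7)
print(round(final.sum(), 6))  # 1.0
```

Because copy mass is scatter-added, a token that occurs several times in the source (id 2 above) accumulates probability from each occurrence, and the result remains a valid distribution.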


Bibliographic Details
Main Authors: TAY, Yi, WANG, Shuohang, LUU, Anh Tuan, FU, Jie, PHAN, Minh C., YUAN, Xingdi, RAO, Jinfeng, HUI, Siu Cheung, ZHANG, Aston
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2019
Subjects: OS and Networks
Online Access:https://ink.library.smu.edu.sg/scis_studentpub/1
https://ink.library.smu.edu.sg/context/scis_studentpub/article/1000/viewcontent/P19_1486.pdf
Institution: Singapore Management University
id sg-smu-ink.scis_studentpub-1000
record_format dspace
doi 10.18653/v1/P19-1486
publishDate 2019-07-01
license http://creativecommons.org/licenses/by-nc-nd/4.0/
collection SCIS Student Publications
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic OS and Networks
description This paper tackles the problem of reading comprehension over long narratives, where documents easily span thousands of tokens. We propose a curriculum learning (CL) based Pointer-Generator framework for reading and sampling over large documents, enabling diverse training of the neural model based on the notion of alternating contextual difficulty. This can be interpreted as a form of domain randomization and/or generative pretraining during training. To this end, the Pointer-Generator softens the requirement that the answer appear within the context, enabling us to construct diverse training samples for learning. Additionally, we propose a new Introspective Alignment Layer (IAL), which reasons over decomposed alignments using block-based self-attention. We evaluate our proposed method on the NarrativeQA reading comprehension benchmark, achieving state-of-the-art performance with a 51% relative improvement in BLEU-4 and a 17% relative improvement in Rouge-L over existing baselines. Extensive ablations confirm the effectiveness of the proposed IAL and CL components.
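The block-based self-attention named in the description can be understood as chunking the sequence into fixed-size blocks and applying standard scaled dot-product self-attention within each block, cutting the cost from O(n²) to roughly O(n·b) for block size b. The following is a hedged NumPy sketch under assumed names and shapes, not the paper's code:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def block_self_attention(x, block_size):
    """x: (seq_len, d); seq_len must be divisible by block_size."""
    n, d = x.shape
    assert n % block_size == 0, "pad the sequence to a multiple of block_size"
    blocks = x.reshape(n // block_size, block_size, d)     # (k, b, d)
    # Within-block attention scores: (k, b, b); tokens never attend
    # across block boundaries.
    scores = blocks @ blocks.transpose(0, 2, 1) / np.sqrt(d)
    out = softmax(scores, axis=-1) @ blocks                # (k, b, d)
    return out.reshape(n, d)

x = np.random.randn(8, 4)
y = block_self_attention(x, block_size=4)
print(y.shape)  # (8, 4)
```

In practice such a layer would use learned query/key/value projections; they are omitted here to keep the blockwise structure of the computation visible.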
format text
author TAY, Yi
WANG, Shuohang
LUU, Anh Tuan
FU, Jie
PHAN, Minh C.
YUAN, Xingdi
RAO, Jinfeng
HUI, Siu Cheung
ZHANG, Aston
title Simple and effective curriculum pointer-generator networks for reading comprehension over long narratives
publisher Institutional Knowledge at Singapore Management University
publishDate 2019
url https://ink.library.smu.edu.sg/scis_studentpub/1
https://ink.library.smu.edu.sg/context/scis_studentpub/article/1000/viewcontent/P19_1486.pdf