Multi-hop question generation with knowledge graph-enhanced language model
The task of multi-hop question generation (QG) seeks to generate questions that require a complex reasoning process spanning multiple sentences and answers. Beyond the conventional challenges of what to ask and how to ask, multi-hop QG necessitates sophisticated reasoning over evidence dispersed across multiple sentences...
Main Authors: Li, Zhenping; Cao, Zhen; Li, Pengfei; Zhong, Yong; Li, Shaobo
Other Authors: School of Electrical and Electronic Engineering
Format: Article
Language: English
Published: 2023
Subjects: Engineering::Electrical and electronic engineering; Multi-Hop Question Generation; Graph Neural Network
Online Access: https://hdl.handle.net/10356/169522
Institution: Nanyang Technological University
id: sg-ntu-dr.10356-169522
record_format: dspace
spelling:
sg-ntu-dr.10356-169522 2023-07-21T15:40:39Z
Multi-hop question generation with knowledge graph-enhanced language model
Li, Zhenping; Cao, Zhen; Li, Pengfei; Zhong, Yong; Li, Shaobo
School of Electrical and Electronic Engineering
Engineering::Electrical and electronic engineering; Multi-Hop Question Generation; Graph Neural Network
The task of multi-hop question generation (QG) seeks to generate questions that require a complex reasoning process spanning multiple sentences and answers. Beyond the conventional challenges of what to ask and how to ask, multi-hop QG necessitates sophisticated reasoning over evidence dispersed across multiple sentences. To address these challenges, a knowledge graph-enhanced language model (KGEL) has been developed to imitate human reasoning for multi-hop questions. KGEL first encodes the input sentence with a pre-trained GPT-2 language model to obtain a comprehensive semantic context representation. Next, a knowledge graph is constructed from the entities identified within the context. The answer-related information in the graph is then used to update the context representations through an answer-aware graph attention network (GAT). Finally, a multi-head attention generation module (MHAG) operates over the updated latent representations of the context to generate coherent questions. Human evaluations demonstrate that KGEL generates more logical and fluent multi-hop questions than GPT-2. Furthermore, KGEL outperforms five prominent baselines in automatic evaluations, with a BLEU-4 score 27% higher than that of GPT-2.
Published version. This work was supported by the AI industrial technology innovation platform of Sichuan Province, grant number "2020ZHCG0002".
2023-07-21T06:46:58Z 2023-07-21T06:46:58Z 2023 Journal Article
Li, Z., Cao, Z., Li, P., Zhong, Y. & Li, S. (2023). Multi-hop question generation with knowledge graph-enhanced language model. Applied Sciences, 13(9), 5765. https://dx.doi.org/10.3390/app13095765
ISSN: 2076-3417. Handle: https://hdl.handle.net/10356/169522. DOI: 10.3390/app13095765. Scopus: 2-s2.0-85159323152. Volume 13, Issue 9, Article 5765. Language: en. Journal: Applied Sciences.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). application/pdf
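The abstract above says a knowledge graph is constructed from the entities identified within the context. As a rough, self-contained illustration of that step, the sketch below links entities that co-occur in the same sentence; a naive capitalised-span heuristic stands in for a real named-entity recogniser (an assumption — the record does not specify which NER tool the authors used):

```python
import re
from itertools import combinations

def build_entity_graph(context):
    """Build a toy entity co-occurrence graph from raw text.

    Entities are approximated by capitalised word spans (illustrative
    stand-in for a proper NER system). Entities mentioned in the same
    sentence become neighbouring nodes in the graph.
    """
    sentences = re.split(r"(?<=[.!?])\s+", context)
    nodes, edges = set(), set()
    for sent in sentences:
        # capitalised single- or multi-word spans, e.g. "Marie Curie"
        ents = re.findall(r"\b[A-Z][a-z]+(?:\s[A-Z][a-z]+)*\b", sent)
        ents = [e for e in ents if len(e) > 2]
        nodes.update(ents)
        # connect every pair of entities that share a sentence
        for u, v in combinations(sorted(set(ents)), 2):
            edges.add((u, v))
    return sorted(nodes), sorted(edges)

context = ("Marie Curie was born in Warsaw. "
           "Marie Curie later moved to Paris to study physics.")
nodes, edges = build_entity_graph(context)
print(nodes)   # ['Marie Curie', 'Paris', 'Warsaw']
print(edges)   # [('Marie Curie', 'Paris'), ('Marie Curie', 'Warsaw')]
```

In the paper's pipeline such a graph would be built over entity mentions in the multi-hop context, with node features taken from the language-model encoder rather than the raw strings used here.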
institution: Nanyang Technological University
building: NTU Library
continent: Asia
country: Singapore
content_provider: NTU Library
collection: DR-NTU
language: English
topic: Engineering::Electrical and electronic engineering; Multi-Hop Question Generation; Graph Neural Network
spellingShingle: Engineering::Electrical and electronic engineering; Multi-Hop Question Generation; Graph Neural Network; Li, Zhenping; Cao, Zhen; Li, Pengfei; Zhong, Yong; Li, Shaobo; Multi-hop question generation with knowledge graph-enhanced language model
description:
The task of multi-hop question generation (QG) seeks to generate questions that require a complex reasoning process spanning multiple sentences and answers. Beyond the conventional challenges of what to ask and how to ask, multi-hop QG necessitates sophisticated reasoning over evidence dispersed across multiple sentences. To address these challenges, a knowledge graph-enhanced language model (KGEL) has been developed to imitate human reasoning for multi-hop questions. KGEL first encodes the input sentence with a pre-trained GPT-2 language model to obtain a comprehensive semantic context representation. Next, a knowledge graph is constructed from the entities identified within the context. The answer-related information in the graph is then used to update the context representations through an answer-aware graph attention network (GAT). Finally, a multi-head attention generation module (MHAG) operates over the updated latent representations of the context to generate coherent questions. Human evaluations demonstrate that KGEL generates more logical and fluent multi-hop questions than GPT-2. Furthermore, KGEL outperforms five prominent baselines in automatic evaluations, with a BLEU-4 score 27% higher than that of GPT-2.
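The answer-aware GAT update described here can be sketched as a single graph-attention layer in NumPy. The `answer_bias` term, which raises attention logits toward answer-span nodes before normalisation, is one plausible reading of "answer-aware", not the authors' exact formulation; all shapes, names, and parameters are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def answer_aware_gat_layer(H, A, answer_mask, W, a_src, a_dst,
                           leak=0.2, answer_bias=1.0):
    """One answer-aware graph-attention update over entity nodes.

    H           (N, d)  entity-node features from the LM encoder
    A           (N, N)  0/1 adjacency of the entity graph
    answer_mask (N,)    1.0 for nodes inside the answer span, else 0.0
    W           (d, k)  shared linear projection
    a_src,a_dst (k,)    GAT-style attention parameter vectors
    """
    N = H.shape[0]
    A = A + np.eye(N)                        # add self-loops
    Z = H @ W                                # project node features
    # pairwise logits e_ij = LeakyReLU(a_src . z_i + a_dst . z_j)
    logits = (Z @ a_src)[:, None] + (Z @ a_dst)[None, :]
    logits = np.where(logits > 0, logits, leak * logits)
    # hypothetical answer-awareness: bias attention toward answer nodes
    logits = logits + answer_bias * answer_mask[None, :]
    logits = np.where(A > 0, logits, -1e9)   # keep only graph edges
    att = softmax(logits, axis=1)            # each row sums to 1
    return att @ Z                           # updated node states

# Toy run: 4 entity nodes, 16-dim LM features, 8-dim graph space.
N, d, k = 4, 16, 8
H = rng.normal(size=(N, d))
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 1],
              [0, 1, 1, 0]], dtype=float)
answer_mask = np.array([0.0, 0.0, 1.0, 0.0])  # node 2 lies in the answer
H_new = answer_aware_gat_layer(H, A, answer_mask,
                               rng.normal(size=(d, k)),
                               rng.normal(size=k), rng.normal(size=k))
print(H_new.shape)  # (4, 8)
```

In the full model these updated node states would be fused back into the GPT-2 context representations before the multi-head attention generation module produces the question.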
author2: School of Electrical and Electronic Engineering
author_facet: School of Electrical and Electronic Engineering; Li, Zhenping; Cao, Zhen; Li, Pengfei; Zhong, Yong; Li, Shaobo
format: Article
author: Li, Zhenping; Cao, Zhen; Li, Pengfei; Zhong, Yong; Li, Shaobo
author_sort: Li, Zhenping
title: Multi-hop question generation with knowledge graph-enhanced language model
title_short: Multi-hop question generation with knowledge graph-enhanced language model
title_full: Multi-hop question generation with knowledge graph-enhanced language model
title_fullStr: Multi-hop question generation with knowledge graph-enhanced language model
title_full_unstemmed: Multi-hop question generation with knowledge graph-enhanced language model
title_sort: multi-hop question generation with knowledge graph-enhanced language model
publishDate: 2023
url: https://hdl.handle.net/10356/169522
_version_: 1773551325395550208