Knowledge-aware attentive neural network for ranking question answer pairs

Ranking question answer pairs has attracted increasing attention recently due to its broad applications, such as information retrieval and question answering (QA). Significant progress has been made by deep neural networks. However, background information and hidden relations beyond the context, which play crucial roles in human text comprehension, have received little attention in recent deep neural networks that achieve the state of the art in ranking QA pairs. In this paper, we propose KABLSTM, a Knowledge-aware Attentive Bidirectional Long Short-Term Memory network, which leverages external knowledge from knowledge graphs (KGs) to enrich the representation learning of QA sentences. Specifically, we develop a context-knowledge interactive learning architecture, in which a context-guided attentive convolutional neural network (CNN) is designed to integrate knowledge embeddings into sentence representations. In addition, a knowledge-aware attention mechanism is presented to attend to the interrelations between segments of QA pairs. KABLSTM is evaluated on two widely used benchmark QA datasets: WikiQA and TREC QA. Experimental results demonstrate that KABLSTM consistently outperforms competing models and achieves state-of-the-art results.

Bibliographic Details
Main Authors: SHEN, Ying, DENG, Yang, YANG, Min, LI, Yaliang, DU, Nan, FAN, Wei, LEI, Kai
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2018
Subjects:
Online Access:https://ink.library.smu.edu.sg/sis_research/9103
https://ink.library.smu.edu.sg/context/sis_research/article/10106/viewcontent/3209978.3210081.pdf
Institution: Singapore Management University
id sg-smu-ink.sis_research-10106
record_format dspace
spelling sg-smu-ink.sis_research-10106 2024-08-01T15:03:39Z
publishDate 2018-07-01T07:00:00Z
format text application/pdf
doi info:doi/10.1145/3209978.3210081
rights http://creativecommons.org/licenses/by-nc-nd/4.0/
collection Research Collection School Of Computing and Information Systems
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Information systems
Question answering
Databases and Information Systems
OS and Networks
description Ranking question answer pairs has attracted increasing attention recently due to its broad applications, such as information retrieval and question answering (QA). Significant progress has been made by deep neural networks. However, background information and hidden relations beyond the context, which play crucial roles in human text comprehension, have received little attention in recent deep neural networks that achieve the state of the art in ranking QA pairs. In this paper, we propose KABLSTM, a Knowledge-aware Attentive Bidirectional Long Short-Term Memory network, which leverages external knowledge from knowledge graphs (KGs) to enrich the representation learning of QA sentences. Specifically, we develop a context-knowledge interactive learning architecture, in which a context-guided attentive convolutional neural network (CNN) is designed to integrate knowledge embeddings into sentence representations. In addition, a knowledge-aware attention mechanism is presented to attend to the interrelations between segments of QA pairs. KABLSTM is evaluated on two widely used benchmark QA datasets: WikiQA and TREC QA. Experimental results demonstrate that KABLSTM consistently outperforms competing models and achieves state-of-the-art results.
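The description above outlines a knowledge-aware attention mechanism that weights segments of a question and a candidate answer by their mutual relevance, with each segment representation enriched by knowledge-graph embeddings. The record contains no implementation details, so the following NumPy sketch is purely illustrative: the function names and shapes are hypothetical, and the additive knowledge fusion and plain dot-product attention are simplifications, not the paper's BiLSTM/CNN architecture.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def knowledge_aware_attention(question, answer, q_know, a_know):
    """Toy knowledge-aware attention score for one QA pair.

    question: (Lq, d) segment vectors; answer: (La, d) segment vectors;
    q_know / a_know: knowledge embeddings with matching shapes.
    Returns a cosine-similarity ranking score in (-1, 1).
    """
    q = question + q_know              # fuse context with knowledge (simplified)
    a = answer + a_know
    scores = q @ a.T                   # (Lq, La) segment-to-segment relevance
    q_weights = softmax(scores.max(axis=1))  # attention over question segments
    a_weights = softmax(scores.max(axis=0))  # attention over answer segments
    q_vec = q_weights @ q              # attended question representation, (d,)
    a_vec = a_weights @ a              # attended answer representation, (d,)
    denom = np.linalg.norm(q_vec) * np.linalg.norm(a_vec) + 1e-8
    return float(q_vec @ a_vec / denom)
```

Candidate answers for a question would then be ranked by this score in descending order; the actual model learns these representations end-to-end rather than computing them from fixed vectors.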
format text
author SHEN, Ying
DENG, Yang
YANG, Min
LI, Yaliang
DU, Nan
FAN, Wei
LEI, Kai
author_sort SHEN, Ying
title Knowledge-aware attentive neural network for ranking question answer pairs
publisher Institutional Knowledge at Singapore Management University
publishDate 2018
url https://ink.library.smu.edu.sg/sis_research/9103
https://ink.library.smu.edu.sg/context/sis_research/article/10106/viewcontent/3209978.3210081.pdf
_version_ 1814047742251499520