Multi-task learning with multi-view attention for answer selection and knowledge base question answering

Answer selection and knowledge base question answering (KBQA) are two important tasks of question answering (QA) systems. Existing methods solve these two tasks separately, which requires a large amount of repetitive work and neglects the rich correlation information between tasks. In this paper, we t...

Bibliographic Details
Main Authors: DENG, Yang, XIE, Yuexiang, LI, Yaliang, YANG, Min, DU, Nan, FAN, Wei, LEI, Kai, SHEN, Ying
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2019
Subjects:
Online Access:https://ink.library.smu.edu.sg/sis_research/9108
https://ink.library.smu.edu.sg/context/sis_research/article/10111/viewcontent/4593_Article_Text_7632_1_10_20190707.pdf
Institution: Singapore Management University
id sg-smu-ink.sis_research-10111
record_format dspace
spelling sg-smu-ink.sis_research-101112024-08-01T14:53:53Z Multi-task learning with multi-view attention for answer selection and knowledge base question answering DENG, Yang XIE, Yuexiang LI, Yaliang YANG, Min DU, Nan FAN, Wei LEI, Kai SHEN, Ying Answer selection and knowledge base question answering (KBQA) are two important tasks of question answering (QA) systems. Existing methods solve these two tasks separately, which requires a large amount of repetitive work and neglects the rich correlation information between tasks. In this paper, we tackle answer selection and KBQA simultaneously via multi-task learning (MTL), motivated by the following observations. First, both answer selection and KBQA can be regarded as ranking problems, one at the text level and the other at the knowledge level. Second, the two tasks can benefit each other: answer selection can incorporate external knowledge from a knowledge base (KB), while KBQA can be improved by learning contextual information from answer selection. To jointly learn the two tasks, we propose a novel multi-task learning scheme that utilizes multi-view attention learned from various perspectives, enabling the tasks to interact with each other and to learn more comprehensive sentence representations. Experiments conducted on several real-world datasets demonstrate the effectiveness of the proposed method, improving the performance of both answer selection and KBQA. The multi-view attention scheme is also shown to be effective in assembling attentive information from different representational perspectives.
2019-02-01T08:00:00Z text application/pdf https://ink.library.smu.edu.sg/sis_research/9108 info:doi/10.1609/AAAI.V33I01.33016318 https://ink.library.smu.edu.sg/context/sis_research/article/10111/viewcontent/4593_Article_Text_7632_1_10_20190707.pdf http://creativecommons.org/licenses/by-nc-nd/4.0/ Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Databases and Information Systems
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Databases and Information Systems
spellingShingle Databases and Information Systems
DENG, Yang
XIE, Yuexiang
LI, Yaliang
YANG, Min
DU, Nan
FAN, Wei
LEI, Kai
SHEN, Ying
Multi-task learning with multi-view attention for answer selection and knowledge base question answering
description Answer selection and knowledge base question answering (KBQA) are two important tasks of question answering (QA) systems. Existing methods solve these two tasks separately, which requires a large amount of repetitive work and neglects the rich correlation information between tasks. In this paper, we tackle answer selection and KBQA simultaneously via multi-task learning (MTL), motivated by the following observations. First, both answer selection and KBQA can be regarded as ranking problems, one at the text level and the other at the knowledge level. Second, the two tasks can benefit each other: answer selection can incorporate external knowledge from a knowledge base (KB), while KBQA can be improved by learning contextual information from answer selection. To jointly learn the two tasks, we propose a novel multi-task learning scheme that utilizes multi-view attention learned from various perspectives, enabling the tasks to interact with each other and to learn more comprehensive sentence representations. Experiments conducted on several real-world datasets demonstrate the effectiveness of the proposed method, improving the performance of both answer selection and KBQA. The multi-view attention scheme is also shown to be effective in assembling attentive information from different representational perspectives.
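To make the "multi-view attention" idea in the abstract concrete, here is a minimal numpy sketch of one generic way such a scheme can work: each view (e.g. a word-level view and a knowledge-level view) produces its own attention distribution over tokens, and the distributions are combined (here by simple averaging, an illustrative assumption) before computing the attention-weighted sentence representation. All names and the averaging combiner are hypothetical; the paper's actual model is more elaborate than this sketch.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over the last axis.
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def multi_view_attention(token_vecs, view_scores):
    """Combine attention scores from several views into one
    sentence representation (illustrative sketch only).

    token_vecs : (seq_len, dim) token representations
    view_scores: list of (seq_len,) unnormalized scores, one per view
    """
    # Normalize each view's scores into a distribution,
    # then average the distributions across views.
    weights = np.mean([softmax(s) for s in view_scores], axis=0)
    # Attention-weighted sum of the token vectors.
    return weights @ token_vecs

# Toy example: 4 tokens, 3-dim vectors, two views
# (e.g. a word-level view and a knowledge-level view).
rng = np.random.default_rng(0)
tokens = rng.standard_normal((4, 3))
word_view = np.array([0.1, 2.0, 0.3, 0.0])
knowledge_view = np.array([1.5, 0.2, 0.1, 0.4])
sent = multi_view_attention(tokens, [word_view, knowledge_view])
print(sent.shape)  # (3,)
```

Averaging the per-view distributions keeps the combined weights a valid probability distribution; a learned, per-view gating weight would be a natural refinement in a trainable model.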
format text
author DENG, Yang
XIE, Yuexiang
LI, Yaliang
YANG, Min
DU, Nan
FAN, Wei
LEI, Kai
SHEN, Ying
author_facet DENG, Yang
XIE, Yuexiang
LI, Yaliang
YANG, Min
DU, Nan
FAN, Wei
LEI, Kai
SHEN, Ying
author_sort DENG, Yang
title Multi-task learning with multi-view attention for answer selection and knowledge base question answering
title_short Multi-task learning with multi-view attention for answer selection and knowledge base question answering
title_full Multi-task learning with multi-view attention for answer selection and knowledge base question answering
title_fullStr Multi-task learning with multi-view attention for answer selection and knowledge base question answering
title_full_unstemmed Multi-task learning with multi-view attention for answer selection and knowledge base question answering
title_sort multi-task learning with multi-view attention for answer selection and knowledge base question answering
publisher Institutional Knowledge at Singapore Management University
publishDate 2019
url https://ink.library.smu.edu.sg/sis_research/9108
https://ink.library.smu.edu.sg/context/sis_research/article/10111/viewcontent/4593_Article_Text_7632_1_10_20190707.pdf
_version_ 1814047743667077120