Neural collective entity linking

Entity Linking aims to link entity mentions in text to entries in a knowledge base, and neural models have recently achieved strong results on this task. However, most existing methods rely on local context to resolve entities independently, which often fails because local information is sparse. To address this issue, we propose a novel neural model for collective entity linking, named NCEL. NCEL applies a Graph Convolutional Network to integrate both local contextual features and global coherence information for entity linking. To improve computational efficiency, we approximately perform graph convolution on a subgraph of adjacent entity mentions rather than on all mentions in the text. We further introduce an attention scheme to improve NCEL's robustness to data noise, and we train the model on Wikipedia hyperlinks to avoid overfitting and domain bias. In experiments, we evaluate NCEL on five publicly available datasets to verify its linking performance and generalization ability. We also present an extensive analysis of time complexity, the impact of key modules, and qualitative results, which demonstrates the effectiveness and efficiency of the proposed method.
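The abstract describes applying a graph convolution over a subgraph built from the candidate entities of adjacent mentions. Below is a minimal, hypothetical sketch of one such graph-convolution pass in plain NumPy; the feature dimensions, subgraph construction, and scoring head are invented for illustration and are not the authors' released NCEL implementation.

import numpy as np

def normalized_adjacency(adj):
    # Symmetrically normalize the adjacency with self-loops: D^-1/2 (A + I) D^-1/2
    a_hat = adj + np.eye(adj.shape[0])
    deg = a_hat.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(deg))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt

def gcn_layer(features, adj_norm, weight):
    # One graph-convolution layer: propagate candidate features along the subgraph, then ReLU
    return np.maximum(0.0, adj_norm @ features @ weight)

# Toy subgraph over the candidate entities of a few *adjacent* mentions only,
# mirroring the idea of convolving over a local subgraph rather than the whole document.
rng = np.random.default_rng(0)
num_candidates, feat_dim, hidden_dim = 6, 8, 4
local_features = rng.normal(size=(num_candidates, feat_dim))        # stand-in for local contextual features
coherence_adj = (rng.random((num_candidates, num_candidates)) > 0.5).astype(float)
coherence_adj = np.triu(coherence_adj, 1)
coherence_adj = coherence_adj + coherence_adj.T                      # symmetric, no self-loops

adj_norm = normalized_adjacency(coherence_adj)
w1 = rng.normal(size=(feat_dim, hidden_dim))
hidden = gcn_layer(local_features, adj_norm, w1)

# Hypothetical scoring head: one linking score per candidate; in the real model these
# would be trained against gold links derived from Wikipedia hyperlinks.
w_out = rng.normal(size=(hidden_dim,))
scores = hidden @ w_out
print(scores)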

Bibliographic Details
Main Authors: CAO, Yixin, HOU, Lei, LI, Juanzi, LIU, Zhiyuan
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2018
Subjects: Databases and Information Systems; Graphics and Human Computer Interfaces
Online Access:https://ink.library.smu.edu.sg/sis_research/7466
https://ink.library.smu.edu.sg/context/sis_research/article/8469/viewcontent/C18_1057.pdf
Institution: Singapore Management University