How knowledge graph and attention help? A qualitative analysis into bag-level relation extraction

Knowledge Graph (KG) and attention mechanisms have been demonstrated to be effective at introducing and selecting useful information for weakly supervised methods. However, only qualitative analyses and ablation studies have been provided as evidence. In this paper, we contribute a dataset and propose a paradigm to quantitatively evaluate the effect of attention and KG on bag-level relation extraction (RE). We find that (1) higher attention accuracy may lead to worse performance, as it may harm the model's ability to extract entity mention features; (2) the performance of attention is largely influenced by various noise distribution patterns, which are closely related to real-world datasets; (3) KG-enhanced attention does improve RE performance, though not through enhanced attention but by incorporating the entity prior; and (4) the attention mechanism may exacerbate the issue of insufficient training data. Based on these findings, we show that a straightforward variant of the RE model can achieve significant improvements (6% AUC on average) on two real-world datasets compared with three state-of-the-art baselines. Our code and datasets are available at https://github.com/zigkwin-hu/how-KG-ATT-help.
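The paper targets bag-level RE, where all sentences mentioning the same entity pair form a "bag" and an attention mechanism weights the sentences before relation classification. Below is a minimal, hypothetical sketch of this selective-attention aggregation, not the authors' exact model; the sentence vectors and the relation query are illustrative assumptions:

    import numpy as np

    def softmax(scores):
        # Numerically stable softmax over the bag's attention scores.
        e = np.exp(scores - scores.max())
        return e / e.sum()

    def bag_attention(sentence_reprs, relation_query):
        # Score each sentence encoding against a relation query vector,
        # then return the attention-weighted bag representation.
        scores = sentence_reprs @ relation_query   # shape: (num_sentences,)
        weights = softmax(scores)
        return weights @ sentence_reprs            # shape: (hidden_dim,)

    # Toy bag: three sentence encodings for one entity pair; the first
    # aligns best with the (hypothetical) relation embedding and dominates.
    bag = np.array([[0.9, 0.1], [0.1, 0.8], [0.2, 0.1]])
    query = np.array([1.0, 0.0])
    print(bag_attention(bag, query))

The paper's findings suggest that sharpening these weights is not always beneficial: suppressing low-scoring sentences can also suppress useful entity mention features.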


Bibliographic Details
Main Authors: HU, Zikun; CAO, Yixin; HUANG, Lifu; CHUA, Tat-Seng
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2021
Subjects: Databases and Information Systems; Graphics and Human Computer Interfaces
Online Access:https://ink.library.smu.edu.sg/sis_research/7448
https://ink.library.smu.edu.sg/context/sis_research/article/8451/viewcontent/2021.acl_long.359.pdf
Institution: Singapore Management University
Publication Date: 2021-08-01
DOI: 10.18653/v1/2021.acl-long.359
License: CC BY-NC-ND 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Collection: Research Collection School Of Computing and Information Systems