Do pre-trained models benefit knowledge graph completion? A reliable evaluation and a reasonable approach
In recent years, pre-trained language models (PLMs) have been shown to capture factual knowledge from massive texts, which has encouraged the development of PLM-based knowledge graph completion (KGC) models. However, these models still lag far behind the SOTA KGC models in performance. In this w...
Main Authors: LV, Xin; LIN, Yankai; CAO, Yixin; HOU, Lei; LI, Juanzi; LIU, Zhiyuan; LI, Peng; ZHOU, Jie
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
Online Access:
https://ink.library.smu.edu.sg/sis_research/7446
https://ink.library.smu.edu.sg/context/sis_research/article/8449/viewcontent/2022.findings_acl.282.pdf
Institution: Singapore Management University
Similar Items
- Are missing links predictable? An inferential benchmark for knowledge graph completion
  by: CAO, Yixin, et al.
  Published: (2021)
- Neural collective entity linking
  by: CAO, Yixin, et al.
  Published: (2018)
- Multi-channel graph neural network for entity alignment
  by: CAO, Yixin, et al.
  Published: (2019)
- Semi-supervised entity alignment via joint knowledge embedding model and cross-graph model
  by: LI, Chengjiang, et al.
  Published: (2019)
- Explainable reasoning over knowledge graphs for recommendation
  by: WANG, Xiang, et al.
  Published: (2019)