Do pre-trained models benefit knowledge graph completion? A reliable evaluation and a reasonable approach

In recent years, pre-trained language models (PLMs) have been shown to capture factual knowledge from massive texts, which encourages the proposal of PLM-based knowledge graph completion (KGC) models. However, these models are still quite behind the SOTA KGC models in terms of performance. In this w...

Full description

Saved in:
Bibliographic Details
Main Authors: LV, Xin, LIN, Yankai, CAO, Yixin, HOU, Lei, LI, Juanzi, LIU, Zhiyuan, LI, Peng, ZHOU, Jie
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
Subjects:
Online Access: https://ink.library.smu.edu.sg/sis_research/7446
https://ink.library.smu.edu.sg/context/sis_research/article/8449/viewcontent/2022.findings_acl.282.pdf