HGPrompt: Bridging homogeneous and heterogeneous graphs for few-shot prompt learning

Graph neural networks (GNNs) and heterogeneous graph neural networks (HGNNs) are prominent techniques for homogeneous and heterogeneous graph representation learning, yet their performance in an end-to-end supervised framework greatly depends on the availability of task-specific supervision. To reduce the labeling cost, pre-training on self-supervised pretext tasks has become a popular paradigm, but there is often a gap between the pre-trained model and downstream tasks, stemming from the divergence in their objectives. To bridge the gap, prompt learning has emerged as a promising direction, especially in few-shot settings, without the need to fully fine-tune the pre-trained model. While there have been some early explorations of prompt-based learning on graphs, they primarily deal with homogeneous graphs, ignoring the heterogeneous graphs that are prevalent in downstream applications. In this paper, we propose HGPROMPT, a novel pre-training and prompting framework to unify not only pre-training and downstream tasks but also homogeneous and heterogeneous graphs via a dual-template design. Moreover, we propose a dual-prompt in HGPROMPT to assist a downstream task in locating the most relevant prior, bridging the gaps caused by not only feature variations but also heterogeneity differences across tasks. Finally, we thoroughly evaluate and analyze HGPROMPT through extensive experiments on three public datasets.
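As context for the abstract above, the sketch below illustrates the general idea behind few-shot graph prompt tuning: a pre-trained graph encoder is kept frozen, only a small learnable prompt is tuned on the handful of labelled nodes, and class prototypes built from the support set act as the task head. This is a minimal, hypothetical PyTorch illustration; the names (FeaturePrompt, prototype_logits) and all hyperparameters are assumptions for exposition and are not taken from HGPROMPT or its released code.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FeaturePrompt(nn.Module):
    # A learnable vector that reweights frozen, pre-trained node embeddings.
    def __init__(self, dim: int):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(dim))  # starts as an identity reweighting

    def forward(self, node_emb: torch.Tensor) -> torch.Tensor:
        return node_emb * self.weight  # element-wise feature prompting

def prototype_logits(query_emb, support_emb, support_labels, num_classes):
    # Build one prototype per class from the few labelled (support) nodes,
    # then score query nodes by cosine similarity to each prototype.
    protos = torch.stack([support_emb[support_labels == c].mean(dim=0)
                          for c in range(num_classes)])
    return F.cosine_similarity(query_emb.unsqueeze(1), protos.unsqueeze(0), dim=-1)

# Toy data standing in for the frozen outputs of a pre-trained (H)GNN encoder.
dim, num_classes = 16, 3
emb = torch.randn(30, dim)                       # frozen node embeddings
labels = torch.arange(num_classes).repeat(10)    # 10 nodes per class
support, query = torch.arange(0, 9), torch.arange(9, 30)

prompt = FeaturePrompt(dim)
optimizer = torch.optim.Adam(prompt.parameters(), lr=0.01)
for _ in range(100):                             # only the prompt is updated
    p_emb = prompt(emb)
    logits = prototype_logits(p_emb[query], p_emb[support],
                              labels[support], num_classes)
    loss = F.cross_entropy(logits / 0.1, labels[query])  # temperature-scaled
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()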

Bibliographic Details
Main Authors: YU, Xingtong, FANG, Yuan, LIU, Zemin, ZHANG, Xinming
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2024
Subjects: Databases and Information Systems; Graphics and Human Computer Interfaces
Online Access:https://ink.library.smu.edu.sg/sis_research/8712
https://ink.library.smu.edu.sg/context/sis_research/article/9715/viewcontent/29596_Article_Text_33650_1_2_20240324.pdf
Institution: Singapore Management University
Collection: Research Collection School Of Computing and Information Systems (InK@SMU)
License: http://creativecommons.org/licenses/by-nc-nd/4.0/