Contrastive pre-training of GNNs on heterogeneous graphs
While graph neural networks (GNNs) have emerged as the state-of-the-art representation learning methods on graphs, they often require a large amount of labeled data to achieve satisfactory performance, and such labels are often expensive to obtain or unavailable. To relieve the label scarcity issue, some pre-training strategies have been devised for GNNs to learn transferable knowledge from the universal structural properties of the graph. However, existing pre-training strategies are designed only for homogeneous graphs, in which all nodes and edges belong to a single type. In contrast, a heterogeneous graph embodies rich semantics, as multiple types of nodes interact with each other via different kinds of edges, which existing strategies neglect. In this paper, we propose a novel Contrastive Pre-Training strategy of GNNs on Heterogeneous Graphs (CPT-HG) to capture both the semantic and structural properties in a self-supervised manner. Specifically, we design semantic-aware pre-training tasks at both the relation and subgraph levels, and further enhance their representativeness by employing contrastive learning. We conduct extensive experiments on three real-world heterogeneous graphs, and promising results demonstrate the superior ability of our CPT-HG to transfer knowledge to various downstream tasks via pre-training.
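To make the contrastive objective concrete: the abstract describes contrastive pre-training tasks at the relation and subgraph levels. The sketch below illustrates only the generic relation-level idea with a standard InfoNCE loss, where a node pair connected by a given relation serves as the positive and unrelated nodes serve as negatives. This is a minimal sketch under assumed conventions (PyTorch, a hypothetical `infonce_loss` helper, random stand-in embeddings), not the authors' CPT-HG implementation.

```python
import torch
import torch.nn.functional as F


def infonce_loss(anchor, positive, negatives, temperature=0.5):
    """Generic InfoNCE: pull the anchor toward its positive, away from negatives.

    anchor:    (d,) embedding of a node u
    positive:  (d,) embedding of a node linked to u by the target relation
    negatives: (k, d) embeddings of nodes not linked to u by that relation
    """
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_score = (anchor * positive).sum(-1, keepdim=True) / temperature  # shape (1,)
    neg_score = (negatives @ anchor) / temperature                       # shape (k,)
    logits = torch.cat([pos_score, neg_score]).unsqueeze(0)              # shape (1, k+1)
    # The positive occupies index 0, so the target class is 0.
    return F.cross_entropy(logits, torch.zeros(1, dtype=torch.long))


# Toy usage: random vectors stand in for GNN-produced node embeddings.
d, k = 64, 5
u = torch.randn(d)        # anchor node embedding
v = torch.randn(d)        # positive: linked to u by the target relation
negs = torch.randn(k, d)  # negatives: sampled outside that relation
print(infonce_loss(u, v, negs).item())
```

In an actual heterogeneous-graph setting, positives and negatives would be sampled per relation type, so the loss is relation-aware rather than purely structural.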
Main Authors: JIANG, Xunqiang; LU, Yuanfu; FANG, Yuan; SHI, Chuan
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2021
Subjects: Pre-training; Heterogeneous graph; Self-supervised learning; Databases and Information Systems
Online Access: https://ink.library.smu.edu.sg/sis_research/6889
https://ink.library.smu.edu.sg/context/sis_research/article/7892/viewcontent/124.pdf
Institution: Singapore Management University
Record ID: sg-smu-ink.sis_research-7892
Publication Date: 2021-11-01
DOI: 10.1145/3459637.3482332
License: http://creativecommons.org/licenses/by-nc-nd/4.0/
Collection: Research Collection School Of Computing and Information Systems (InK@SMU)
Content Provider: SMU Libraries