Prototypical graph contrastive learning

Graph-level representations are critical in various real-world applications, such as predicting the properties of molecules. In practice, however, precise graph annotations are generally expensive and time-consuming to obtain. To address this issue, graph contrastive learning constructs an instance discrimination task that pulls together positive pairs (augmentations of the same graph) and pushes apart negative pairs (augmentations of different graphs) for unsupervised representation learning. However, because the negatives for a query are sampled uniformly from all graphs, existing methods suffer from a critical sampling bias: the negatives are likely to share the query's semantic structure, which degrades performance. To mitigate this sampling bias, this paper proposes a Prototypical Graph Contrastive Learning (PGCL) approach. Specifically, PGCL models the underlying semantic structure of the graph data by clustering semantically similar graphs into the same group, while encouraging clustering consistency across different augmentations of the same graph. Given a query, it then draws negative samples from clusters that differ from the query's cluster, which ensures that the query and its negatives are semantically different. Moreover, PGCL reweights each negative sample according to the distance between its prototype (cluster centroid) and the query prototype, so that negatives with a moderate prototype distance receive relatively large weights; this reweighting strategy is proven to be more effective than uniform sampling. Experimental results on various graph benchmarks demonstrate the advantages of PGCL over state-of-the-art methods. Code is publicly available at https://github.com/ha-lins/PGCL.
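As a rough, self-contained illustration of the reweighted negative sampling idea described in the abstract, the PyTorch sketch below filters negatives to clusters other than the query's and weights each one by its prototype's distance from the query prototype. This is not the authors' released implementation (see the GitHub link above); the function name prototype_reweighted_nce, the tensor shapes, the Gaussian weighting kernel that peaks at a moderate distance, and the temperature value are assumptions made purely for illustration.

import torch
import torch.nn.functional as F


def prototype_reweighted_nce(query, positive, candidates, assignments,
                             prototypes, query_cluster, tau=0.2):
    """InfoNCE-style loss with cluster-filtered, prototype-reweighted negatives.

    query:         (d,)   embedding of the query graph
    positive:      (d,)   embedding of another augmentation of the same graph
    candidates:    (n, d) embeddings of other graphs in the batch
    assignments:   (n,)   cluster index of each candidate graph
    prototypes:    (k, d) cluster centroids (e.g. from k-means on embeddings)
    query_cluster: int    cluster index assigned to the query
    """
    query = F.normalize(query, dim=0)
    positive = F.normalize(positive, dim=0)
    candidates = F.normalize(candidates, dim=1)
    prototypes = F.normalize(prototypes, dim=1)

    # Draw negatives only from clusters that differ from the query's cluster,
    # so the negatives are semantically different from the query by construction.
    neg_mask = assignments != query_cluster
    negatives = candidates[neg_mask]
    neg_protos = prototypes[assignments[neg_mask]]

    # Weight each negative by the distance between its prototype and the query
    # prototype. The Gaussian kernel below (an assumption in this sketch) peaks
    # at a "moderate" distance rather than at very small or very large ones.
    proto_dist = torch.norm(neg_protos - prototypes[query_cluster], dim=1)
    mu, sigma = proto_dist.mean(), proto_dist.std() + 1e-6
    weights = torch.exp(-0.5 * ((proto_dist - mu) / sigma) ** 2)
    weights = weights * weights.numel() / weights.sum()  # normalize to mean 1

    # Reweighted InfoNCE: each negative's contribution is scaled by its weight.
    pos_logit = (query @ positive) / tau
    neg_logits = (negatives @ query) / tau
    denom = pos_logit.exp() + (weights * neg_logits.exp()).sum()
    return -(pos_logit - denom.log())


if __name__ == "__main__":
    torch.manual_seed(0)
    d, n, k = 16, 32, 4
    emb = F.normalize(torch.randn(n, d), dim=1)
    assignments = torch.arange(n) % k            # toy cluster assignments
    prototypes = torch.stack([emb[assignments == c].mean(0) for c in range(k)])
    loss = prototype_reweighted_nce(torch.randn(d), torch.randn(d),
                                    emb, assignments, prototypes, query_cluster=0)
    print(float(loss))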

Saved in:
Bibliographic Details
Main Authors: LIN, Shuai, LIU, Chen, ZHOU, Pan, HU, Zi-Yuan, WANG, Shuojia, ZHAO, Ruihui, ZHENG, Yefeng, LIN, Liang, XING, Eric, LIANG, Xiaodan
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2022
Subjects: Contrastive learning; Self-supervised learning; Graph representation learning; Graphics and Human Computer Interfaces
Online Access: https://ink.library.smu.edu.sg/sis_research/9055
https://ink.library.smu.edu.sg/context/sis_research/article/10058/viewcontent/2022_TNNLS_PGCL.pdf
id sg-smu-ink.sis_research-10058
record_format dspace
spelling sg-smu-ink.sis_research-10058 2024-08-01T15:37:15Z
title Prototypical graph contrastive learning
author LIN, Shuai; LIU, Chen; ZHOU, Pan; HU, Zi-Yuan; WANG, Shuojia; ZHAO, Ruihui; ZHENG, Yefeng; LIN, Liang; XING, Eric; LIANG, Xiaodan
publishDate 2022-07-01T07:00:00Z
format text application/pdf
url https://ink.library.smu.edu.sg/sis_research/9055
url https://ink.library.smu.edu.sg/context/sis_research/article/10058/viewcontent/2022_TNNLS_PGCL.pdf
doi info:doi/10.1109/TNNLS.2022.3191086
license http://creativecommons.org/licenses/by-nc-nd/4.0/
collection Research Collection School Of Computing and Information Systems
language eng
publisher Institutional Knowledge at Singapore Management University
topic Contrastive learning; Self-supervised learning; Graph representation learning; Graphics and Human Computer Interfaces
institution Singapore Management University
building SMU Libraries
continent Asia
country Singapore
content_provider SMU Libraries
collection InK@SMU
language English
topic Contrastive learning
Self-supervised learning
Graph representation learning
Graphics and Human Computer Interfaces
description Graph-level representations are critical in various real-world applications, such as predicting the properties of molecules. In practice, however, precise graph annotations are generally expensive and time-consuming to obtain. To address this issue, graph contrastive learning constructs an instance discrimination task that pulls together positive pairs (augmentations of the same graph) and pushes apart negative pairs (augmentations of different graphs) for unsupervised representation learning. However, because the negatives for a query are sampled uniformly from all graphs, existing methods suffer from a critical sampling bias: the negatives are likely to share the query's semantic structure, which degrades performance. To mitigate this sampling bias, this paper proposes a Prototypical Graph Contrastive Learning (PGCL) approach. Specifically, PGCL models the underlying semantic structure of the graph data by clustering semantically similar graphs into the same group, while encouraging clustering consistency across different augmentations of the same graph. Given a query, it then draws negative samples from clusters that differ from the query's cluster, which ensures that the query and its negatives are semantically different. Moreover, PGCL reweights each negative sample according to the distance between its prototype (cluster centroid) and the query prototype, so that negatives with a moderate prototype distance receive relatively large weights; this reweighting strategy is proven to be more effective than uniform sampling. Experimental results on various graph benchmarks demonstrate the advantages of PGCL over state-of-the-art methods. Code is publicly available at https://github.com/ha-lins/PGCL.
format text
author LIN, Shuai
LIU, Chen
ZHOU, Pan
HU, Zi-Yuan
WANG, Shuojia
ZHAO, Ruihui
ZHENG, Yefeng
LIN, Liang
XING, Eric
LIANG, Xiaodan
author_sort LIN, Shuai
title Prototypical graph contrastive learning
publisher Institutional Knowledge at Singapore Management University
publishDate 2022
url https://ink.library.smu.edu.sg/sis_research/9055
https://ink.library.smu.edu.sg/context/sis_research/article/10058/viewcontent/2022_TNNLS_PGCL.pdf
_version_ 1814047719028686848