Graph contrastive learning with stable and scalable spectral encoding

Graph contrastive learning (GCL) aims to learn representations by capturing the agreement between different graph views. Traditional GCL methods generate views in the spatial domain, but it has recently been discovered that the spectral domain also plays a vital role in complementing spatial views. However, existing spectral-based graph views either ignore the eigenvectors that encode valuable positional information or suffer from high complexity when trying to address the instability of spectral features. To tackle these challenges, we first design an informative, stable, and scalable spectral encoder, termed EigenMLP, to learn effective representations from the spectral features. Theoretically, EigenMLP is invariant to rotation and reflection transformations on eigenvectors and robust against perturbations. We then propose a spatial-spectral contrastive framework (Sp2GCL) to capture the consistency between the spatial information encoded by graph neural networks and the spectral information learned by EigenMLP, thus effectively fusing these two graph views. Experiments on node- and graph-level datasets show that our method not only learns effective graph representations but also achieves a 2–10x speedup over other spectral-based methods.
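
The abstract describes two pieces: a spectral encoder (EigenMLP) that learns node representations from spectral features (Laplacian eigenvalues and eigenvectors) while remaining invariant to rotations and reflections of the eigenvectors, and a contrastive framework (Sp2GCL) that aligns those spectral embeddings with the spatial embeddings produced by a graph neural network. The following is a minimal, hypothetical sketch of that spatial-spectral contrastive idea, not the authors' EigenMLP or Sp2GCL: the absolute-value trick for sign invariance, the one-hop mean aggregation standing in for a GNN, and the InfoNCE-style loss are simplifying assumptions made for brevity.

```python
# Illustrative sketch only: simplified stand-ins for the spectral encoder and
# spatial-spectral contrastive objective described in the abstract.
import numpy as np
import torch
import torch.nn as nn
import torch.nn.functional as F

def spectral_features(adj: np.ndarray, k: int = 8) -> torch.Tensor:
    """Per-node spectral view from the k smallest Laplacian eigenpairs.
    Taking |U| removes eigenvector sign ambiguity -- a crude stand-in for the
    rotation/reflection invariance attributed to EigenMLP in the abstract."""
    deg = adj.sum(1)
    lap = np.diag(deg) - adj                 # unnormalized graph Laplacian
    vals, vecs = np.linalg.eigh(lap)         # eigenvalues in ascending order
    feats = np.abs(vecs[:, :k]) * vals[:k]   # weight coordinates by eigenvalue (illustrative)
    return torch.tensor(feats, dtype=torch.float32)

class MLP(nn.Module):
    """Small shared encoder body used for both views in this toy example."""
    def __init__(self, d_in, d_hid, d_out):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(d_in, d_hid), nn.ReLU(), nn.Linear(d_hid, d_out))
    def forward(self, x):
        return self.net(x)

def spatial_view(adj: np.ndarray, x: torch.Tensor) -> torch.Tensor:
    """One-hop mean aggregation as a stand-in for a GNN spatial encoder."""
    a = torch.tensor(adj, dtype=torch.float32)
    a = a / a.sum(1, keepdim=True).clamp(min=1.0)
    return a @ x

def contrastive_loss(z1, z2, tau=0.5):
    """Symmetric InfoNCE: node i's spatial embedding should agree with its
    spectral embedding; other nodes in the batch act as negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau
    labels = torch.arange(z1.size(0))
    return 0.5 * (F.cross_entropy(logits, labels) + F.cross_entropy(logits.t(), labels))

# Toy usage on a random undirected graph.
n, d, k = 32, 16, 8
adj = (np.random.rand(n, n) < 0.2).astype(float)
adj = np.maximum(adj, adj.T)
np.fill_diagonal(adj, 0)
x = torch.randn(n, d)                                  # node features
spec_enc, spat_enc = MLP(k, 32, 16), MLP(d, 32, 16)
loss = contrastive_loss(spat_enc(spatial_view(adj, x)),
                        spec_enc(spectral_features(adj, k)))
loss.backward()
```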


Bibliographic Details
Main Authors: BO, Deyu, FANG, Yuan, LIU, Yang, SHI, Chuan
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2023
Subjects: Graph contrastive learning; Spectral encoding; Spatial-spectral graph neural networks; Databases and Information Systems; Graphics and Human Computer Interfaces
Online Access:https://ink.library.smu.edu.sg/sis_research/8333
https://ink.library.smu.edu.sg/context/sis_research/article/9336/viewcontent/NeruIPS_2023.pdf
Institution: Singapore Management University
Collection: Research Collection School Of Computing and Information Systems (InK@SMU)
Publication Date: 2023-12-01
License: http://creativecommons.org/licenses/by-nc-nd/4.0/