On the substructure countability of graph neural networks

With the empirical success of Graph Neural Networks (GNNs) on graph-related tasks, it is natural to investigate their theoretical power on these tasks. In this paper, we focus on GNNs' theoretical power for substructure counting, a fundamental yet challenging task in many applications. Previous work has proven that GNNs equivalent to the 2-dimensional Weisfeiler-Leman algorithm (2-WL) can only count a limited set of substructures. However, the substructure counting ability of theoretically more powerful yet computationally tractable GNNs remains unclear. In this paper, we study the conditions under which substructures are theoretically countable by k-WL equivalent GNNs, and then focus on 3-WL equivalent ones, which are currently the most theoretically powerful instances with practical computational cost. Further, we propose an algorithm to determine the countability of substructures for 3-WL equivalent GNNs. Our results reveal that 3-WL equivalent GNNs can count considerably more substructures than 2-WL equivalent ones. However, the proportion of countable patterns and the prediction performance both decrease as the pattern size increases. We therefore propose a Layer Permutation Pooling (LPP) model for better substructure counting performance. LPP first decomposes the data graph into subgraphs; a layer permutation scheme then represents each decomposed subgraph as a set of matrices, and a neural network makes predictions on these matrices. We compare LPP with several state-of-the-art GNNs on various datasets. Experimental results show that LPP outperforms the baselines by 84% on average under the RMSE metric.
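
For context, the 2-WL bound referenced in the abstract comes from the Weisfeiler-Leman graph-isomorphism test, which standard message-passing GNNs cannot exceed in distinguishing power. The sketch below is not code from the paper; it is a minimal illustration of colour refinement (1-WL, which in the common numbering has the same distinguishing power as the 2-WL test mentioned above), written in Python with networkx purely for convenience. The classic example shows why such GNNs cannot count triangles: two disjoint triangles and a single 6-cycle produce identical colour histograms.

```python
import networkx as nx
from collections import Counter

def colour_refinement(G, rounds=3):
    """Run 1-WL colour refinement for a fixed number of rounds and
    return the histogram of final node colours. Differing histograms
    prove non-isomorphism; equal histograms are inconclusive."""
    colours = {v: 0 for v in G.nodes()}  # start with a uniform colouring
    for _ in range(rounds):
        # New colour = hash of (own colour, sorted multiset of neighbour colours)
        colours = {
            v: hash((colours[v], tuple(sorted(colours[u] for u in G.neighbors(v)))))
            for v in G.nodes()
        }
    return Counter(colours.values())

# Two disjoint triangles vs. one 6-cycle: every node has degree 2, so
# refinement never separates them -- a 2-WL-equivalent GNN cannot tell
# the graphs apart, let alone count their triangles.
G1 = nx.disjoint_union(nx.cycle_graph(3), nx.cycle_graph(3))
G2 = nx.cycle_graph(6)
print(colour_refinement(G1) == colour_refinement(G2))  # True
```

The function name and the use of networkx are illustrative choices only; they are not part of the paper's method, which concerns the strictly stronger 3-WL setting and the proposed LPP model.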

Bibliographic Details
Main Authors: XIA, Wenwen, LI, Yuchen, LI, Shenghong
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
Collection: Research Collection School Of Computing and Information Systems
Subjects: Graphics and Human Computer Interfaces; OS and Networks
DOI: 10.1109/TKDE.2022.3223471
Online Access: https://ink.library.smu.edu.sg/sis_research/7623
Institution: Singapore Management University