Graph neural differential equation networks for improved representation learning and robustness

Graph representation learning distills the complex structures of graphs into tractable, low-dimensional vector spaces, capturing essential topological and attribute-based properties. Graph Neural Networks (GNNs) have become a pivotal tool in this domain, leveraging graph structures to iteratively update node representations through neighbor aggregation. These representations support fundamental tasks such as node classification, link prediction, and graph classification across diverse fields, from social networks and biological systems to citation networks. Despite their success, GNNs face critical challenges: they often underperform on heterophilic graph data, where connected nodes display dissimilar characteristics; suffer from oversmoothing, which impairs performance as network depth increases; and are sensitive to hierarchical structures. Furthermore, they are vulnerable to adversarial attacks that can severely compromise model integrity. This thesis introduces neural differential equations into GNNs to enhance representation learning and robustness, addressing these challenges comprehensively. Graph Neural Differential Equation Networks (GDENs) take a dynamical-systems approach, evolving node features over continuous time and thereby enhancing the capacity of GNNs to process and learn from graph-structured data. Governing node feature propagation through differential equations enables more refined control over the learning process than conventional discrete-layer methods. The first contribution of this thesis enhances representation learning on heterophilic graphs through a neural convection-diffusion differential equation.

Subsequently, the thesis explores the relationship between stability in dynamical systems and robustness in GDENs. A neural Hamiltonian differential equation model is developed, establishing energy-conservative dynamics within GNNs to bolster robustness against adversarial attacks. Extending beyond traditional integer-order differential equations, the thesis incorporates fractional calculus through the Fractional-Order Graph Neural Differential Equation Network (F-GDEN) framework. This approach introduces memory and non-local interactions, improving the networks' ability to handle hierarchical structures and mitigate oversmoothing. F-GDENs not only integrate seamlessly with existing GDENs to enhance representation learning across various datasets, but also satisfy tighter output perturbation bounds under input and topology perturbations. Empirical results further validate the superior robustness of F-GDEN models compared to integer-order GDENs.

In summary, this thesis advances the robustness and capacity of representation learning with GDENs by introducing new differential equations and extending them to fractional-order derivatives. These advancements establish a solid foundation for future research into robust and adaptive GNN architectures, with promising implications for practical applications.


Bibliographic Details
Main Author: Zhao, Kai
Other Authors: Tay Wee Peng
Format: Thesis-Doctor of Philosophy
Language: English
Published: Nanyang Technological University 2025
Subjects: Engineering; Graph neural networks; Robustness; Machine learning
Online Access:https://hdl.handle.net/10356/182340
Institution: Nanyang Technological University
id sg-ntu-dr.10356-182340
school School of Electrical and Electronic Engineering
supervisor Tay Wee Peng (wptay@ntu.edu.sg)
degree Doctor of Philosophy
date_issued 2024
date_available 2025-01-22
citation Zhao, K. (2024). Graph neural differential equation networks for improved representation learning and robustness. Doctoral thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/182340
doi 10.32657/10356/182340
rights This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0).
format application/pdf
building NTU Library
continent Asia
country Singapore
content_provider NTU Library
collection DR-NTU
topic Engineering
Graph neural networks
Robustness
Machine learning
description Graph representation learning distills the complex structures of graphs into tractable, low-dimensional vector spaces, capturing essential topological and attribute-based properties. Graph Neural Networks (GNNs) have become a pivotal tool in this domain, leveraging graph structures to iteratively update node representations through neighbor aggregation. These representations support fundamental tasks such as node classification, link prediction, and graph classification across diverse fields, from social networks and biological systems to citation networks. Despite their success, GNNs face critical challenges: they often underperform on heterophilic graph data, where connected nodes display dissimilar characteristics; suffer from oversmoothing, which impairs performance as network depth increases; and are sensitive to hierarchical structures. Furthermore, they are vulnerable to adversarial attacks that can severely compromise model integrity. This thesis introduces neural differential equations into GNNs to enhance representation learning and robustness, addressing these challenges comprehensively. Graph Neural Differential Equation Networks (GDENs) take a dynamical-systems approach, evolving node features over continuous time and thereby enhancing the capacity of GNNs to process and learn from graph-structured data. Governing node feature propagation through differential equations enables more refined control over the learning process than conventional discrete-layer methods. The first contribution of this thesis enhances representation learning on heterophilic graphs through a neural convection-diffusion differential equation. Subsequently, the thesis explores the relationship between stability in dynamical systems and robustness in GDENs. A neural Hamiltonian differential equation model is developed, establishing energy-conservative dynamics within GNNs to bolster robustness against adversarial attacks. Extending beyond traditional integer-order differential equations, the thesis incorporates fractional calculus through the Fractional-Order Graph Neural Differential Equation Network (F-GDEN) framework. This approach introduces memory and non-local interactions, improving the networks' ability to handle hierarchical structures and mitigate oversmoothing. F-GDENs not only integrate seamlessly with existing GDENs to enhance representation learning across various datasets, but also satisfy tighter output perturbation bounds under input and topology perturbations. Empirical results further validate the superior robustness of F-GDEN models compared to integer-order GDENs. In summary, this thesis advances the robustness and capacity of representation learning with GDENs by introducing new differential equations and extending them to fractional-order derivatives. These advancements establish a solid foundation for future research into robust and adaptive GNN architectures, with promising implications for practical applications.
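The description above explains the core GDEN idea: node features evolve in continuous time under a differential equation rather than through discrete message-passing layers. As an illustration only (this is a generic sketch and not the thesis's specific convection-diffusion, Hamiltonian, or fractional-order models), the simplest such dynamic is graph heat diffusion, dH/dt = -L·H with the combinatorial graph Laplacian L = D - A, integrated here by forward Euler; all names below are hypothetical:

```python
import numpy as np

def graph_heat_ode(A, H0, t_end=1.0, steps=100):
    """Evolve node features H under dH/dt = -L H, where L = D - A
    is the combinatorial Laplacian, using forward-Euler steps.
    Illustrative sketch of continuous-time feature propagation."""
    L = np.diag(A.sum(axis=1)) - A          # combinatorial Laplacian D - A
    H = np.asarray(H0, dtype=float).copy()
    dt = t_end / steps
    for _ in range(steps):                   # Euler integration over "depth" t
        H = H - dt * (L @ H)
    return H

# toy 3-node path graph: 0 - 1 - 2
A = np.array([[0., 1., 0.],
              [1., 0., 1.],
              [0., 1., 0.]])
H0 = np.array([1.0, 0.0, 5.0])
H = graph_heat_ode(A, H0)
# features smooth toward a common value while the total feature mass
# is conserved (1^T L = 0) -- the oversmoothing behavior the thesis targets
```

Running the dynamic for longer `t_end` drives all node features toward the same value, which is precisely the oversmoothing effect that the thesis's fractional-order and convection-diffusion models are designed to mitigate.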