Graph attention networks and approximate personalized propagation of neural prediction models for unsupervised graph representation learning

Recent years have brought progress in the graph machine learning space, with the unsupervised graph representation learning field gaining traction due to the immense resources required to label graph data. A leading approach in the field, Deep Graph InfoMax, has been shown to provide good performance in training Graph Convolutional Networks (GCNs) for the task in an unsupervised manner using mutual information. In this paper, we propose the novel approach of using Graph Attention Networks (GATs) and Approximate Personalized Propagation of Neural Prediction (APPNP) models trained with the Deep Graph InfoMax training method. We tested the transductively trained models on three challenging graph benchmarks and used a small training sample along with a Logistic Regression classifier to evaluate the quality of the representations generated. GAT models showed good performance and were able to attain accuracy similar to GCN-based approaches. However, APPNP models were not able to learn well from the Deep Graph InfoMax training method, yielding lacklustre performance. The success of the GAT models solidifies the theory behind the training method, and we suggest further development of GAT variants suited to Deep Graph InfoMax to improve learning through mutual information. The APPNP models, on the other hand, require further improvements before they can be trained with mutual information on arbitrary graphs. Increased computing power to tackle larger benchmarks would also prove useful for the graph representation learning task.


Bibliographic Details
Main Author: Bharadwaja, Tanay
Other Authors: Ke Yiping, Kelly
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2022
Subjects:
Online Access:https://hdl.handle.net/10356/156556
Institution: Nanyang Technological University
Language: English
id sg-ntu-dr.10356-156556
record_format dspace
spelling sg-ntu-dr.10356-1565562022-04-20T01:47:56Z Graph attention networks and approximate personalized propagation of neural prediction models for unsupervised graph representation learning Bharadwaja, Tanay Ke Yiping, Kelly School of Computer Science and Engineering ypke@ntu.edu.sg Engineering::Computer science and engineering Recent years have brought progress in the graph machine learning space, with the unsupervised graph representation learning field gaining traction due to the immense resources required to label graph data. A leading approach in the field, Deep Graph InfoMax, has been shown to provide good performance in training Graph Convolutional Networks (GCNs) for the task in an unsupervised manner using mutual information. In this paper, we propose the novel approach of using Graph Attention Networks (GATs) and Approximate Personalized Propagation of Neural Prediction (APPNP) models trained with the Deep Graph InfoMax training method. We tested the transductively trained models on three challenging graph benchmarks and used a small training sample along with a Logistic Regression classifier to evaluate the quality of the representations generated. GAT models showed good performance and were able to attain accuracy similar to GCN-based approaches. However, APPNP models were not able to learn well from the Deep Graph InfoMax training method, yielding lacklustre performance. The success of the GAT models solidifies the theory behind the training method, and we suggest further development of GAT variants suited to Deep Graph InfoMax to improve learning through mutual information. The APPNP models, on the other hand, require further improvements before they can be trained with mutual information on arbitrary graphs. Increased computing power to tackle larger benchmarks would also prove useful for the graph representation learning task.
Bachelor of Science in Data Science and Artificial Intelligence 2022-04-20T01:47:56Z 2022-04-20T01:47:56Z 2022 Final Year Project (FYP) Bharadwaja, T. (2022). Graph attention networks and approximate personalized propagation of neural prediction models for unsupervised graph representation learning. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/156556 https://hdl.handle.net/10356/156556 en SCSE21-0377 application/pdf Nanyang Technological University
institution Nanyang Technological University
building NTU Library
continent Asia
country Singapore
Singapore
content_provider NTU Library
collection DR-NTU
language English
topic Engineering::Computer science and engineering
spellingShingle Engineering::Computer science and engineering
Bharadwaja, Tanay
Graph attention networks and approximate personalized propagation of neural prediction models for unsupervised graph representation learning
description Recent years have brought progress in the graph machine learning space, with the unsupervised graph representation learning field gaining traction due to the immense resources required to label graph data. A leading approach in the field, Deep Graph InfoMax, has been shown to provide good performance in training Graph Convolutional Networks (GCNs) for the task in an unsupervised manner using mutual information. In this paper, we propose the novel approach of using Graph Attention Networks (GATs) and Approximate Personalized Propagation of Neural Prediction (APPNP) models trained with the Deep Graph InfoMax training method. We tested the transductively trained models on three challenging graph benchmarks and used a small training sample along with a Logistic Regression classifier to evaluate the quality of the representations generated. GAT models showed good performance and were able to attain accuracy similar to GCN-based approaches. However, APPNP models were not able to learn well from the Deep Graph InfoMax training method, yielding lacklustre performance. The success of the GAT models solidifies the theory behind the training method, and we suggest further development of GAT variants suited to Deep Graph InfoMax to improve learning through mutual information. The APPNP models, on the other hand, require further improvements before they can be trained with mutual information on arbitrary graphs. Increased computing power to tackle larger benchmarks would also prove useful for the graph representation learning task.
author2 Ke Yiping, Kelly
author_facet Ke Yiping, Kelly
Bharadwaja, Tanay
format Final Year Project
author Bharadwaja, Tanay
author_sort Bharadwaja, Tanay
title Graph attention networks and approximate personalized propagation of neural prediction models for unsupervised graph representation learning
title_short Graph attention networks and approximate personalized propagation of neural prediction models for unsupervised graph representation learning
title_full Graph attention networks and approximate personalized propagation of neural prediction models for unsupervised graph representation learning
title_fullStr Graph attention networks and approximate personalized propagation of neural prediction models for unsupervised graph representation learning
title_full_unstemmed Graph attention networks and approximate personalized propagation of neural prediction models for unsupervised graph representation learning
title_sort graph attention networks and approximate personalized propagation of neural prediction models for unsupervised graph representation learning
publisher Nanyang Technological University
publishDate 2022
url https://hdl.handle.net/10356/156556
_version_ 1731235736006426624
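
The Deep Graph InfoMax training method named in the abstract pairs an encoder with a mutual-information objective: real node embeddings and a graph-level summary are scored as positive pairs by a discriminator, while embeddings of a corrupted (feature-shuffled) graph serve as negatives. The sketch below is a minimal NumPy toy of that objective only, not the project's code; the single-layer mean-aggregation encoder, the random weights, and the ReLU nonlinearity are simplifying assumptions standing in for the GAT or APPNP encoders studied in the thesis.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: symmetric adjacency with self-loops, plus random node features.
n, f, d = 6, 4, 8
A = np.eye(n) + (rng.random((n, n)) < 0.3)
A = np.minimum(A + A.T, 1.0)
X = rng.standard_normal((n, f))

W = rng.standard_normal((f, d)) * 0.1   # encoder weights (assumed, untrained)
B = rng.standard_normal((d, d)) * 0.1   # bilinear discriminator weights

def encode(feats):
    """One mean-aggregation propagation step followed by ReLU."""
    deg = A.sum(1, keepdims=True)
    return np.maximum((A / deg) @ feats @ W, 0.0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

H_pos = encode(X)                        # patch embeddings of the real graph
H_neg = encode(X[rng.permutation(n)])    # corruption: shuffle rows of X
s = sigmoid(H_pos.mean(0))               # readout -> graph summary vector

# Discriminator D(h, s) = sigmoid(h^T B s); DGI maximizes mutual information
# by minimizing this binary cross-entropy between positive and negative pairs.
pos = sigmoid(H_pos @ B @ s)
neg = sigmoid(H_neg @ B @ s)
loss = -(np.log(pos + 1e-9).mean() + np.log(1.0 - neg + 1e-9).mean())
print(float(loss))
```

In the setting the abstract describes, the encoder would be a GAT or APPNP model, the loss would be minimized by gradient descent, and the frozen embeddings would then be fed to a Logistic Regression classifier for evaluation; here the objective is evaluated once to show its shape.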