Semi-supervised learning with graph convolutional networks
Deep learning has achieved unprecedented performance on a broad range of problems involving data in Euclidean space, such as 2-D images in object recognition and 1-D sequences of text in machine translation. The availability of new datasets in non-Euclidean domains, such as social networks and 3-D point clouds, has spurred recent efforts to generalise deep neural networks to graphs. In this report, we present the first comparative study of Graph Convolutional Networks (GCNs), Residual Gated Graph ConvNets (RGGCNs) and Graph Attention Networks (GATs) on two fundamental tasks in network science, semi-supervised classification and semi-supervised clustering, and analyse their experimental performance. We improve the existing capabilities of GATs by increasing the number of graph attention layers, and of RGGCNs by reducing the number of learnable parameters and using edge gate normalization. We also introduce edge dropin, a novel regularization method that adds edge-level noise to the input graph. Our final RGGCN model comes within 1% of GCN's test accuracy on the Cora dataset, and our final GAT model comes within 5% of RGGCN's test accuracy on a semi-supervised clustering dataset generated with the stochastic block model.
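The abstract refers to the standard GCN propagation rule and to edge dropin, described only as adding edge-level noise to the graph. The sketch below illustrates both under stated assumptions; it is not the implementation from the report. The names (`normalize_adjacency`, `edge_dropin`, `GCNLayer`), the dense adjacency representation, and the exact dropin scheme (randomly switching on a small fraction of node pairs as extra edges at training time) are assumptions made for illustration.

```python
# Minimal sketch of a GCN layer (Kipf & Welling-style propagation) plus an
# assumed "edge dropin" augmentation that injects random extra edges as noise.
import torch
import torch.nn as nn


def normalize_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2} used by GCNs."""
    adj_hat = adj + torch.eye(adj.size(0))          # add self-loops
    deg = adj_hat.sum(dim=1)                        # node degrees
    d_inv_sqrt = deg.pow(-0.5)
    d_inv_sqrt[torch.isinf(d_inv_sqrt)] = 0.0       # guard isolated nodes
    return d_inv_sqrt.unsqueeze(1) * adj_hat * d_inv_sqrt.unsqueeze(0)


def edge_dropin(adj: torch.Tensor, p: float = 0.01) -> torch.Tensor:
    """Assumed dropin scheme: turn on roughly a fraction p of node pairs as noisy edges."""
    noise = (torch.rand_like(adj) < p).float()
    noise = torch.triu(noise, diagonal=1)           # no self-loops from noise
    noise = noise + noise.t()                       # keep the graph undirected
    return torch.clamp(adj + noise, max=1.0)        # union of real and noisy edges


class GCNLayer(nn.Module):
    """One graph convolution: H' = ReLU(A_norm @ H @ W)."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, h: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        return torch.relu(adj_norm @ self.linear(h))


# Toy usage: a random undirected graph with 5 nodes and 8 input features.
adj = (torch.rand(5, 5) < 0.3).float()
adj = torch.triu(adj, diagonal=1)
adj = adj + adj.t()
h = torch.randn(5, 8)
adj_train = normalize_adjacency(edge_dropin(adj, p=0.05))  # dropin only at train time
layer = GCNLayer(8, 16)
print(layer(h, adj_train).shape)                    # torch.Size([5, 16])
```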
Main Author: Ong, Jia Rui
Other Authors: Xavier Bresson; School of Computer Science and Engineering
Format: Final Year Project (FYP)
Language: English
Published: 2019
Degree: Bachelor of Engineering (Computer Science)
Subjects: DRNTU::Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence
Physical Description: 24 p. (application/pdf)
Online Access: http://hdl.handle.net/10356/76922
Institution: Nanyang Technological University