Attention graph neural network on heterogeneous information network

Graph Neural Network (GNN) is a powerful class of deep learning models for analysing graph data. There are two types of graphs: Homogeneous Information Networks and Heterogeneous Information Networks (HIN). In this project, I focus on researching HINs, which contain multiple types of...

Full description

Bibliographic Details
Main Author: Wang, Kexin
Other Authors: -
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2020
Subjects:
Online Access:https://hdl.handle.net/10356/139795
Institution: Nanyang Technological University
Description
Summary: Graph Neural Network (GNN) is a powerful class of deep learning models for analysing graph data. There are two types of graphs: Homogeneous Information Networks and Heterogeneous Information Networks (HIN). In this project, I focus on researching HINs, which contain multiple types of nodes and links in a graph. I study several different GNN models, including Metapath2vec [1], GraphSAGE [2], GCN [3], GAT [4], and HAN [5]. They use different mechanisms to extract node embeddings and perform classification. Generally speaking, in a GNN each node's embedding is extracted by aggregating feature information from the node's local neighbourhood. For node representation in HINs, additional mechanisms such as meta-paths and attention are introduced. In this project, I mainly study and analyse the features of the HAN model. It combines the meta-path mechanism with two levels of attention: node-level attention and semantic-level attention. Node-level attention differentiates the importance of different neighbour nodes, while semantic-level attention represents the importance of different meta-paths. I carried out several experiments on the HAN model. The experiments showed satisfying results that are consistent with the theory in the paper. Based on the results obtained, result analysis is performed: I analysed the features of the two levels of attention values and interpreted their meaning using the experiment data. To explain an inconsistency found in the result analysis, I also improved the model by modifying the way the semantic-level attention values are calculated.
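To make the semantic-level attention described above concrete, the following is a minimal NumPy sketch (not the project's actual code) of how HAN-style semantic attention can fuse per-meta-path node embeddings: each meta-path's embedding matrix is scored by a small one-layer scorer, the scores are normalised with a softmax into attention weights, and the final embedding is the weighted sum. All parameter shapes, names, and values here are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D array of scores.
    e = np.exp(x - x.max())
    return e / e.sum()

def semantic_attention(metapath_embeddings, w, b, q):
    """Illustrative HAN-style semantic-level attention.

    metapath_embeddings: list of (num_nodes, dim) arrays, one per meta-path.
    w, b, q: parameters of a one-layer scorer (hypothetical shapes).
    Returns the fused embedding and the per-meta-path attention weights.
    """
    scores = []
    for Z in metapath_embeddings:
        # Score a meta-path by averaging tanh(Z w + b) . q over all nodes.
        scores.append(np.mean(np.tanh(Z @ w + b) @ q))
    beta = softmax(np.array(scores))  # semantic attention weights, sum to 1
    fused = sum(b_p * Z for b_p, Z in zip(beta, metapath_embeddings))
    return fused, beta

# Toy usage with random data (shapes and values are made up for illustration).
rng = np.random.default_rng(0)
num_nodes, dim, hidden = 4, 8, 16
Z_list = [rng.normal(size=(num_nodes, dim)) for _ in range(3)]  # 3 meta-paths
w = rng.normal(size=(dim, hidden))
b = rng.normal(size=hidden)
q = rng.normal(size=hidden)
fused, beta = semantic_attention(Z_list, w, b, q)
```

A larger attention weight in `beta` marks a meta-path as more informative for the task, which is exactly the quantity the result analysis above interprets; in the real HAN model these weights are learned end-to-end by backpropagation rather than computed from fixed random parameters.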