Deep graph-level anomaly detection by glocal knowledge distillation
Graph-level anomaly detection (GAD) describes the problem of detecting graphs that are abnormal in their structure and/or the features of their nodes, as compared to other graphs. One of the challenges in GAD is to devise graph representations that enable the detection of both locally- and globally-anomalous graphs, i.e., graphs that are abnormal in their fine-grained (node-level) or holistic (graph-level) properties, respectively. To tackle this challenge we introduce a novel deep anomaly detection approach for GAD that learns rich global and local normal pattern information by joint random distillation of graph and node representations. The random distillation is achieved by training one GNN to predict another GNN with randomly initialized network weights. Extensive experiments on 16 real-world graph datasets from diverse domains show that our model significantly outperforms seven state-of-the-art models. Code and datasets are available at https://git.io/GLocalKD.
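The random-distillation idea in the abstract can be sketched in a few lines of PyTorch. The snippet below is only an illustrative sketch, not the authors' GLocalKD implementation (available at https://git.io/GLocalKD): the `SimpleGNN` encoder, the dense-adjacency message passing, the mean pooling, and the equal weighting of node-level and graph-level errors are all simplifying assumptions made here.

```python
# Minimal sketch of "glocal" random knowledge distillation for graph-level
# anomaly detection, loosely following the abstract above. This is NOT the
# authors' GLocalKD code (see https://git.io/GLocalKD); the encoder, pooling
# and loss weighting below are simplifying assumptions for illustration only.
import torch
import torch.nn as nn


class SimpleGNN(nn.Module):
    """Two-layer message-passing encoder over a dense adjacency matrix."""

    def __init__(self, in_dim, hid_dim, out_dim):
        super().__init__()
        self.lin1 = nn.Linear(in_dim, hid_dim)
        self.lin2 = nn.Linear(hid_dim, out_dim)

    def forward(self, adj, x):
        # adj: (n, n) adjacency with self-loops, x: (n, in_dim) node features
        h = torch.relu(self.lin1(adj @ x))
        node_repr = self.lin2(adj @ h)        # local (node-level) representations
        graph_repr = node_repr.mean(dim=0)    # global (graph-level) representation
        return node_repr, graph_repr


def distillation_loss(predictor, target, adj, x):
    """Train `predictor` to match a frozen, randomly initialized `target`
    network at both the node level and the graph level."""
    with torch.no_grad():
        t_node, t_graph = target(adj, x)
    p_node, p_graph = predictor(adj, x)
    local_err = ((p_node - t_node) ** 2).sum(dim=1).mean()  # node-level error
    global_err = ((p_graph - t_graph) ** 2).sum()           # graph-level error
    return local_err + global_err


def anomaly_score(predictor, target, adj, x):
    """Graphs that the distilled predictor reproduces poorly (locally or
    globally) receive high scores at test time."""
    with torch.no_grad():
        return distillation_loss(predictor, target, adj, x).item()


if __name__ == "__main__":
    torch.manual_seed(0)
    target = SimpleGNN(8, 32, 16)        # random weights, never trained
    predictor = SimpleGNN(8, 32, 16)     # distilled to mimic the target
    for p in target.parameters():
        p.requires_grad_(False)
    opt = torch.optim.Adam(predictor.parameters(), lr=1e-3)

    # Toy random graphs standing in for the (assumed mostly normal) training set.
    graphs = []
    for _ in range(32):
        n = int(torch.randint(5, 15, (1,)))
        a = (torch.rand(n, n) < 0.3).float()
        adj = ((a + a.T + torch.eye(n)) > 0).float()
        graphs.append((adj, torch.randn(n, 8)))

    for epoch in range(5):
        for adj, x in graphs:
            opt.zero_grad()
            distillation_loss(predictor, target, adj, x).backward()
            opt.step()

    adj, x = graphs[0]
    print("anomaly score:", anomaly_score(predictor, target, adj, x))
```

Because the predictor is fitted only on the training graphs, graphs whose node-level or graph-level representations it fails to reproduce at test time, i.e. locally or globally unusual graphs, receive high anomaly scores.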
Saved in:
| Main Authors: | MA, Rongrong; PANG, Guansong; CHEN, Ling; HENGEL, Anton Van Den |
|---|---|
| Format: | text |
| Language: | English |
| Published: | Institutional Knowledge at Singapore Management University, 2022 |
| Subjects: | Graph-level anomaly detection; Graph neural networks; Knowledge distillation; Deep learning; Artificial Intelligence and Robotics; Graphics and Human Computer Interfaces |
| Online Access: | https://ink.library.smu.edu.sg/sis_research/7054 ; https://ink.library.smu.edu.sg/context/sis_research/article/8057/viewcontent/3488560.3498473.pdf |
| Institution: | Singapore Management University |
| id | sg-smu-ink.sis_research-8057 |
|---|---|
| record_format | dspace |
| institution | Singapore Management University |
| building | SMU Libraries |
| continent | Asia |
| country | Singapore |
| content_provider | SMU Libraries |
| collection | InK@SMU (Research Collection School Of Computing and Information Systems) |
| language | English |
| topic | Graph-level anomaly detection; Graph neural networks; Knowledge distillation; Deep learning; Artificial Intelligence and Robotics; Graphics and Human Computer Interfaces |
| description | Graph-level anomaly detection (GAD) describes the problem of detecting graphs that are abnormal in their structure and/or the features of their nodes, as compared to other graphs. One of the challenges in GAD is to devise graph representations that enable the detection of both locally- and globally-anomalous graphs, i.e., graphs that are abnormal in their fine-grained (node-level) or holistic (graph-level) properties, respectively. To tackle this challenge we introduce a novel deep anomaly detection approach for GAD that learns rich global and local normal pattern information by joint random distillation of graph and node representations. The random distillation is achieved by training one GNN to predict another GNN with randomly initialized network weights. Extensive experiments on 16 real-world graph datasets from diverse domains show that our model significantly outperforms seven state-of-the-art models. Code and datasets are available at https://git.io/GLocalKD. |
| format | text (application/pdf) |
| author | MA, Rongrong; PANG, Guansong; CHEN, Ling; HENGEL, Anton Van Den |
| author_sort | MA, Rongrong |
| title | Deep graph-level anomaly detection by glocal knowledge distillation |
| publisher | Institutional Knowledge at Singapore Management University |
| publishDate | 2022-02-01 |
| doi | info:doi/10.1145/3488560.3498473 |
| license | http://creativecommons.org/licenses/by-nc-nd/4.0/ |
| url | https://ink.library.smu.edu.sg/sis_research/7054 ; https://ink.library.smu.edu.sg/context/sis_research/article/8057/viewcontent/3488560.3498473.pdf |
| _version_ | 1770576205259472896 |