Graph neural point process for temporal interaction prediction

Bibliographic Details
Main Authors: XIA, Wenwen, LI, Yuchen, LI, Shengdong
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2023
Online Access:https://ink.library.smu.edu.sg/sis_research/7547
https://ink.library.smu.edu.sg/context/sis_research/article/8550/viewcontent/09709121.pdf
Description
Summary: Temporal graphs are ubiquitous data structures in many scenarios, including social networks and user-item interaction networks. In this paper, we focus on predicting the exact time of the next interaction, given a node pair on a temporal graph. This novel problem can support interesting applications, such as time-sensitive item recommendation, congestion prediction on road networks, and many others. We present Graph Neural Point Process (GNPP) to tackle this problem. GNPP builds on graph neural message passing and the temporal point process framework. Most previous graph neural models only utilize the chronological order of observed events and ignore their exact timestamps. In GNPP, we adapt a time encoding scheme that maps real-valued timestamps to a high-dimensional vector space so that temporal information can be precisely captured. Further, GNPP captures the structural information of graphs through message-passing aggregation. The resulting representation defines a conditional intensity function that models the event generation mechanism and is used to predict future event times. We evaluate the model on synthetic and real-world datasets, where it outperforms recently proposed neural point process models and GNNs. We further conduct ablation comparisons and visualizations to shed light on the learned model and the functionality of its key components.
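
The abstract describes three components: a time encoding that maps real-valued timestamps to vectors, message-passing aggregation over a node's temporal neighbours, and a conditional intensity function defined on the resulting pair representation. The following is a minimal PyTorch sketch of how such pieces could fit together; the cosine time encoding, mean aggregation, softplus intensity, and all dimensions are illustrative assumptions, not the authors' implementation.

# Minimal sketch (assumed architecture, for illustration only) of the three
# ingredients described above: a functional time encoding, message-passing
# aggregation over temporal neighbours, and a conditional intensity function.
import torch
import torch.nn as nn


class TimeEncoder(nn.Module):
    """Map real-valued timestamps to a d-dimensional vector of learnable cosines."""

    def __init__(self, dim: int):
        super().__init__()
        self.freq = nn.Parameter(torch.randn(dim))
        self.phase = nn.Parameter(torch.zeros(dim))

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        # t: (...,) timestamps  ->  (..., dim) encoding
        return torch.cos(t.unsqueeze(-1) * self.freq + self.phase)


class GNPPSketch(nn.Module):
    def __init__(self, node_dim: int, time_dim: int, hidden: int):
        super().__init__()
        self.time_enc = TimeEncoder(time_dim)
        self.self_lin = nn.Linear(node_dim, hidden)
        self.msg_lin = nn.Linear(node_dim + time_dim, hidden)
        self.score = nn.Linear(2 * hidden, 1)
        self.decay = nn.Parameter(torch.tensor(0.1))

    def node_repr(self, x_self, x_neigh, t_neigh, t_now):
        # Encode the gap between each neighbour interaction and the current
        # time, build messages, and aggregate them by mean pooling.
        gap = self.time_enc(t_now - t_neigh)                        # (k, time_dim)
        msgs = torch.relu(self.msg_lin(torch.cat([x_neigh, gap], dim=-1)))
        return torch.relu(self.self_lin(x_self) + msgs.mean(dim=0))

    def intensity(self, h_u, h_v, dt):
        # Conditional intensity lambda(t_last + dt) for the node pair (u, v);
        # softplus keeps it positive.
        s = self.score(torch.cat([h_u, h_v], dim=-1)).squeeze(-1)
        return nn.functional.softplus(s + self.decay * dt)


# Toy usage: pair representations at time 2.5, then the intensity of the next
# interaction 0.3 time units later (all values are made up).
model = GNPPSketch(node_dim=8, time_dim=16, hidden=32)
x_u, x_v = torch.randn(8), torch.randn(8)
h_u = model.node_repr(x_u, torch.randn(3, 8), torch.tensor([0.5, 1.2, 2.0]),
                      t_now=torch.tensor(2.5))
h_v = model.node_repr(x_v, torch.randn(2, 8), torch.tensor([0.8, 1.9]),
                      t_now=torch.tensor(2.5))
lam = model.intensity(h_u, h_v, dt=torch.tensor(0.3))

In a full model, the learned intensity would be integrated (or sampled) over future time to produce the predicted time of the next interaction for the pair.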