Contextualized graph attention network for recommendation with item knowledge graph

Full Description

Bibliographic Details
Main Authors: Liu, Yong, Yang, Susen, Xu, Yonghui, Miao, Chunyan, Wu, Min, Zhang, Juyong
Other Authors: School of Computer Science and Engineering
Format: Article
Language: English
Published: 2022
Subjects:
Online Access: https://hdl.handle.net/10356/156036
Description
Summary: Graph neural networks (GNN) have recently been applied to exploit knowledge graphs (KG) for recommendation. Existing GNN-based methods explicitly model the dependency between an entity and its local graph context in the KG (i.e., the set of its first-order neighbors), but may not be effective in capturing its non-local graph context (i.e., the set of most related high-order neighbors). In this paper, we propose a novel recommendation framework, named Contextualized Graph Attention Network (CGAT), which can explicitly exploit both local and non-local graph context information of an entity in the KG. More specifically, CGAT captures the local context information by a user-specific graph attention mechanism, considering a user's personalized preferences on entities. In addition, CGAT employs a biased random walk sampling process to extract the non-local context of an entity, and utilizes a Recurrent Neural Network (RNN) to model the dependency between the entity and its non-local contextual entities. To capture the user's personalized preferences on items, an item-specific attention mechanism is also developed to model the dependency between a target item and the contextual items extracted from the user's historical behaviors. We compared CGAT with state-of-the-art KG-based recommendation methods on real datasets, and the experimental results demonstrate the effectiveness of CGAT.
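The two core mechanisms in the abstract can be illustrated with a minimal sketch: a user-conditioned attention aggregation over an entity's first-order neighbors (the local context), and a biased random walk that samples higher-order neighbors (the non-local context). This is an assumption-laden toy, not the paper's implementation: the relevance scores here are plain dot products and the walk bias is a simple degree heuristic, whereas CGAT learns both.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    # Numerically stable softmax over a 1-D score vector.
    e = np.exp(x - x.max())
    return e / e.sum()

def user_specific_attention(user, entity, neighbors):
    """Aggregate an entity's first-order neighbors (local context),
    weighting each neighbor by a user-conditioned relevance score.
    Hypothetical scoring: dot product of the user embedding with the
    element-wise product of entity and neighbor embeddings."""
    scores = np.array([user @ (entity * nbr) for nbr in neighbors])
    weights = softmax(scores)
    context = (weights[:, None] * np.stack(neighbors)).sum(axis=0)
    return weights, context

def biased_random_walk(adj, start, length):
    """Sample a fixed-length walk from `start` over adjacency dict `adj`.
    Stand-in bias: prefer higher-degree neighbors; CGAT's actual bias
    is defined in the paper, not reproduced here."""
    walk = [start]
    for _ in range(length - 1):
        nbrs = adj[walk[-1]]
        probs = np.array([len(adj[n]) for n in nbrs], dtype=float)
        probs /= probs.sum()
        walk.append(int(rng.choice(nbrs, p=probs)))
    return walk

# Toy usage with random embeddings and a 4-node graph.
d = 4
user, entity = rng.normal(size=d), rng.normal(size=d)
neighbors = [rng.normal(size=d) for _ in range(3)]
weights, context = user_specific_attention(user, entity, neighbors)

adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1], 3: [1]}
walk = biased_random_walk(adj, start=0, length=5)
```

In the full model, the walk's entity sequence would then be fed to an RNN to encode the non-local context, and an analogous item-specific attention would weight items from the user's history against the target item.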