A general view for network embedding as matrix factorization


Overview

Bibliographic Details
Main Authors: Xin Liu, Tsuyoshi Murata, Kyoung Sook Kim, Chatchawan Kotarasu, Chenyi Zhuang
Other Authors: Tokyo Institute of Technology
Format: Conference or Workshop Item
Published: 2020
Subjects:
Online Access: https://repository.li.mahidol.ac.th/handle/123456789/50652
Institution: Mahidol University
Description
Summary: © 2019 Association for Computing Machinery. We propose a general view that demonstrates the relationship between network embedding approaches and matrix factorization. Unlike previous works that establish this equivalence from a skip-gram model perspective, we provide a more fundamental connection from an optimization (objective function) perspective. We demonstrate that matrix factorization is equivalent to optimizing two objectives: one brings together the embeddings of similar nodes; the other separates the embeddings of distant nodes. The matrix to be factorized has a general form: S − β·1. The elements of S indicate pairwise node similarities; they can be based on any user-defined similarity/distance measure or learned from random walks on networks. The shift number β is related to a parameter that balances the two objectives. More importantly, the resulting embeddings are sensitive to β, and we can improve the embeddings by tuning β. Experiments show that matrix factorization based on a newly proposed similarity measure and a β-tuning strategy significantly outperforms existing matrix factorization approaches on a range of benchmark networks.
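To make the shifted-matrix idea concrete, the sketch below (not the authors' code) factorizes S − β·1 with a truncated SVD and sweeps β to show that the embeddings change with the shift. The choice of S as a row-normalized adjacency matrix, the toy graph, and the SVD-based factorization are illustrative assumptions; the paper's own similarity measure and β-tuning strategy are not reproduced here.

```python
# Minimal sketch, assuming S is a row-normalized adjacency matrix and the
# factorization is done by truncated SVD (illustrative choices, not the
# authors' implementation).
import numpy as np

def embed(S: np.ndarray, beta: float, dim: int) -> np.ndarray:
    """Rank-`dim` embedding from factorizing S - beta * 1 (1 = all-ones matrix)."""
    M = S - beta * np.ones_like(S)            # shifted matrix to factorize
    U, sigma, _ = np.linalg.svd(M)            # full SVD; truncate below
    return U[:, :dim] * np.sqrt(sigma[:dim])  # scaled singular vectors as embeddings

# Toy 4-node graph (two loosely connected pairs), purely for illustration.
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 0],
              [1, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)
S = A / A.sum(axis=1, keepdims=True)          # illustrative pairwise similarity

# The shift beta is a tunable hyperparameter; the embeddings are sensitive to it.
for beta in (0.0, 0.1, 0.5):
    X = embed(S, beta, dim=2)
    print(f"beta={beta}: embedding of node 0 -> {np.round(X[0], 3)}")
```

In a real experiment one would choose S from a richer similarity measure (e.g., learned from random walks) and select β by validating the embeddings on a downstream task, which is the role the paper's β-tuning strategy plays.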