Orthogonal inductive matrix completion


Full Description

Bibliographic Details
Main Authors: LEDENT, Antoine, ALVES, Rodrigo, KLOFT, Marius
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2021
Subjects:
Online Access: https://ink.library.smu.edu.sg/sis_research/7197
https://ink.library.smu.edu.sg/context/sis_research/article/8200/viewcontent/2004.01653.pdf
Description
Summary: We propose orthogonal inductive matrix completion (OMIC), an interpretable approach to matrix completion based on a sum of multiple orthonormal side information terms, together with nuclear-norm regularization. The approach allows us to inject prior knowledge about the singular vectors of the ground-truth matrix. We fit the model with a provably convergent algorithm that optimizes all components simultaneously. We study the generalization capabilities of our method in both the distribution-free setting and in the case where the sampling distribution admits uniform marginals, yielding learning guarantees that improve with the quality of the injected knowledge in both cases. As particular cases of our framework, we present models that can incorporate user and item biases or community information in a joint and additive fashion. We analyze the performance of OMIC on several synthetic and real datasets. On synthetic datasets with a sliding scale of user bias relevance, we show that OMIC adapts to the different regimes better than other methods. On real-life datasets containing user/item recommendations and relevant side information, we find that OMIC surpasses the state of the art, with the added benefit of greater interpretability.
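The summary describes the model as a sum of components, each built from orthonormal side-information bases and regularized by the nuclear norm. The following is a minimal NumPy sketch of that idea, not the authors' provably convergent algorithm: it fits one core matrix per pair of side-information bases by proximal gradient steps with singular-value thresholding. All names (omic_sketch, svt), the step size, and the toy data are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def svt(A, tau):
    """Singular value thresholding: proximal operator of tau * nuclear norm."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return (U * np.maximum(s - tau, 0.0)) @ Vt

def omic_sketch(M_obs, mask, sides_u, sides_v, lam=0.1, step=0.5, n_iter=200):
    """Fit cores Z_k so that sum_k X_k @ Z_k @ Y_k.T matches the observed
    entries of M_obs, with a nuclear-norm penalty on each core.
    sides_u / sides_v are lists of orthonormal column bases (X_k, Y_k)."""
    cores = [np.zeros((X.shape[1], Y.shape[1])) for X, Y in zip(sides_u, sides_v)]
    for _ in range(n_iter):
        pred = sum(X @ Z @ Y.T for X, Z, Y in zip(sides_u, cores, sides_v))
        resid = mask * (pred - M_obs)              # error on observed entries only
        for k, (X, Y) in enumerate(zip(sides_u, sides_v)):
            grad = X.T @ resid @ Y                 # gradient w.r.t. the k-th core
            cores[k] = svt(cores[k] - step * grad, step * lam)
    return cores

# Toy usage: one component from known (orthonormalized) user/item features,
# plus a free identity component that can absorb what the features miss.
rng = np.random.default_rng(0)
n_users, n_items, d = 30, 20, 5
X_feat, _ = np.linalg.qr(rng.normal(size=(n_users, d)))   # orthonormal user features
Y_feat, _ = np.linalg.qr(rng.normal(size=(n_items, d)))   # orthonormal item features
ground_truth = X_feat @ rng.normal(size=(d, d)) @ Y_feat.T
mask = rng.random((n_users, n_items)) < 0.5                # observed entries
M_obs = mask * ground_truth

sides_u = [X_feat, np.eye(n_users)]
sides_v = [Y_feat, np.eye(n_items)]
cores = omic_sketch(M_obs, mask, sides_u, sides_v)
completed = sum(X @ Z @ Y.T for X, Z, Y in zip(sides_u, cores, sides_v))
print("RMSE on held-out entries:",
      np.sqrt(np.mean((completed - ground_truth)[~mask] ** 2)))
```

In this toy setup, when the side information explains the ground truth well, most of the signal is captured by the small feature-based core, which is what makes the decomposition interpretable; the identity component plays the role of a residual low-rank term.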