Generalization bounds for inductive matrix completion in low-noise settings
We study inductive matrix completion (matrix completion with side information) under an i.i.d. subgaussian noise assumption in a low-noise regime, with uniform sampling of the entries. We obtain for the first time generalization bounds with the following three properties: (1) they scale like the standard deviation of the noise and in particular approach zero in the exact recovery case; (2) even in the presence of noise, they converge to zero as the sample size approaches infinity; and (3) for a fixed dimension of the side information, they have only a logarithmic dependence on the size of the matrix. In contrast to many works on approximate recovery, we present results both for bounded Lipschitz losses and for the absolute loss, with the latter relying on Talagrand-type inequalities. The proofs create a bridge between two approaches to the theoretical analysis of matrix completion, since they combine techniques from both the exact recovery literature and the approximate recovery literature.
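For readers unfamiliar with the setting, the following is a minimal sketch of the standard inductive matrix completion observation model that the abstract refers to. The notation (feature matrices X and Y, core matrix A, noise ξ, noise level σ, sample size N) is illustrative and not taken from the paper, which should be consulted for the precise assumptions, estimator, and bounds.

```latex
% Minimal sketch of the standard inductive matrix completion model with side
% information (illustrative notation only; not the paper's exact estimator or bounds).
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Row and column side information are given as feature matrices
$X \in \mathbb{R}^{n \times d}$ and $Y \in \mathbb{R}^{m \times d}$, and the
ground-truth matrix is assumed to factor through them via a core matrix $A$:
\[
  M^{*} = X A Y^{\top}, \qquad A \in \mathbb{R}^{d \times d}.
\]
Entries are sampled uniformly at random and observed under i.i.d.\ subgaussian
noise $\xi_t$ with standard deviation $\sigma$ (exact recovery corresponds to
$\sigma = 0$):
\[
  z_t = M^{*}_{i_t j_t} + \xi_t, \qquad t = 1, \dots, N.
\]
The bounds described in the abstract scale with $\sigma$, vanish as
$N \to \infty$, and depend only logarithmically on the matrix dimensions
$n, m$ for fixed side-information dimension $d$.
\end{document}
```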
Main Authors: LEDENT, Antoine; ALVES, Rodrigo; LEI, Yunwen; GUERMEUR, Yann; KLOFT, Marius
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2023
Subjects: Recommender Systems; Matrix Completion; Learning Theory; Artificial Intelligence and Robotics; Databases and Information Systems
Online Access: https://ink.library.smu.edu.sg/sis_research/7951
https://ink.library.smu.edu.sg/context/sis_research/article/8954/viewcontent/26018_Article_Text_30081_1_2_20230626.pdf
Institution: Singapore Management University
id: sg-smu-ink.sis_research-8954
record_format: dspace
spelling: sg-smu-ink.sis_research-8954 2023-08-15T01:21:37Z; 2023-02-01T08:00:00Z; text; application/pdf; https://ink.library.smu.edu.sg/sis_research/7951; info:doi/10.1609/aaai.v37i7.26018; https://ink.library.smu.edu.sg/context/sis_research/article/8954/viewcontent/26018_Article_Text_30081_1_2_20230626.pdf; http://creativecommons.org/licenses/by-nc-nd/4.0/; Research Collection School Of Computing and Information Systems; eng; Institutional Knowledge at Singapore Management University
institution: Singapore Management University
building: SMU Libraries
continent: Asia
country: Singapore
content_provider: SMU Libraries
collection: InK@SMU
language: English
topic: Recommender Systems; Matrix Completion; Learning Theory; Artificial Intelligence and Robotics; Databases and Information Systems
format: text
author: LEDENT, Antoine; ALVES, Rodrigo; LEI, Yunwen; GUERMEUR, Yann; KLOFT, Marius
author_sort: LEDENT, Antoine
title: Generalization bounds for inductive matrix completion in low-noise settings
title_sort: generalization bounds for inductive matrix completion in low-noise settings
publisher: Institutional Knowledge at Singapore Management University
publishDate: 2023
url: https://ink.library.smu.edu.sg/sis_research/7951 https://ink.library.smu.edu.sg/context/sis_research/article/8954/viewcontent/26018_Article_Text_30081_1_2_20230626.pdf
_version_: 1779156904347435008