Sharper generalisation bounds for pairwise learning
Pairwise learning refers to learning tasks with loss functions depending on a pair of training examples, which includes ranking and metric learning as specific examples. Recently, there has been an increasing amount of attention on the generalization analysis of pairwise learning to understand its practical behavior. However, the existing stability analysis provides suboptimal high-probability generalization bounds. In this paper, we provide a refined stability analysis by developing generalization bounds which can be √n-times faster than the existing results, where n is the sample size. This implies excess risk bounds of the order O(n^{-1/2}) (up to a logarithmic factor) for both regularized risk minimization and stochastic gradient descent. We also introduce a new on-average stability measure to develop optimistic bounds in a low noise setting. We apply our results to ranking and metric learning, and clearly show the advantage of our generalization bounds over the existing analysis.
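For context on the quantities the abstract refers to, the standard pairwise-learning setup can be written out as follows; the symbols w, ℓ, z_i, R and R_S are our own notation and are not taken from the record:

```latex
% Minimal sketch of the standard pairwise-learning setup (notation assumed, not from the paper).
% Population risk: expected loss over an independent pair of examples from the data distribution.
\[ R(w) = \mathbb{E}_{z, z'}\big[\ell(w; z, z')\big] \]
% Empirical risk: average loss over all pairs of distinct training examples z_1, ..., z_n.
\[ R_S(w) = \frac{1}{n(n-1)} \sum_{i \neq j} \ell(w; z_i, z_j) \]
% An "excess risk bound of order O(n^{-1/2}) up to a logarithmic factor" for the learned model w_S then reads
\[ R(w_S) - \min_{w} R(w) = O\big(n^{-1/2} \log n\big). \]
```

The record contains no code, so the following is only a minimal, self-contained sketch of what stochastic gradient descent on a pairwise loss can look like, using a toy metric-learning objective; the objective, step size and function name are illustrative assumptions, not the paper's algorithm.

```python
# Illustrative only: SGD on a pairwise loss for a toy metric-learning task.
import numpy as np

def pairwise_sgd(X, y, step_size=0.1, n_steps=1000, seed=0):
    """Run SGD where each step samples one pair (i, j), i != j, of training examples.

    The learned object is a linear map L; distances are measured as ||L x - L x'||^2.
    Same-label pairs are pulled together, different-label pairs are pushed apart
    via a hinge at margin 1.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    L = np.eye(d)  # start from the identity map
    for _ in range(n_steps):
        i, j = rng.choice(n, size=2, replace=False)  # uniformly random pair, i != j
        v = X[i] - X[j]
        Lv = L @ v
        dist2 = float(Lv @ Lv)
        if y[i] == y[j]:
            grad = 2.0 * np.outer(Lv, v)    # gradient of ||L v||^2 w.r.t. L
        elif dist2 < 1.0:
            grad = -2.0 * np.outer(Lv, v)   # gradient of max(0, 1 - ||L v||^2)
        else:
            grad = np.zeros_like(L)         # hinge inactive: zero gradient
        L -= step_size * grad
    return L
```

The point of the sketch is only the pair-sampling structure: each update depends on two training examples at once, which is the setting addressed by the stability analysis described in the abstract.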
Main Authors: LEI, Yunwen; LEDENT, Antoine; KLOFT, Marius
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2020
Subjects: Pairwise learning; Stochastic Gradient Descent; Stability analysis; Learning theory; Databases and Information Systems; Graphics and Human Computer Interfaces
Online Access: https://ink.library.smu.edu.sg/sis_research/7208 https://ink.library.smu.edu.sg/context/sis_research/article/8211/viewcontent/NeurIPS_2020_sharper_generalization_bounds_for_pairwise_learning_Paper.pdf
Institution: Singapore Management University
Language: English
id
sg-smu-ink.sis_research-8211
record_format
dspace
spelling
sg-smu-ink.sis_research-8211 2022-08-04T08:45:52Z Sharper generalisation bounds for pairwise learning LEI, Yunwen LEDENT, Antoine KLOFT, Marius Pairwise learning refers to learning tasks with loss functions depending on a pair of training examples, which includes ranking and metric learning as specific examples. Recently, there has been an increasing amount of attention on the generalization analysis of pairwise learning to understand its practical behavior. However, the existing stability analysis provides suboptimal high-probability generalization bounds. In this paper, we provide a refined stability analysis by developing generalization bounds which can be √n-times faster than the existing results, where n is the sample size. This implies excess risk bounds of the order O(n^{-1/2}) (up to a logarithmic factor) for both regularized risk minimization and stochastic gradient descent. We also introduce a new on-average stability measure to develop optimistic bounds in a low noise setting. We apply our results to ranking and metric learning, and clearly show the advantage of our generalization bounds over the existing analysis. 2020-12-01T08:00:00Z text application/pdf https://ink.library.smu.edu.sg/sis_research/7208 https://ink.library.smu.edu.sg/context/sis_research/article/8211/viewcontent/NeurIPS_2020_sharper_generalization_bounds_for_pairwise_learning_Paper.pdf http://creativecommons.org/licenses/by-nc-nd/4.0/ Research Collection School Of Computing and Information Systems eng Institutional Knowledge at Singapore Management University Pairwise learning Stochastic Gradient Descent Stability analysis Learning theory Databases and Information Systems Graphics and Human Computer Interfaces
institution
Singapore Management University
building
SMU Libraries
continent
Asia
country
Singapore
content_provider
SMU Libraries
collection
InK@SMU
language
English
topic
Pairwise learning; Stochastic Gradient Descent; Stability analysis; Learning theory; Databases and Information Systems; Graphics and Human Computer Interfaces
spellingShingle
Pairwise learning; Stochastic Gradient Descent; Stability analysis; Learning theory; Databases and Information Systems; Graphics and Human Computer Interfaces; LEI, Yunwen; LEDENT, Antoine; KLOFT, Marius; Sharper generalisation bounds for pairwise learning
description
Pairwise learning refers to learning tasks with loss functions depending on a pair of training examples, which includes ranking and metric learning as specific examples. Recently, there has been an increasing amount of attention on the generalization analysis of pairwise learning to understand its practical behavior. However, the existing stability analysis provides suboptimal high-probability generalization bounds. In this paper, we provide a refined stability analysis by developing generalization bounds which can be √n-times faster than the existing results, where n is the sample size. This implies excess risk bounds of the order O(n^{-1/2}) (up to a logarithmic factor) for both regularized risk minimization and stochastic gradient descent. We also introduce a new on-average stability measure to develop optimistic bounds in a low noise setting. We apply our results to ranking and metric learning, and clearly show the advantage of our generalization bounds over the existing analysis.
format
text
author
LEI, Yunwen; LEDENT, Antoine; KLOFT, Marius
author_facet
LEI, Yunwen; LEDENT, Antoine; KLOFT, Marius
author_sort
LEI, Yunwen
title
Sharper generalisation bounds for pairwise learning
title_short
Sharper generalisation bounds for pairwise learning
title_full
Sharper generalisation bounds for pairwise learning
title_fullStr
Sharper generalisation bounds for pairwise learning
title_full_unstemmed
Sharper generalisation bounds for pairwise learning
title_sort
sharper generalisation bounds for pairwise learning
publisher
Institutional Knowledge at Singapore Management University
publishDate
2020
url
https://ink.library.smu.edu.sg/sis_research/7208 https://ink.library.smu.edu.sg/context/sis_research/article/8211/viewcontent/NeurIPS_2020_sharper_generalization_bounds_for_pairwise_learning_Paper.pdf
_version_
1770576270211416064