Fine-grained analysis of structured output prediction

In machine learning we often encounter structured output prediction problems (SOPPs), i.e., problems where the output space admits a rich internal structure. Application domains where SOPPs naturally occur include natural language processing, speech recognition, and computer vision. Typical SOPPs have an extremely large label set, which grows exponentially as a function of the size of the output. Existing generalization analyses imply generalization bounds with at least a square-root dependency on the cardinality d of the label set, which can be vacuous in practice. In this paper, we significantly improve the state of the art by developing novel high-probability bounds with a logarithmic dependency on d. Furthermore, we leverage the lens of algorithmic stability to develop generalization bounds in expectation without any dependency on d. Our results therefore build a solid theoretical foundation for learning in large-scale SOPPs. Finally, we extend our results to learning with weakly dependent data.
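To see why a square-root dependency on d is problematic while a logarithmic one is benign, consider a sequence-to-sequence setting in which the label set consists of all output sequences of length L over a vocabulary of size V, so that d = V^L. The short Python sketch below compares the two scaling factors; the values of V and L are illustrative assumptions, not numbers taken from the paper.

    import math

    # Illustrative sequence-to-sequence setting (assumed values, not from the paper):
    # output sequences of length L over a vocabulary of size V give a label set of
    # cardinality d = V**L, i.e. exponential in the size of the output.
    V = 10_000   # vocabulary size (assumption)
    L = 20       # output sequence length (assumption)

    # Work in log-space so that d never has to be formed explicitly.
    log_d = L * math.log(V)            # log(d) = L * log(V)
    sqrt_d = math.exp(0.5 * log_d)     # sqrt(d) = exp(log(d) / 2)

    print(f"log(d)  ~ {log_d:.1f}")    # about 184: a logarithmic factor stays small
    print(f"sqrt(d) ~ {sqrt_d:.2e}")   # about 1e40: a square-root factor is vacuous

A bound whose complexity term is multiplied by sqrt(d) is meaningless at this scale, whereas a log(d) factor contributes only a modest constant.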

Bibliographic Details
Main Authors: MUSTAFA, Waleed, LEI, Yunwen, LEDENT, Antoine, and KLOFT, Marius
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2021
Subjects: Structured output prediction; multi-label; Neural Networks; Multi-class; sequence-to-sequence; Stochastic Gradient Descent; Artificial Intelligence and Robotics; Graphics and Human Computer Interfaces
Online Access:https://ink.library.smu.edu.sg/sis_research/7207
https://ink.library.smu.edu.sg/context/sis_research/article/8210/viewcontent/Structured_output.pdf
Institution: Singapore Management University
Collection: Research Collection School Of Computing and Information Systems
DOI: 10.24963/ijcai.2021/391
License: http://creativecommons.org/licenses/by-nc-nd/4.0/