Class is invariant to context and vice versa: On learning invariance for out-of-distribution generalization

Out-of-distribution (OOD) generalization is all about learning invariance against environmental changes. If the context in every class were evenly distributed, OOD would be trivial because the context could easily be removed due to an underlying principle: class is invariant to context. However, collecting such a balanced dataset is impractical. Learning on imbalanced data biases the model toward context and thus hurts OOD. Therefore, the key to OOD is context balance. We argue that the widely adopted assumption in prior work, that the context bias can be directly annotated or estimated from biased class prediction, renders the context incomplete or even incorrect. In contrast, we point out the ever-overlooked other side of the above principle: context is also invariant to class, which motivates us to consider the classes (which are already labeled) as the varying environments for resolving context bias (without context labels). We implement this idea by minimizing the contrastive loss of intra-class sample similarity while ensuring this similarity is invariant across all classes. On benchmarks with various context biases and domain gaps, we show that a simple re-weighting-based classifier equipped with our context estimation achieves state-of-the-art performance.
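
The objective described in the abstract, minimizing an intra-class contrastive similarity loss while keeping that loss invariant across classes (with the labeled classes acting as the "environments"), could be sketched roughly as follows. This is only an illustrative reading of the abstract, not the authors' implementation: the cosine-similarity form of the per-class loss, the variance-style invariance penalty, and the names intra_class_similarity_loss, class_invariant_context_loss, and lam are all assumptions.

    # Hedged sketch based only on the abstract; not the authors' released code.
    import torch
    import torch.nn.functional as F

    def intra_class_similarity_loss(feats: torch.Tensor) -> torch.Tensor:
        # Mean pairwise cosine dissimilarity among samples of one class:
        # small when same-class samples agree in feature space.
        z = F.normalize(feats, dim=1)                    # (n, d) unit-norm features
        sim = z @ z.t()                                  # (n, n) cosine similarities
        n = z.size(0)
        off_diag = sim[~torch.eye(n, dtype=torch.bool)]  # drop self-similarities
        return (1.0 - off_diag).mean()

    def class_invariant_context_loss(feats, labels, lam=1.0):
        # Treat each labeled class as an environment: average the per-class
        # contrastive loss and penalize its spread across classes
        # (an assumed variance-style invariance term).
        per_class = []
        for c in labels.unique():
            idx = labels == c
            if idx.sum() >= 2:                           # need at least one pair
                per_class.append(intra_class_similarity_loss(feats[idx]))
        per_class = torch.stack(per_class)
        invariance_penalty = ((per_class - per_class.mean()) ** 2).mean()
        return per_class.mean() + lam * invariance_penalty

    # Toy usage: random 128-d features for 16 samples over 4 classes.
    feats = torch.randn(16, 128, requires_grad=True)
    labels = torch.randint(0, 4, (16,))
    loss = class_invariant_context_loss(feats, labels)
    loss.backward()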

Bibliographic Details
Main Authors: QI, Jiaxin, TANG, Kaihua, SUN, Qianru, HUA, Xian-Sheng, ZHANG, Hanwang
Format: text (application/pdf)
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
DOI: 10.1007/978-3-031-19806-9_6
License: http://creativecommons.org/licenses/by-nc-nd/4.0/ (CC BY-NC-ND 4.0)
Collection: Research Collection School Of Computing and Information Systems
Subjects: Databases and Information Systems; Numerical Analysis and Scientific Computing
Online Access: https://ink.library.smu.edu.sg/sis_research/7514
https://ink.library.smu.edu.sg/context/sis_research/article/8517/viewcontent/ECCV2022__Class_Is_Invariant_to_Context_and_Vice_Versa__On_Learning_Invariance_forOut_Of_Distribution_Generalization__Camera_Ready__.pdf
Institution: Singapore Management University
Content Provider: SMU Libraries (InK@SMU)
Record ID: sg-smu-ink.sis_research-8517