Improving tail-class representation with centroid contrastive learning

In the vision domain, large-scale natural datasets typically exhibit a long-tailed distribution with a large class imbalance between head and tail classes. This imbalance makes it difficult to learn good representations for tail classes. Recent work has shown that a good long-tailed model can be learned by decoupling training into representation learning and classifier balancing. However, these works give insufficient consideration to the effect of the long-tailed distribution on representation learning itself. In this work, we propose interpolative centroid contrastive learning (ICCL) to improve long-tailed representation learning. ICCL interpolates two images drawn from a class-agnostic sampler and a class-aware sampler, and trains the model such that the representation of the interpolated image can be used to retrieve the centroids of both source classes. We demonstrate the effectiveness of our approach on multiple long-tailed image classification benchmarks.
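The abstract describes a mixup-style interpolation between two sampled images and a contrastive objective against class centroids. Below is a minimal sketch of that idea, assuming a PyTorch setup; the names (encoder, centroids, lam, tau), the fixed interpolation weight, and the loss weighting are illustrative assumptions and not the authors' implementation.

```python
# Minimal sketch of an interpolative centroid contrastive loss, assuming:
#  - `encoder` maps images (B, C, H, W) to embeddings (B, D),
#  - `centroids` is a (num_classes, D) tensor of class centroids maintained elsewhere,
#  - one batch comes from a class-agnostic (instance-uniform) sampler and one from
#    a class-aware (class-balanced) sampler, as described in the abstract.
import torch
import torch.nn.functional as F

def iccl_loss(encoder, centroids, x_uniform, y_uniform, x_balanced, y_balanced,
              lam=0.7, tau=0.1):
    # Mixup-style pixel interpolation between the two sampled images.
    x_mix = lam * x_uniform + (1.0 - lam) * x_balanced

    # Embed the interpolated image; normalise embeddings and centroids for
    # cosine-similarity retrieval.
    z = F.normalize(encoder(x_mix), dim=1)      # (B, D)
    c = F.normalize(centroids, dim=1)           # (K, D)

    # Temperature-scaled similarity of each interpolated embedding to every centroid.
    logits = z @ c.t() / tau                    # (B, K)
    log_prob = F.log_softmax(logits, dim=1)

    # The interpolated representation should retrieve the centroids of *both*
    # source classes, weighted here by the interpolation coefficient (an assumption).
    loss = -(lam * log_prob.gather(1, y_uniform.unsqueeze(1)).squeeze(1)
             + (1.0 - lam) * log_prob.gather(1, y_balanced.unsqueeze(1)).squeeze(1))
    return loss.mean()
```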


Bibliographic Details
Main Authors: Tiong, Anthony Meng Huat; Li, Junnan; Lin, Guosheng; Li, Boyang; Xiong, Caiming; Hoi, Steven C. H.
Other Authors: School of Computer Science and Engineering
Format: Article
Language: English
Published: 2023
Subjects: Engineering::Computer science and engineering; Imbalanced Learning; Contrastive Learning
Online Access: https://hdl.handle.net/10356/172214
Institution: Nanyang Technological University
Citation: Tiong, A. M. H., Li, J., Lin, G., Li, B., Xiong, C. & Hoi, S. C. H. (2023). Improving tail-class representation with centroid contrastive learning. Pattern Recognition Letters, 168, 123-130. https://dx.doi.org/10.1016/j.patrec.2023.03.010
Journal: Pattern Recognition Letters
DOI: 10.1016/j.patrec.2023.03.010
ISSN: 0167-8655
Scopus: 2-s2.0-85150821315
Funding: Nanyang Technological University; National Research Foundation (NRF), Singapore. Anthony Meng Huat Tiong is supported by Salesforce and the Singapore Economic Development Board under the Industrial Postgraduate Programme. Boyang Li is supported by the Nanyang Associate Professorship and the National Research Foundation Fellowship (NRF-NRFF13-2021-0006), Singapore.
Rights: © 2023 Elsevier B.V. All rights reserved.