A decoupled learning framework for contrastive learning

Contrastive Learning (CL) has attracted much attention in recent years because various self-supervised models based on CL achieve performance comparable to supervised models. Nevertheless, most CL frameworks require a large batch size during training so that more negative samples are taken into account, which boosts performance. Meanwhile, a large model size limits the feasible batch size under fixed device memory. To solve this problem, we propose a Decoupled Updating Contrastive Learning (DUCL) framework that 1) divides a single model into pieces to shrink the model footprint on each accelerator device and 2) decouples every batch in CL to save memory. Combining the two enables a larger negative sample space, with which contrastive learning models achieve better performance. As a result, we demonstrate the effectiveness of large batch sizes and reduce memory consumption by up to 43% in our experiments. With our learning method, a contrastive learning model can be trained on a larger negative sample space, and thus improve its performance, without any change to the model structure.
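
The abstract above gives no implementation details, so the following PyTorch sketch is only illustrative of the two ideas it names; it is not the thesis's actual DUCL code. It shows why batch size matters in contrastive learning (in a SimCLR-style InfoNCE loss, every other sample in the batch acts as a negative) and one known way to decouple a large batch into memory-friendly chunks: a two-pass, gradient-caching-style update that reproduces the full-batch gradient while holding only one chunk's activation graph at a time. All function names, the chunk size, and the encoder interface are assumptions for illustration.

    import torch
    import torch.nn.functional as F

    def info_nce(z1, z2, temperature=0.1):
        # z1, z2: L2-normalised embeddings of two augmented views, shape (N, D).
        # Every non-matching sample in the concatenated batch serves as a
        # negative, so the negative sample space grows with the batch size N.
        n = z1.size(0)
        z = torch.cat([z1, z2], dim=0)                 # (2N, D)
        sim = z @ z.t() / temperature                  # pairwise similarities
        mask = torch.eye(2 * n, dtype=torch.bool, device=sim.device)
        sim = sim.masked_fill(mask, float('-inf'))     # exclude self-pairs
        # the positive for row i is its other view, at index (i + N) mod 2N
        targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(sim.device)
        return F.cross_entropy(sim, targets)

    def decoupled_step(encoder, optimizer, views1, views2, chunk=64):
        # Pass 1: embed the batch chunk by chunk WITHOUT autograd graphs, so
        # the full-batch loss (and its full negative space) is formed while
        # no activation graphs are held in memory.
        with torch.no_grad():
            z1 = torch.cat([F.normalize(encoder(c), dim=1)
                            for c in views1.split(chunk)])
            z2 = torch.cat([F.normalize(encoder(c), dim=1)
                            for c in views2.split(chunk)])
        z1.requires_grad_(True)
        z2.requires_grad_(True)
        loss = info_nce(z1, z2)
        loss.backward()                    # gradients w.r.t. embeddings only
        # Pass 2: re-embed one chunk at a time WITH a graph and inject the
        # cached embedding gradients; parameter gradients accumulate, so only
        # one chunk's activations occupy memory at any moment.
        optimizer.zero_grad()
        for views, grads in ((views1, z1.grad), (views2, z2.grad)):
            for v, g in zip(views.split(chunk), grads.split(chunk)):
                F.normalize(encoder(v), dim=1).backward(g)
        optimizer.step()
        return loss.item()

Setting chunk equal to the batch size recovers the ordinary single-pass update; smaller chunks trade extra forward passes for lower peak memory, which mirrors the memory-versus-compute trade-off behind the savings the abstract reports. The sketch also assumes the encoder treats chunks independently (stateful layers such as BatchNorm would make chunked and full-batch passes differ).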

Bibliographic Details
Main Author: Xu, Yicheng
Other Authors: Lin, Zhiping (School of Electrical and Electronic Engineering)
Format: Thesis-Master by Coursework
Degree: Master of Science (Computer Control and Automation)
Language: English
Published: Nanyang Technological University, 2022
Subjects: Engineering::Electrical and electronic engineering::Computer hardware, software and systems
Citation: Xu, Y. (2022). A decoupled learning framework for contrastive learning. Master's thesis, Nanyang Technological University, Singapore.
Online Access: https://hdl.handle.net/10356/163711