Title: A decoupled learning framework for contrastive learning
Authors: Xu, Yicheng
Keywords: Engineering::Electrical and electronic engineering::Computer hardware, software and systems
Issue Date: 2022
Publisher: Nanyang Technological University
Source: Xu, Y. (2022). A decoupled learning framework for contrastive learning. Master's thesis, Nanyang Technological University, Singapore.
Abstract: Contrastive Learning (CL) has attracted much attention in recent years because various self-supervised models based on CL achieve performance comparable to supervised models. Nevertheless, most CL frameworks require a large batch size during training to take more negative samples into account and boost performance, while the large model size limits the training batch size under a fixed device memory budget. To solve this problem, we propose a Decoupled Updating Contrastive Learning (DUCL) framework that 1) divides a single model into pieces to shrink the model size on each accelerator device and 2) decouples every batch in CL to save memory. The combination of both approaches enables a larger negative sample space, allowing contrastive learning models to achieve better performance. As a result, we demonstrate the effectiveness of large batch sizes and reduce memory usage by up to 43% in our experiments. By incorporating our learning method, a contrastive learning model can be trained with a larger negative sample space, improving its performance without any change to the model structure.
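The thesis fulltext is restricted, but the memory-saving idea sketched in the abstract (processing a contrastive batch in smaller decoupled pieces rather than all at once) can be illustrated with a minimal example. The function name, chunking scheme, and use of a running log-sum-exp are my own illustration, not the DUCL algorithm from the thesis: the InfoNCE-style loss over many negatives is accumulated chunk by chunk, so peak memory scales with the chunk size instead of the full negative-sample count, while the result matches the all-at-once computation.

```python
import math

def info_nce_chunked(pos_sim, neg_sims, chunk_size, temperature=0.5):
    """InfoNCE-style loss with negatives processed in chunks.

    Accumulates the partition function with a running log-sum-exp,
    so only one chunk of negative similarities is held at a time.
    Returns -log( exp(pos/t) / (exp(pos/t) + sum_i exp(neg_i/t)) ).
    """
    # Running log-sum-exp over scaled similarities, seeded with the positive.
    running = pos_sim / temperature
    for start in range(0, len(neg_sims), chunk_size):
        chunk = neg_sims[start:start + chunk_size]  # decoupled piece of the batch
        for s in chunk:
            s = s / temperature
            hi, lo = max(running, s), min(running, s)
            # Numerically stable pairwise log-sum-exp update.
            running = hi + math.log1p(math.exp(lo - hi))
    # loss = log(partition) - scaled positive similarity
    return running - pos_sim / temperature
```

Because log-sum-exp is associative, the chunked accumulation is mathematically identical to computing the loss over the full negative set in one pass; the same pattern underlies gradient-accumulation-style tricks that trade compute passes for memory.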
Schools: School of Electrical and Electronic Engineering 
Fulltext Permission: restricted
Fulltext Availability: With Fulltext
Appears in Collections:EEE Theses

Files in This Item:
File: Restricted Access (2.55 MB, Adobe PDF)

Page view(s): Updated on Apr 19, 2024



Items in DR-NTU are protected by copyright, with all rights reserved, unless otherwise indicated.