ContraBERT: Enhancing code pre-trained models via contrastive learning

Large-scale pre-trained models such as CodeBERT and GraphCodeBERT have attracted widespread attention from both academia and industry. Owing to their superior ability in code representation, they have been further applied to multiple downstream tasks such as clone detection, code search, and code translation...


Bibliographic Details
Main Authors: LIU, Shangqing, WU, Bozhi, XIE, Xiaofei, MENG, Guozhu, LIU, Yang.
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2023
Online Access:https://ink.library.smu.edu.sg/sis_research/8228
https://ink.library.smu.edu.sg/context/sis_research/article/9231/viewcontent/2301.09072.pdf
Institution: Singapore Management University

Similar Items