ContraBERT: Enhancing code pre-trained models via contrastive learning
Large-scale pre-trained models such as CodeBERT and GraphCodeBERT have earned widespread attention from both academia and industry. Owing to their superior ability in code representation, they have been further applied to multiple downstream tasks such as clone detection, code search, and code transl...
Main Authors: LIU, Shangqing; WU, Bozhi; XIE, Xiaofei; MENG, Guozhu; LIU, Yang
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2023
Online Access:
https://ink.library.smu.edu.sg/sis_research/8228
https://ink.library.smu.edu.sg/context/sis_research/article/9231/viewcontent/2301.09072.pdf
Institution: Singapore Management University
Similar Items
- Interpreting CodeBERT for semantic code clone detection
  by: ABID, Shamsa, et al.
  Published: (2023)
- GraphSearchNet: Enhancing GNNs via capturing global dependencies for semantic code search
  by: LIU, Shangqing, et al.
  Published: (2023)
- An exploratory study on code attention in BERT
  by: SHARMA, Rishab, et al.
  Published: (2022)
- Unveiling code pre-trained models: Investigating syntax and semantics capacities
  by: MA, Wei, et al.
  Published: (2024)
- FT2Ra: A fine-tuning-inspired approach to retrieval-augmented code completion
  by: GUO, Qi, et al.
  Published: (2024)