ContraBERT: Enhancing code pre-trained models via contrastive learning

Large-scale pre-trained models such as CodeBERT and GraphCodeBERT have attracted widespread attention from both academia and industry. Owing to their superior ability in code representation, they have been further applied in multiple downstream tasks such as clone detection, code search and code transl...


Bibliographic Details
Main Authors: LIU, Shangqing, WU, Bozhi, XIE, Xiaofei, MENG, Guozhu, LIU, Yang.
Format: text
Language: English
Published in: Institutional Knowledge at Singapore Management University, 2023
Online Access: https://ink.library.smu.edu.sg/sis_research/8228
https://ink.library.smu.edu.sg/context/sis_research/article/9231/viewcontent/2301.09072.pdf