On the usage of continual learning for out-of-distribution generalization in pre-trained language models of code
Pre-trained language models (PLMs) have become a prevalent technique in deep learning for code, utilizing a two-stage pre-training and fine-tuning procedure to acquire general knowledge about code and specialize in a variety of downstream tasks. However, the dynamic nature of software codebases poses...
Main Authors:
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2023
Subjects:
Online Access: https://ink.library.smu.edu.sg/sis_research/8574
https://ink.library.smu.edu.sg/context/sis_research/article/9577/viewcontent/On_the_Usage_of_Continual_Learning_for_Out_of_Distribution_Generalization_in_Pre_trained_Language_Models_of_Code.pdf
Institution: Singapore Management University