On the transferability of pre-trained language models for low-resource programming languages
A recent study by Ahmed and Devanbu reported that fine-tuning multilingual Pre-trained Language Models (PLMs) on a corpus of code written in multiple programming languages achieves higher performance than fine-tuning on a corpus of code written in just one programming language. However, no analysis was m...
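As a rough illustration of the kind of fine-tuning the abstract refers to (not taken from the paper itself), the following is a minimal sketch of fine-tuning a multilingual code PLM on a small corpus in a low-resource language. The checkpoint `microsoft/codebert-base`, the Hugging Face Transformers/Datasets APIs, the toy Ruby snippets, and the output directory name are assumptions for illustration; the paper's actual models, datasets, and training setup may differ.

```python
# A minimal sketch (assumptions, not the paper's setup) of fine-tuning a
# multilingual code PLM with a masked-language-modelling objective on a
# small corpus of a low-resource programming language.
from datasets import Dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Hypothetical toy corpus of Ruby snippets standing in for a low-resource language.
corpus = [
    'def greet(name)\n  puts "Hello, #{name}"\nend',
    "class Stack\n  def initialize\n    @items = []\n  end\nend",
]

checkpoint = "microsoft/codebert-base"  # assumed multilingual code PLM
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Tokenize the corpus; the collator below applies random masking on the fly.
dataset = Dataset.from_dict({"text": corpus}).map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=256),
    batched=True,
    remove_columns=["text"],
)

trainer = Trainer(
    model=model,
    args=TrainingArguments(
        output_dir="plm-lowresource-finetune",  # hypothetical output path
        num_train_epochs=1,
        per_device_train_batch_size=2,
    ),
    train_dataset=dataset,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15),
)
trainer.train()
```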
Main Authors: | CHEN, Fuxiang; FARD, Fatemeh H.; LO, David; BRYKSIN, Timofey |
---|---|
Format: | text |
Language: | English |
Published: | Institutional Knowledge at Singapore Management University, 2022 |
Online Access: | https://ink.library.smu.edu.sg/sis_research/7693 https://ink.library.smu.edu.sg/context/sis_research/article/8696/viewcontent/On_the_transfer.pdf |
Institution: | Singapore Management University |
Similar Items
- An exploratory study on code attention in BERT
  by: SHARMA, Rishab, et al.
  Published: (2022)
- On the usage of continual learning for out-of-distribution generalization in pre-trained language models of code
  by: WEYSSOW, Martin, et al.
  Published: (2023)
- Injecting descriptive meta-information into pre-trained language models with hypernetworks
  by: DUAN, Wenying, et al.
  Published: (2021)
- Using pre-trained models for vision-language understanding tasks
  by: CAO, Rui
  Published: (2024)
- Augmenting low-resource text classification with graph-grounded pre-training and prompting
  by: WEN, Zhihao, et al.
  Published: (2023)