An exploratory study on code attention in BERT
Many recent approaches in software engineering introduce deep neural models based on the Transformer architecture or use transformer-based pre-trained language models (PLMs) trained on code. Although these models achieve state-of-the-art results in many downstream tasks such as code summarization an...
Main Authors: SHARMA, Rishab; CHEN, Fuxiang; FARD, Fatemeh H.; LO, David
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
Online Access: https://ink.library.smu.edu.sg/sis_research/7694 ; https://ink.library.smu.edu.sg/context/sis_research/article/8697/viewcontent/An_Exploratory.pdf
Institution: Singapore Management University
Similar Items
- Assessing generalizability of CodeBERT
  by: ZHOU, Xin, et al. Published: (2021)
- Knowledge-based BERT word embedding fine-tuning for emotion recognition
  by: Zhu, Zixiao, et al. Published: (2023)
- On the transferability of pre-trained language models for low-resource programming languages
  by: CHEN, Fuxiang, et al. Published: (2022)
- VulCurator: a vulnerability-fixing commit detector
  by: NGUYEN, Truong Giang, et al. Published: (2022)
- Social media buzz and new product adoption
  by: VIVEK SUNDAR MAGESH. Published: (2021)