An exploratory study on code attention in BERT

Many recent studies in software engineering have introduced deep neural models based on the Transformer architecture or use Transformer-based Pre-trained Language Models (PLMs) trained on code. Although these models achieve state-of-the-art results in many downstream tasks such as code summarization an...

Bibliographic Details
Main Authors: SHARMA, Rishab, CHEN, Fuxiang, FARD, Fatemeh H., LO, David
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2022
Subjects:
Online Access: https://ink.library.smu.edu.sg/sis_research/7694
https://ink.library.smu.edu.sg/context/sis_research/article/8697/viewcontent/An_Exploratory.pdf
Institution: Singapore Management University

Similar Items