Unveiling code pre-trained models: Investigating syntax and semantics capacities
Code models have made significant advancements in code intelligence by encoding knowledge about programming languages. While previous studies have explored the capabilities of these models in learning code syntax, there has been limited investigation into their ability to understand code semantics. Ad...
Main Authors: MA, Wei; LIU, Shangqing; ZHAO, Mengjie; XIE, Xiaofei; WANG, Wenhang; HU, Qiang; ZHANG, Jie; YANG, Liu
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2024
Online Access: https://ink.library.smu.edu.sg/sis_research/9092
https://ink.library.smu.edu.sg/context/sis_research/article/10095/viewcontent/3664606.pdf
Institution: Singapore Management University
Similar Items
- Does semantics aid syntax? An empirical study on named entity recognition and classification
  by: Zhong, Xiaoshi, et al.
  Published: (2022)
- A UTP semantics for communicating processes with shared variables and its formal encoding in PVS
  by: SHI, Ling, et al.
  Published: (2018)
- The semantics and grammar of Vietnamese classifiers
  by: SIM SOOK HUI
  Published: (2010)
- Learning program semantics with code representations: An empirical study
  by: SIOW, Jing Kai, et al.
  Published: (2022)
- GraphCode2Vec: Generic code embedding via lexical and program dependence analyses
  by: MA, Wei, et al.
  Published: (2022)