GraphCode2Vec: Generic code embedding via lexical and program dependence analyses


Bibliographic Details
Main Authors: MA, Wei, ZHAO, Mengjie, SOREMEKUN, Ezekiel, HU, Qiang, ZHANG, Jie M., PAPADAKIS, Mike, CORDY, Maxime, XIE, Xiaofei, LE TRAON, Yves
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2022
Online Access: https://ink.library.smu.edu.sg/sis_research/8345
https://ink.library.smu.edu.sg/context/sis_research/article/9348/viewcontent/GraphCode2Vec_av.pdf
Institution: Singapore Management University
Description
Summary: Code embedding is a keystone in the application of machine learning to several Software Engineering (SE) tasks. To effectively support a plethora of SE tasks, the embedding needs to capture program syntax and semantics in a generic way. To this end, we propose the first self-supervised pre-training approach (called GraphCode2Vec) that produces task-agnostic embeddings of lexical and program dependence features. GraphCode2Vec achieves this via a synergistic combination of code analysis and Graph Neural Networks. GraphCode2Vec is generic: it allows pre-training, and it is applicable to several downstream SE tasks. We evaluate the effectiveness of GraphCode2Vec on four (4) tasks (method name prediction, solution classification, mutation testing and overfitted patch classification), and compare it with four (4) similarly generic code embedding baselines (Code2Seq, Code2Vec, CodeBERT, GraphCodeBERT) and seven (7) task-specific, learning-based methods. GraphCode2Vec is more effective than both the generic and the task-specific learning-based baselines. It is also complementary and comparable to GraphCodeBERT (a larger and more complex model). Finally, we demonstrate through a probing and ablation study that GraphCode2Vec learns lexical and program dependence features, and that self-supervised pre-training improves effectiveness.
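The record contains only the abstract, not the authors' implementation. As a rough illustration of the general idea described above (lexical node features propagated along program dependence edges by a Graph Neural Network, then pooled into one embedding per code snippet), the following is a minimal, hypothetical sketch in plain PyTorch. All names, dimensions, and the toy graph are invented for illustration; this is not GraphCode2Vec's actual architecture or its self-supervised pre-training setup.

import torch
import torch.nn as nn

class ToyGraphEmbedder(nn.Module):
    # Toy illustration only, not the paper's model: each statement node
    # carries a lexical feature vector; message passing over dependence
    # edges mixes in structural context; mean pooling over nodes yields
    # a single embedding for the code snippet.
    def __init__(self, in_dim, hid_dim, n_layers=2):
        super().__init__()
        dims = [in_dim] + [hid_dim] * n_layers
        self.layers = nn.ModuleList(
            nn.Linear(d_in, d_out) for d_in, d_out in zip(dims, dims[1:])
        )

    def forward(self, x, adj):
        # x:   (num_nodes, in_dim) lexical node features
        # adj: (num_nodes, num_nodes) row-normalized dependence adjacency
        h = x
        for layer in self.layers:
            h = torch.relu(layer(adj @ h))  # aggregate neighbors, then transform
        return h.mean(dim=0)                # pool nodes into one code vector

# Tiny invented example: 3 statements with 8-dim lexical features,
# dependence edges 0->1 and 1->2, plus self-loops.
x = torch.randn(3, 8)
adj = torch.tensor([[1., 1., 0.],
                    [0., 1., 1.],
                    [0., 0., 1.]])
adj = adj / adj.sum(dim=1, keepdim=True)   # row-normalize the adjacency
model = ToyGraphEmbedder(in_dim=8, hid_dim=16)
embedding = model(x, adj)
print(embedding.shape)                      # torch.Size([16])

In such a setup, the GNN supplies the program dependence signal while the node features supply the lexical signal; a downstream task head (for example, a classifier for mutation testing or patch classification) would then be trained on top of the pooled embedding.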