Compressing pre-trained models of code into 3 MB

Although large pre-trained models of code have delivered significant advancements in various code processing tasks, there is an impediment to the wide and fluent adoption of these powerful models in software developers' daily workflow: these large models consume hundreds of megabytes of memory and …

Bibliographic Details
Main Authors: SHI, Jieke; YANG, Zhou; XU, Bowen; KANG, Hong Jin; LO, David
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2022
Online Access:https://ink.library.smu.edu.sg/sis_research/7725
https://ink.library.smu.edu.sg/context/sis_research/article/8728/viewcontent/3551349.3556964_pvoa.pdf
Institution: Singapore Management University