The effect of softmax temperature on recent knowledge distillation algorithms

Knowledge distillation is a technique for transferring knowledge from a large, complex teacher model to a smaller, faster student model, and an important category of model compression methods. In this study, I survey various knowledge distillation algorithms that have been proposed in re...
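As context for the temperature parameter the title refers to, a minimal sketch of the classic soft-target distillation loss (Hinton et al., 2015) is shown below in plain Python. The function names and toy logits are illustrative, not taken from the project itself.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax: higher T flattens the distribution."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_soft_loss(student_logits, teacher_logits, T=4.0):
    """KL(teacher || student) on temperature-softened outputs.

    The T**2 factor keeps gradient magnitudes comparable across
    temperatures, as in Hinton et al. (2015).
    """
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)  # soft student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return kl * T * T

# Identical logits give zero loss; mismatched logits give a positive loss.
print(kd_soft_loss([2.0, 1.0, 0.1], [2.0, 1.0, 0.1]))       # 0.0
print(kd_soft_loss([0.5, 1.5, 0.2], [2.0, 1.0, 0.1]) > 0)   # True
```

Raising T spreads probability mass onto non-target classes, exposing the teacher's "dark knowledge" about class similarities; recent algorithms differ mainly in how sensitive they are to this choice.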

Full description

Bibliographic Details
Main Author: Poh, Dominique
Other Authors: Weichen Liu
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2023
Subjects:
Online Access: https://hdl.handle.net/10356/172431