The effect of softmax temperature on recent knowledge distillation algorithms

Knowledge distillation is a technique for transferring knowledge from a large, complex teacher model to a smaller, faster student model, and it is an important category of model compression methods. In this study, I survey various knowledge distillation algorithms that have been proposed in re...
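For context, the softmax temperature named in the title enters through the classic temperature-scaled distillation loss (Hinton et al., 2015). The sketch below is illustrative only and is not drawn from the project itself; the function name, temperature T, and weighting alpha are assumptions chosen for illustration.

```python
# Illustrative sketch of the standard temperature-scaled distillation loss
# (Hinton et al., 2015). Names and default values are assumptions, not
# taken from the project being described.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend soft-target KL loss (scaled by T^2) with hard-label cross-entropy."""
    # Soften both distributions with temperature T before comparing them.
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # T^2 keeps gradient magnitudes comparable across temperatures
    # Standard cross-entropy against the ground-truth labels.
    hard_loss = F.cross_entropy(student_logits, labels)
    return alpha * soft_loss + (1.0 - alpha) * hard_loss
```

A higher temperature flattens the teacher's output distribution, exposing more of its "dark knowledge" about relative class similarities, which is the lever the surveyed algorithms tune.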

Bibliographic Details
Main Author: Poh, Dominique
Other Authors: Weichen Liu
Format: Final Year Project
Language: English
Published: Nanyang Technological University 2023
Subjects:
Online Access:https://hdl.handle.net/10356/172431
Institution: Nanyang Technological University