The effect of softmax temperature on recent knowledge distillation algorithms
Knowledge distillation is a technique for transferring the knowledge of a large, complex teacher model to a smaller, faster student model, and is an important category of model compression methods. In this study, I survey various knowledge distillation algorithms that have been proposed in re...
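As background for the topic named in the title, a minimal sketch of the temperature-scaled softmax and the classic distillation loss (Hinton et al.'s KL-divergence formulation) might look as follows; the choice of temperature `T = 4.0` is an illustrative assumption, not a value taken from this project.

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax: a higher T softens the distribution,
    # exposing the teacher's "dark knowledge" about non-target classes.
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def kd_loss(teacher_logits, student_logits, T=4.0):
    # KL divergence between the softened teacher and student distributions,
    # scaled by T^2 so gradient magnitudes stay comparable across temperatures.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return T * T * kl
```

In practice this distillation term is combined with the ordinary cross-entropy loss on the ground-truth labels, weighted by a mixing coefficient.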
Main Author: Poh, Dominique
Other Authors: Weichen Liu
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2023
Subjects:
Online Access: https://hdl.handle.net/10356/172431
Institution: Nanyang Technological University
Similar Items
- Knowledge representation for agent negotiation in electronic commerce
  by: Chi, Kok Poh.
  Published: (2008)
- Dynamic knowledge graph embedding
  by: Teo, Eugene Yu-jie
  Published: (2021)
- The use of knowledge graph for recommendation explanation
  by: Yap, Desmond Qing Yang
  Published: (2023)
- An empirical study of the inherent resistance of knowledge distillation based federated learning to targeted poisoning attacks
  by: He, Weiyang, et al.
  Published: (2024)
- A knowledge graph based survey system for Ikigai
  by: Lua, Emily Jia Ning
  Published: (2022)