Knowledge distillation in computer vision models
Knowledge distillation has gained significant popularity in the Vision Transformer (ViT) space as a powerful approach to improving the efficiency of small, lightweight models. It enables a larger, more complex “teacher” model to relay its knowledge to a smaller “student” model. Th...
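The teacher–student transfer described in the abstract is commonly implemented as a blended training loss. The sketch below shows a generic, Hinton-style soft-label distillation loss in PyTorch; it is not the specific method of this project, and the temperature `T` and weighting `alpha` are assumed hyperparameters chosen for illustration.

```python
# Minimal sketch of a soft-label knowledge-distillation loss (Hinton-style).
# Assumptions: T (temperature) and alpha (hard/soft weighting) are illustrative.
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels, T=4.0, alpha=0.5):
    """Blend ground-truth cross-entropy with a KL term that matches the
    student's temperature-softened predictions to the teacher's."""
    # Hard-label term: standard cross-entropy against the true labels.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label term: KL divergence between temperature-scaled distributions.
    # kl_div expects log-probabilities as input and probabilities as target.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)  # rescale so gradient magnitudes stay comparable across temperatures
    return alpha * ce + (1.0 - alpha) * kd
```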
Main Author:
Other Authors:
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2024
Subjects:
Online Access: https://hdl.handle.net/10356/181128
Institution: Nanyang Technological University