Knowledge distillation in computer vision models
Knowledge distillation has gained significant popularity in the Vision Transformer (ViT) space as a powerful approach to improving the efficiency of small, lightweight models. Knowledge distillation enables a larger, more complex "teacher" model to relay its knowledge to a smaller "student" model. Th...
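The teacher-to-student transfer described above is most commonly implemented with a soft-target loss: the teacher's temperature-softened output distribution supervises the student via a KL-divergence term (as in Hinton et al.'s formulation). The sketch below illustrates that generic loss in pure Python; it is an assumption for illustration and not necessarily the specific distillation scheme used in this project.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits.
    Higher T produces a softer (more uniform) distribution."""
    scaled = [z / T for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence from the teacher's soft targets to the student's
    soft predictions, scaled by T^2 so gradient magnitudes stay
    comparable across temperatures (Hinton et al.)."""
    p = softmax(teacher_logits, T)  # teacher soft targets
    q = softmax(student_logits, T)  # student soft predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)
    return (T ** 2) * kl
```

In practice this term is combined with the ordinary cross-entropy on ground-truth labels, weighted by a hyperparameter; when the student matches the teacher exactly, the distillation term is zero.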
Saved in:
Main Author: Yeoh, Yu Shyan
Other Authors: Lin Guosheng
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2024
Online Access: https://hdl.handle.net/10356/181128
Institution: Nanyang Technological University
Similar Items
- On-the-fly knowledge distillation model for sentence embedding
  by: Zhu, Xuchun
  Published: (2024)
- Computation-efficient knowledge distillation via uncertainty-aware mixup
  by: Xu, Guodong, et al.
  Published: (2023)
- TraVis: Web-based vehicle classification and counter using computer vision
  by: Aguirre, Nino Byron F., et al.
  Published: (2015)
- Boosting knowledge distillation and interpretability
  by: Song, Huan
  Published: (2021)
- Two-stage edge-side fault diagnosis method based on double knowledge distillation
  by: Yang, Yang, et al.
  Published: (2024)