Knowledge distillation in computer vision models
Knowledge distillation has gained significant popularity in the Vision Transformer (ViT) space as a powerful approach to enhancing the efficiency of small, lightweight models. Knowledge distillation enables a larger, more complex “teacher” model to relay its knowledge to a smaller “student” model. Th...
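The teacher-to-student transfer the abstract describes is commonly implemented as a soft-target loss: the student is trained to match the teacher's temperature-softened output distribution. The following is a minimal sketch of that idea (Hinton-style distillation with temperature `T`); it is not taken from the project itself, and the function names are illustrative.

```python
import math

def softmax(logits, temperature=1.0):
    # Temperature-scaled softmax: a higher temperature yields a
    # softer distribution that exposes the teacher's "dark knowledge".
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=4.0):
    # KL divergence from the student's soft predictions to the
    # teacher's soft targets, scaled by T^2 so gradient magnitudes
    # stay comparable across temperatures.
    p = softmax(teacher_logits, temperature)  # teacher soft targets
    q = softmax(student_logits, temperature)  # student predictions
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return (temperature ** 2) * kl

# A student whose logits track the teacher's incurs a lower loss
# than one that disagrees with the teacher.
teacher = [2.0, 1.0, 0.1]
student_good = [1.9, 1.1, 0.2]
student_bad = [0.1, 1.0, 2.0]
```

In practice this distillation term is combined with the ordinary cross-entropy loss on the ground-truth labels, weighted by a mixing coefficient.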
Saved in:

| Main Author: | Yeoh, Yu Shyan |
|---|---|
| Other Authors: | Lin Guosheng |
| Format: | Final Year Project |
| Language: | English |
| Published: | Nanyang Technological University, 2024 |
| Online Access: | https://hdl.handle.net/10356/181128 |
| Institution: | Nanyang Technological University |
Similar Items
- Skin beauty adviser assistant based on large language model and computer vision
  by: Jiang, Yuwei
  Published: (2025)
- On-the-fly knowledge distillation model for sentence embedding
  by: Zhu, Xuchun
  Published: (2024)
- TraVis: Web-based vehicle classification and counter using computer vision
  by: Aguirre, Nino Byron F., et al.
  Published: (2015)
- Computation-efficient knowledge distillation via uncertainty-aware mixup
  by: Xu, Guodong, et al.
  Published: (2023)
- Digital Theremin with computer vision
  by: Chua, Ryuichi
  Published: (2024)