Co-advise: Cross inductive bias distillation
The inductive bias of vision transformers is more relaxed, so they cannot work well with insufficient data. Knowledge distillation is thus introduced to assist the training of transformers. Unlike previous works, where only heavy convolution-based teachers are provided, in this paper we delve into th...
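As a rough illustration of the distillation setup the abstract describes, below is a minimal PyTorch sketch of a student loss that combines hard-label cross-entropy with soft distillation against several teachers of different inductive biases. This is an assumption-laden sketch, not the paper's exact objective: the function name co_advise_loss, the temperature tau, and the mixing weight alpha are all illustrative, and the published method involves additional components beyond a plain averaged KL term.

```python
import torch
import torch.nn.functional as F

def co_advise_loss(student_logits, teacher_logits_list, labels,
                   tau=3.0, alpha=0.5):
    """Hypothetical distillation loss: supervised cross-entropy plus
    soft KL terms against multiple teachers with different inductive
    biases (e.g., a convolutional teacher and a non-convolutional one).
    Names and hyperparameters are illustrative, not from the paper."""
    # Supervised term on ground-truth labels.
    ce = F.cross_entropy(student_logits, labels)

    # Soft distillation: match the student's temperature-softened
    # distribution to each teacher's, averaged over teachers.
    kd = 0.0
    for t_logits in teacher_logits_list:
        kd = kd + F.kl_div(
            F.log_softmax(student_logits / tau, dim=-1),
            F.softmax(t_logits / tau, dim=-1),
            reduction="batchmean",
        ) * tau * tau  # standard temperature-squared scaling
    kd = kd / len(teacher_logits_list)

    return alpha * ce + (1.0 - alpha) * kd
```

The point of averaging over teachers with distinct inductive biases, rather than distilling from a single heavy convolutional teacher, is that the student receives complementary supervisory signals; how those signals are routed to the student is where the paper's actual contribution lies.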
Main Authors: REN, Sucheng; GAO, Zhengqi; HUA, Tianyu; XUE, Zihui; TIAN, Yonglong; HE, Shengfeng; ZHAO, Hang
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2022
Online Access: https://ink.library.smu.edu.sg/sis_research/8538
https://ink.library.smu.edu.sg/context/sis_research/article/9541/viewcontent/Co_Advise__Cross_Inductive_Bias_Distillation.pdf
Institution: Singapore Management University
Similar Items
- Multimodal distillation for egocentric video understanding
  by: Peng, Han
  Published: (2024)
- Privacy Risks of Securing Machine Learning Models against Adversarial Examples
  by: Liwei Song, et al.
  Published: (2020)
- Bias field poses a threat to DNN-based X-ray recognition
  by: TIAN, Bingyu, et al.
  Published: (2021)
- Solar Power Integration in Water (H2O) Distillation (SPIN-HD)
  by: Carlos, Charlize F., et al.
  Published: (2021)
- On-the-fly knowledge distillation model for sentence embedding
  by: Zhu, Xuchun
  Published: (2024)