Towards a smaller student: Capacity dynamic distillation for efficient image retrieval
Previous Knowledge Distillation-based efficient image retrieval methods employ a lightweight network as the student model for fast inference. However, the lightweight student model lacks adequate representation capacity for effective knowledge imitation during the most critical early training period...
Main Authors: XIE, Yi; ZHANG, Huaidong; XU, Xuemiao; ZHU, Jianqing; HE, Shengfeng
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2023
Online Access: https://ink.library.smu.edu.sg/sis_research/8448
https://ink.library.smu.edu.sg/context/sis_research/article/9451/viewcontent/TowardsSmallerStudent_IR_av_cc_by.pdf
Institution: Singapore Management University
Similar Items
- Differentiated learning for multi-modal domain adaptation
  by: LV, Jianming, et al.
  Published: (2021)
- D3still: Decoupled differential distillation for asymmetric image retrieval
  by: XIE, Yi, et al.
  Published: (2024)
- Contextual-assisted scratched photo restoration
  by: CAI, Weiwei, et al.
  Published: (2023)
- Efficient near-duplicate keyframe retrieval with visual language models
  by: WU, Xiao, et al.
  Published: (2007)
- Iterative graph self-distillation
  by: ZHANG, Hanlin, et al.
  Published: (2024)
- Deep-based ingredient recognition for cooking recipe retrieval
  by: CHEN, Jingjing, et al.
  Published: (2016)