Knowledge Distillation with Relative Representations for Image Representation Learning
Relative representations allow the alignment of latent spaces that embed data in extrinsically different ways but preserve similar relative distances between data points. The ability to compare different latent spaces for the same input lends itself naturally to knowledge distillation. We explore...
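The record does not reproduce the method's details, but as a rough illustration of the idea in the abstract, the sketch below uses the standard relative-representation construction: each embedding is re-expressed as its cosine similarities to a fixed set of anchor embeddings. The toy rotated-space setup, variable names, and the MSE distillation loss are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def relative_representation(embeddings, anchors):
    """Re-express embeddings as cosine similarities to anchor embeddings.

    embeddings: (N, D) array from one encoder
    anchors:    (K, D) embeddings of K shared anchor inputs, same encoder
    returns:    (N, K) relative representation
    """
    e = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    a = anchors / np.linalg.norm(anchors, axis=1, keepdims=True)
    return e @ a.T

# Two "extrinsically different" latent spaces: the second is a random
# orthogonal rotation of the first, so absolute coordinates differ but
# relative distances (angles) between points are preserved.
rng = np.random.default_rng(0)
teacher = rng.normal(size=(8, 16))              # toy teacher embeddings
rotation, _ = np.linalg.qr(rng.normal(size=(16, 16)))
student = teacher @ rotation                    # same geometry, new coordinates

anchor_idx = [0, 1, 2, 3]                       # shared anchor inputs
rel_teacher = relative_representation(teacher, teacher[anchor_idx])
rel_student = relative_representation(student, student[anchor_idx])

# The relative representations align even though the raw embeddings do not,
# so a distillation loss between them is meaningful across the two spaces.
print(np.abs(rel_teacher - rel_student).max())  # effectively zero
distill_loss = np.mean((rel_teacher - rel_student) ** 2)
```

Because the comparison happens in the shared anchor-similarity space, a student network under this scheme can be trained against a teacher whose latent space has a different basis or even a different dimensionality, as long as both encode the same anchor inputs.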
| Main Authors: | Ramos, Patrick; Alampay, Raphael; Abu, Patricia Angela R. |
|---|---|
| Format: | text |
| Published: | Archīum Ateneo, 2023 |
| Online Access: | https://archium.ateneo.edu/discs-faculty-pubs/387 https://doi.org/10.1007/978-3-031-41630-9_14 |
| Institution: | Ateneo De Manila University |
Similar Items

- Computation-efficient knowledge distillation via uncertainty-aware mixup
  by: Xu, Guodong, et al.
  Published: (2023)
- Visual-to-EEG cross-modal knowledge distillation for continuous emotion recognition
  by: Zhang, Su, et al.
  Published: (2022)
- Edge-computing-based knowledge distillation and multitask learning for partial discharge recognition
  by: Ji, Jinsheng, et al.
  Published: (2024)
- One-class knowledge distillation for face presentation attack detection
  by: Li, Zhi, et al.
  Published: (2023)
- Discriminator-enhanced knowledge-distillation networks
  by: Li, Zhenping, et al.
  Published: (2023)