PrefAce: face-centric pretraining with self-structure aware distillation
Video-based facial analysis is important for autonomous agents to understand human expressions and sentiments. However, limited labeled data is available to learn effective facial representations. This paper proposes a novel self-supervised face-centric pretraining framework, called PrefAce, which l...
| Main Author: | Hu, Siyuan |
| --- | --- |
| Other Authors: | Ong Yew Soon |
| Format: | Final Year Project |
| Language: | English |
| Published: | Nanyang Technological University, 2024 |
| Online Access: | https://hdl.handle.net/10356/175280 |
| Institution: | Nanyang Technological University |
Similar Items
- Human-centric AI security, by: Ling, Shahrul Al-Nizam. Published: (2020)
- Modelling self-awareness in social robot, by: Zhang, Jiaheng. Published: (2020)
- Distillation and self-training in lane detection, by: Ngo, Jia Wei. Published: (2020)
- Improving neural machine translation: data centric approaches, by: Nguyen, Xuan Phi. Published: (2023)
- The effect of softmax temperature on recent knowledge distillation algorithms, by: Poh, Dominique. Published: (2023)