Wakening past concepts without past data: Class-incremental learning from online placebos

Not forgetting old class knowledge is a key challenge for class-incremental learning (CIL) when the model continuously adapts to new classes. A common technique to address this is knowledge distillation (KD), which penalizes prediction inconsistencies between old and new models. Such prediction is m...
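The knowledge distillation penalty the abstract refers to is commonly implemented as a divergence between the temperature-softened predictions of the old (frozen) model and the new model. As a rough illustration only (not the authors' method; the function names, the temperature value, and the use of a KL divergence here are assumptions), such a loss can be sketched as:

```python
import numpy as np

def softened_softmax(logits, temperature=2.0):
    """Softmax over logits divided by a temperature (assumed T=2.0 here)."""
    z = np.asarray(logits, dtype=float) / temperature
    e = np.exp(z - z.max(axis=-1, keepdims=True))  # shift for numerical stability
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(old_logits, new_logits, temperature=2.0):
    """KL divergence penalizing the new model's predictions for
    drifting from the frozen old model's predictions."""
    p_old = softened_softmax(old_logits, temperature)
    p_new = softened_softmax(new_logits, temperature)
    return float(np.mean(np.sum(p_old * (np.log(p_old) - np.log(p_new)), axis=-1)))
```

When the new model's logits match the old model's exactly, the loss is zero; any drift on old-class predictions increases it, which is the "penalize prediction inconsistencies" behavior described above.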

Bibliographic Details
Main Authors: LIU, Yaoyao, LI, Yingying, SCHIELE, Bernt, SUN, Qianru
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University 2024
Subjects:
Online Access:https://ink.library.smu.edu.sg/sis_research/9207
https://ink.library.smu.edu.sg/context/sis_research/article/10212/viewcontent/Liu_Wakening_Past_Concepts_Without_Past_Data_Class_Incremental_Learning_From_Online_WACV_2024_paper.pdf
Institution: Singapore Management University