Wakening past concepts without past data: Class-incremental learning from online placebos

Not forgetting old class knowledge is a key challenge for class-incremental learning (CIL) when the model continuously adapts to new classes. A common technique to address this is knowledge distillation (KD), which penalizes prediction inconsistencies between old and new models. Such prediction is m...
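As a rough illustration of the distillation penalty the abstract refers to, here is a minimal sketch of a standard KD loss in PyTorch. The function name, temperature value, and T^2 scaling follow common KD practice (Hinton-style distillation), not this paper's specific formulation:

```python
import torch
import torch.nn.functional as F

def kd_loss(old_logits: torch.Tensor, new_logits: torch.Tensor,
            temperature: float = 2.0) -> torch.Tensor:
    """Penalize prediction inconsistencies between the old and new models."""
    # Soften both models' predictions with a temperature.
    old_probs = F.softmax(old_logits / temperature, dim=1)
    new_log_probs = F.log_softmax(new_logits / temperature, dim=1)
    # KL divergence between the frozen old model's outputs (target)
    # and the new model's outputs, scaled by T^2 as is conventional.
    return F.kl_div(new_log_probs, old_probs,
                    reduction="batchmean") * temperature ** 2
```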

Description

Bibliographic Details
Main Authors: LIU, Yaoyao, LI, Yingying, SCHIELE, Bernt, SUN, Qianru
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2024
Online Access: https://ink.library.smu.edu.sg/sis_research/9207
https://ink.library.smu.edu.sg/context/sis_research/article/10212/viewcontent/Liu_Wakening_Past_Concepts_Without_Past_Data_Class_Incremental_Learning_From_Online_WACV_2024_paper.pdf