Wakening Past Concepts without Past Data: Class-Incremental Learning from Online Placebos
Not forgetting old class knowledge is a key challenge for class-incremental learning (CIL) when the model continuously adapts to new classes. A common technique to address this is knowledge distillation (KD), which penalizes prediction inconsistencies between old and new models. Such prediction is m...
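The knowledge distillation penalty mentioned in the abstract can be sketched as follows. This is an illustrative, self-contained implementation of temperature-softened KD, not the authors' code; the function names and the temperature value are assumptions for illustration.

```python
import math

def softmax(logits, temperature=1.0):
    # Soften logits by a temperature, then normalize to a distribution.
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(old_logits, new_logits, temperature=2.0):
    # KL(old || new) over temperature-softened predictions: this
    # penalizes the new model for drifting from the old model's
    # outputs, which is the "prediction inconsistency" penalty the
    # abstract refers to (hypothetical helper, for illustration).
    p = softmax(old_logits, temperature)
    q = softmax(new_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))

# Identical logits incur zero penalty; diverging logits are penalized.
same = distillation_loss([2.0, 1.0, 0.5], [2.0, 1.0, 0.5])
diff = distillation_loss([2.0, 1.0, 0.5], [0.5, 1.0, 2.0])
```

In practice this term is added to the classification loss on new-class data, so the new model fits new classes while staying close to the old model's behavior.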
Saved in:
Main Authors: LIU, Yaoyao; LI, Yingying; SCHIELE, Bernt; SUN, Qianru
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2024
Online Access: https://ink.library.smu.edu.sg/sis_research/9207
https://ink.library.smu.edu.sg/context/sis_research/article/10212/viewcontent/Liu_Wakening_Past_Concepts_Without_Past_Data_Class_Incremental_Learning_From_Online_WACV_2024_paper.pdf
Institution: Singapore Management University
Similar Items
- INCREMENTAL LEARNING IN NON-STATIONARY ENVIRONMENTS
  by: ABHINIT KUMAR AMBASTHA
  Published: (2023)
- Incremental evolution of classifier agents using incremental genetic algorithms
  by: ZHU FANGMING
  Published: (2010)
- Incremental learning with respect to new incoming input attributes
  by: Guan, S.-U., et al.
  Published: (2014)
- IMPROVING ATTENTION-BASED DEEP LEARNING MODELS WITH LOCALITY
  by: JIANG ZIHANG
  Published: (2023)
- Towards Robust ResNet: A Small Step but A Giant Leap
  by: Jingfeng Zhang, et al.
  Published: (2020)