Wakening Past Concepts without Past Data: Class-Incremental Learning from Online Placebos
Not forgetting old-class knowledge is a key challenge for class-incremental learning (CIL) when the model continuously adapts to new classes. A common technique to address this is knowledge distillation (KD), which penalizes prediction inconsistencies between the old and new models. Such prediction is m…
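The abstract describes KD as penalizing prediction inconsistencies between the old and new models. Below is a minimal sketch of such a distillation penalty, assuming a PyTorch setting with temperature-softened logits; the function name, arguments, and temperature value are illustrative and not taken from the paper:

```python
import torch
import torch.nn.functional as F

def kd_loss(new_logits: torch.Tensor,
            old_logits: torch.Tensor,
            temperature: float = 2.0) -> torch.Tensor:
    """Penalize prediction inconsistency between new and old models.

    new_logits: outputs of the model being trained, restricted to
                the old-class columns (illustrative assumption).
    old_logits: outputs of the frozen old model on the same inputs.
    """
    # Soften both distributions with a temperature, then measure the
    # KL divergence of the new model's predictions from the old one's.
    log_p_new = F.log_softmax(new_logits / temperature, dim=1)
    p_old = F.softmax(old_logits / temperature, dim=1)
    # Scale by T^2, as is standard in distillation losses.
    return F.kl_div(log_p_new, p_old, reduction="batchmean") * temperature ** 2
```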
Saved in:
Main Authors:
Format: text
Language: English
Published in: Institutional Knowledge at Singapore Management University, 2024
Subjects:
Online Access: https://ink.library.smu.edu.sg/sis_research/9207
https://ink.library.smu.edu.sg/context/sis_research/article/10212/viewcontent/Liu_Wakening_Past_Concepts_Without_Past_Data_Class_Incremental_Learning_From_Online_WACV_2024_paper.pdf
Institution: Singapore Management University