Mnemonics training: Multi-class incremental learning without forgetting

Multi-Class Incremental Learning (MCIL) aims to learn new concepts by incrementally updating a model trained on previous concepts. However, there is an inherent trade-off between effectively learning new concepts and avoiding catastrophic forgetting of previous ones. To alleviate this issue, it has been proposed to keep around a few examples of the previous concepts, but the effectiveness of this approach depends heavily on how representative these examples are. This paper proposes a novel and automatic framework, called mnemonics, in which exemplars are parameterized and made optimizable in an end-to-end manner. The framework is trained through bilevel optimization, i.e., at the model level and the exemplar level. Extensive experiments on three MCIL benchmarks (CIFAR-100, ImageNet-Subset, and ImageNet) show that mnemonics exemplars surpass the state-of-the-art by a large margin. Intriguingly, the mnemonics exemplars tend to lie on the boundaries between different classes.
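The core idea, making the exemplars themselves the variables of an outer optimization, can be illustrated with a short sketch. The following is a minimal toy example, not the authors' implementation: it assumes a linear classifier over fixed feature vectors and a single differentiable inner gradient step, whereas the paper works with image exemplars and a deep network. All tensor shapes, learning rates, and variable names below are illustrative assumptions.

```python
# Minimal sketch of exemplar-level bilevel optimization in the spirit of
# mnemonics training. Toy setup: a linear head over random "features";
# not the paper's exact procedure.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
num_classes, feat_dim, per_class = 5, 32, 2

# Real data of the old classes (20 samples per class, in feature space).
real_feats = torch.randn(num_classes * 20, feat_dim)
real_labels = torch.arange(num_classes).repeat_interleave(20)

# Exemplars are parameterized directly and initialized from real samples
# (two per class), then treated as trainable tensors.
init_idx = torch.arange(0, num_classes * 20, 20 // per_class)
exemplars = real_feats[init_idx].clone().requires_grad_(True)
ex_labels = real_labels[init_idx].clone()

# Toy classifier weights: the "model level" of the bilevel problem.
W = torch.zeros(num_classes, feat_dim, requires_grad=True)

inner_lr, outer_lr = 0.5, 0.1
opt_ex = torch.optim.Adam([exemplars], lr=outer_lr)

for step in range(100):
    # Inner (model-level) step: adapt the model on the current exemplars,
    # keeping the graph so gradients can flow back to the exemplars.
    inner_loss = F.cross_entropy(exemplars @ W.t(), ex_labels)
    (grad_W,) = torch.autograd.grad(inner_loss, W, create_graph=True)
    W_adapted = W - inner_lr * grad_W

    # Outer (exemplar-level) step: the adapted model should classify the
    # *real* old-class data well; minimize this loss w.r.t. the exemplars.
    outer_loss = F.cross_entropy(real_feats @ W_adapted.t(), real_labels)
    opt_ex.zero_grad()
    outer_loss.backward()
    opt_ex.step()
```

The point the sketch preserves is that the outer loss is evaluated on real data after a differentiable model update on the exemplars, so gradients reach the exemplars through the update itself. This pressure toward maximal informativeness is plausibly what pushes the learned exemplars toward class boundaries, as the paper observes.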

Bibliographic Details
Main Authors: LIU, Yaoyao; SU, Yuting; LIU, An-An; SCHIELE, Bernt; SUN, Qianru
Format: text
Language: English
Published: Institutional Knowledge at Singapore Management University, 2020
Subjects: Databases and Information Systems; Graphics and Human Computer Interfaces
DOI: 10.1109/CVPR42600.2020.01226
License: CC BY-NC-ND 4.0 (http://creativecommons.org/licenses/by-nc-nd/4.0/)
Collection: Research Collection School Of Computing and Information Systems
Online Access:https://ink.library.smu.edu.sg/sis_research/5593
https://ink.library.smu.edu.sg/context/sis_research/article/6596/viewcontent/Liu_Mnemonics_Training_Multi_Class_Incremental_Learning_Without_Forgetting_CVPR_2020_paper.pdf
Institution: Singapore Management University