Continual learning via inter-task synaptic mapping
Main Authors:
Format: Article
Language: English
Published: 2022
Online Access: https://hdl.handle.net/10356/160691
Institution: Nanyang Technological University
Summary: Learning from streaming tasks leads a model to catastrophically erase the unique experiences it absorbed from previous episodes. While regularization techniques such as LwF, SI, and EWC have proven to be an effective avenue to overcome this issue by constraining important parameters of old tasks from changing when accepting new concepts, these approaches do not exploit the common information of each task that can be shared with existing neurons. As a result, they do not scale well to large-scale problems, since the parameter importance variables quickly explode. An Inter-Task Synaptic Mapping (ISYANA) is proposed here to underpin knowledge retention for continual learning. ISYANA combines the task-to-neuron relationship with the concept-to-concept relationship so that a neuron is prevented from embracing distinct concepts and accepts only relevant ones. A numerical study on benchmark continual learning problems has been carried out, followed by a comparison against prominent continual learning algorithms. ISYANA exhibits competitive performance compared to the state of the art. The code for ISYANA is available at https://github.com/ContinualAL/ISYANAKBS.
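To make the task-to-neuron idea in the abstract concrete, below is a minimal illustrative sketch, not the authors' method: it shows one plausible way to discourage a hidden neuron from serving two tasks at once by penalizing overlap between per-task activation profiles. The network `ToyNet`, the functions `task_relevance` and `isyana_penalty`, and the weight `lam` are all hypothetical names invented for this example; the actual ISYANA implementation is in the repository linked above.

```python
# Illustrative sketch only: penalize neurons that are strongly active for
# both an old and a new task, nudging new concepts toward less-used neurons.
# This is NOT the published ISYANA algorithm; see the authors' repository.
import torch
import torch.nn as nn

class ToyNet(nn.Module):
    """A two-layer MLP that also exposes its hidden activations."""
    def __init__(self, in_dim=784, hidden=100, out_dim=10):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.fc2 = nn.Linear(hidden, out_dim)

    def forward(self, x):
        h = torch.relu(self.fc1(x))
        return self.fc2(h), h

def task_relevance(hidden_acts):
    # Mean absolute activation per hidden neuron over one task's batch:
    # a crude proxy for how strongly each neuron participates in that task.
    return hidden_acts.abs().mean(dim=0)

def isyana_penalty(relevance_new, relevance_old, lam=1.0):
    # Overlap penalty (hypothetical): large only when the same neuron is
    # highly relevant to both the old task and the new one.
    return lam * (relevance_new * relevance_old).sum()

# Usage sketch: after finishing task 1, store its relevance profile; while
# training task 2, add the overlap penalty to the ordinary task loss.
net = ToyNet()
x_old = torch.randn(32, 784)
with torch.no_grad():
    _, h_old = net(x_old)
    rel_old = task_relevance(h_old)

x_new, y_new = torch.randn(32, 784), torch.randint(0, 10, (32,))
logits, h_new = net(x_new)
loss = nn.functional.cross_entropy(logits, y_new) \
       + isyana_penalty(task_relevance(h_new), rel_old, lam=0.1)
loss.backward()
```

Unlike EWC-style regularizers, which keep one importance variable per parameter (the scaling issue the abstract mentions), a per-neuron overlap term like the sketch above only grows with the number of hidden units, which is one reading of why a task-to-neuron mapping can scale better.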