Preventing catastrophic forgetting in continual learning
Continual learning in neural networks has received increasing interest as machine learning becomes prevalent across a growing number of industries. Catastrophic forgetting, in which a model forgets previously learned tasks upon learning new ones, remains a major roadblock in allowing neural netwo...
Main Author: Ong, Yi Shen
Other Authors: Lin, Guosheng
Format: Final Year Project
Language: English
Published: Nanyang Technological University, 2022
Subjects:
Online Access: https://hdl.handle.net/10356/162924
Institution: Nanyang Technological University
Similar Items
- Overcoming catastrophic forgetting through replay in continual learning
  by: Qiao, Zhongzheng
  Published: (2021)
- Preventing Catastrophic Forgetting and Distribution Mismatch in Knowledge Distillation via Synthetic Data
  by: Kuluhan Binici, et al.
  Published: (2022)
- Learning to forget in an online fuzzy neural network using dynamic forgetting window
  by: Tan, Benjamin Kok Loong
  Published: (2013)
- Parking but not forgetting with one button-click
  by: Ha, Robin Zhi Wei
  Published: (2012)
- Neural modeling of episodic memory: encoding, retrieval, and forgetting
  by: Wang, Wenwen, et al.
  Published: (2013)