Preventing catastrophic forgetting in continual learning
Continual learning in neural networks has been receiving increased interest because machine learning is now deployed across a growing number of industries. Catastrophic forgetting, in which a model forgets old tasks upon learning new ones, remains a major roadblock to making neural networks truly lifelong learners. A series of tests was conducted on the effectiveness of buffers filled with old training data as a way of mitigating forgetting, by training on the buffered examples alongside new data. The results show that increasing the buffer size does help mitigate forgetting, at the cost of the additional storage used.
Saved in:

Main Author: | Ong, Yi Shen
---|---
Other Authors: | Lin Guosheng (School of Computer Science and Engineering)
Format: | Final Year Project (FYP)
Language: | English
Published: | Nanyang Technological University, 2022
Subjects: | Engineering::Computer science and engineering
Online Access: | https://hdl.handle.net/10356/162924
Institution: | Nanyang Technological University
Degree: | Bachelor of Engineering (Computer Science)
Project Code: | SCSE21-0626
Citation: | Ong, Y. S. (2022). Preventing catastrophic forgetting in continual learning. Final Year Project (FYP), Nanyang Technological University, Singapore. https://hdl.handle.net/10356/162924
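The abstract above describes a rehearsal strategy: a fixed-size buffer of old training examples is replayed alongside new data to mitigate forgetting. As a rough illustration of that idea, here is a minimal, hypothetical PyTorch sketch using reservoir sampling to keep the buffer bounded. The `ReplayBuffer` class, `train_step` function, the toy model, and all sizes are illustrative assumptions, not details taken from the project.

```python
# Minimal sketch of the replay-buffer (rehearsal) idea from the abstract.
# All names, sizes, and the model below are illustrative assumptions.
import random
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReplayBuffer:
    """Fixed-size memory of past (input, label) pairs, filled by reservoir
    sampling so every example seen so far has an equal chance of being kept."""
    def __init__(self, capacity):
        self.capacity = capacity
        self.data = []
        self.seen = 0

    def add(self, x, y):
        self.seen += 1
        if len(self.data) < self.capacity:
            self.data.append((x, y))
        else:
            # Reservoir sampling: keep the new example with prob capacity/seen.
            idx = random.randrange(self.seen)
            if idx < self.capacity:
                self.data[idx] = (x, y)

    def sample(self, batch_size):
        batch = random.sample(self.data, min(batch_size, len(self.data)))
        xs, ys = zip(*batch)
        return torch.stack(xs), torch.stack(ys)

def train_step(model, optimizer, buffer, x_new, y_new, replay_batch=32):
    """One update on new-task data mixed with replayed old-task data."""
    model.train()
    x, y = x_new, y_new
    if buffer.data:
        x_old, y_old = buffer.sample(replay_batch)
        x = torch.cat([x_new, x_old])  # train on old and new examples together
        y = torch.cat([y_new, y_old])
    loss = F.cross_entropy(model(x), y)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    # Store the new examples so later tasks can replay them.
    for xi, yi in zip(x_new, y_new):
        buffer.add(xi.detach(), yi.detach())
    return loss.item()

# Toy usage: a linear classifier on random data.
model = nn.Linear(20, 5)
opt = torch.optim.SGD(model.parameters(), lr=0.1)
buf = ReplayBuffer(capacity=200)
x_batch = torch.randn(16, 20)
y_batch = torch.randint(0, 5, (16,))
train_step(model, opt, buf, x_batch, y_batch)
```

Under this scheme, a larger `capacity` retains more of the old tasks' data distribution, which matches the report's finding that bigger buffers reduce forgetting at the cost of extra storage.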