Modified-LwF method for continual learning

In this dissertation, we show that catastrophic forgetting can be mitigated with several different methods. More importantly, our method retains old tasks better by combining the original learning-without-forgetting (LwF) approach with elastic weight consolidation (EWC); the main contribution is that the merits of both EWC and LwF are brought together in a single method (Modified LwF). In addition, the upper-bound joint training, fine-tuning, EWC, and original LwF methods are evaluated by adding new tasks one by one, with particular attention paid to the training trajectories of each algorithm. All four tasks are completed, with the fourth task being far larger than the previous three.
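The abstract states that Modified LwF combines the LwF objective with the EWC weight penalty, but the thesis text is not reproduced here, so the following is only a minimal sketch of one common way such a combination can be written, not the author's actual implementation. The helper names, the PyTorch framing, the temperature `T`, and the loss weights `lambda_lwf` and `lambda_ewc` are all illustrative assumptions.

```python
# Illustrative sketch (not the thesis code): a single training objective that
# adds an LwF-style distillation term and an EWC-style quadratic penalty
# to the usual cross-entropy loss on the new task.
import torch
import torch.nn.functional as F

def distillation_loss(new_logits, frozen_logits, T=2.0):
    """LwF-style term: keep the new model's old-task outputs close to the
    soft targets produced by a frozen copy of the old model."""
    old_probs = F.softmax(frozen_logits / T, dim=1)
    new_log_probs = F.log_softmax(new_logits / T, dim=1)
    return F.kl_div(new_log_probs, old_probs, reduction="batchmean") * (T * T)

def ewc_penalty(model, fisher, old_params):
    """EWC-style term: penalise movement of parameters that carried high
    Fisher information for previously learned tasks."""
    loss = 0.0
    for name, param in model.named_parameters():
        if name in fisher:
            loss = loss + (fisher[name] * (param - old_params[name]) ** 2).sum()
    return loss

def combined_loss(new_task_logits, targets,
                  old_task_logits_new, old_task_logits_frozen,
                  model, fisher, old_params,
                  lambda_lwf=1.0, lambda_ewc=100.0):
    """One possible Modified-LwF-style objective: new-task cross-entropy
    plus weighted LwF distillation and EWC penalty terms."""
    ce = F.cross_entropy(new_task_logits, targets)
    lwf = distillation_loss(old_task_logits_new, old_task_logits_frozen)
    ewc = ewc_penalty(model, fisher, old_params)
    return ce + lambda_lwf * lwf + lambda_ewc * ewc
```

In such a setup the frozen old model's logits and the Fisher estimates would be computed before training on each new task; the relative weights of the two regularisers determine how strongly old tasks are preserved.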


Bibliographic Details
Main Author: Dang, Zhang
Other Authors: Ponnuthurai Nagaratnam Suganthan
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University 2022
Subjects: Engineering::Electrical and electronic engineering
Online Access:https://hdl.handle.net/10356/155416
Institution: Nanyang Technological University