Modified-LwF method for continual learning

Bibliographic Details
Main Author: Dang, Zhang
Other Authors: Ponnuthurai Nagaratnam Suganthan
Format: Thesis-Master by Coursework
Language:English
Published: Nanyang Technological University 2022
Online Access:https://hdl.handle.net/10356/155416
Institution: Nanyang Technological University
Description
Summary: In this dissertation, we show that catastrophic forgetting can be mitigated with several different methods. More importantly, our method retains old tasks better by combining the original learning without forgetting (LwF) and elastic weight consolidation (EWC); the main contribution is that the merits of both EWC and LwF are brought together in a single method (Modified LwF). In addition, the upper-bound joint-training, fine-tuning, EWC, and original LwF methods are evaluated by adding new tasks one by one, with particular attention paid to the training trajectories of each algorithm. All four tasks are completed, and the fourth task is far larger than the previous three.
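The combination described above can be sketched as a single training objective: the new-task loss plus an LwF-style distillation term (matching the old model's softened outputs) plus an EWC-style Fisher-weighted penalty anchoring parameters to their previous values. The sketch below is a minimal illustration of that idea; the function names, weightings (`alpha`, `lam`, temperature `T`), and the exact formulation in the dissertation are assumptions, not the author's actual implementation.

```python
import math

def softmax(logits, T=1.0):
    """Temperature-scaled softmax over a list of logits."""
    exps = [math.exp(z / T) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def lwf_distillation_loss(old_logits, new_logits, T=2.0):
    """LwF term (sketch): cross-entropy between the old model's soft
    targets and the new model's predictions, both softened by T."""
    p_old = softmax(old_logits, T)
    p_new = softmax(new_logits, T)
    return -sum(po * math.log(pn) for po, pn in zip(p_old, p_new))

def ewc_penalty(params, old_params, fisher, lam=0.4):
    """EWC term (sketch): Fisher-weighted quadratic penalty pulling each
    parameter toward its value after the previous task."""
    return (lam / 2.0) * sum(f * (p - po) ** 2
                             for f, p, po in zip(fisher, params, old_params))

def modified_lwf_loss(task_loss, old_logits, new_logits,
                      params, old_params, fisher,
                      alpha=1.0, lam=0.4, T=2.0):
    """Hypothetical combined objective: new-task loss + LwF + EWC terms."""
    return (task_loss
            + alpha * lwf_distillation_loss(old_logits, new_logits, T)
            + ewc_penalty(params, old_params, fisher, lam))
```

In this form, the distillation term discourages drift in the network's outputs on old tasks, while the quadratic penalty discourages drift in the weights most important to them, which is one plausible reading of how the two methods' merits are put into one objective.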