Unsupervised generative variational continual learning

Continual learning aims at learning a sequence of tasks without forgetting any task. There are mainly three categories of methods in this field: replay methods, regularization-based methods, and parameter isolation methods. Recent research in continual learning generally combines two of these methods to obtain better performance. This dissertation combines regularization-based methods and parameter isolation methods to ensure that the parameters important to each task do not change drastically, while freeing up unimportant parameters so the network remains capable of learning new knowledge. While most of the existing literature on continual learning is aimed at class-incremental learning in a supervised setting, there is enormous potential for unsupervised continual learning using generative models. This dissertation proposes a combination of architectural pruning and network expansion in generative variational models toward unsupervised generative continual learning (UGCL). Evaluations on standard benchmark data sets demonstrate the superior generative ability of the proposed method.
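The two ideas the abstract combines can be illustrated with a minimal sketch. This is not the thesis's actual implementation (which operates on variational generative models); it is a hypothetical NumPy example of (1) an importance-weighted quadratic penalty, in the style of elastic weight consolidation, that keeps parameters important to earlier tasks near their old values, and (2) magnitude pruning, which marks unimportant parameters as free for learning future tasks. The array values are made up for illustration.

```python
import numpy as np

def importance_penalty(params, old_params, importance, strength=1.0):
    """EWC-style quadratic penalty: entries with large `importance`
    are pinned to their previously learned values; near-zero entries
    are left free to change when learning a new task."""
    return strength * np.sum(importance * (params - old_params) ** 2)

def prune_mask(params, keep_ratio=0.5):
    """Magnitude pruning: keep the largest-|w| fraction of parameters
    (reserved for the current task); the rest are freed for reuse."""
    k = int(np.ceil(keep_ratio * params.size))
    threshold = np.sort(np.abs(params).ravel())[-k]
    return np.abs(params) >= threshold

# Toy parameter vector after training on a new task.
params     = np.array([0.9, -0.05, 0.4, 0.01])
old_params = np.array([1.0,  0.00, 0.4, 0.00])   # values after the old task
importance = np.array([5.0,  0.01, 2.0, 0.01])   # e.g. Fisher information

# Penalty is dominated by the drift of the first (important) parameter.
penalty = importance_penalty(params, old_params, importance)

# True = kept for this task; False = pruned, available for the next task.
mask = prune_mask(params, keep_ratio=0.5)
```

In a continual-learning loop, the penalty would be added to the new task's loss during training, and the pruned (False) positions would be the ones allowed to change freely, or expanded, when the next task arrives.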

Bibliographic Details
Main Author: Liu, Guimeng
Other Authors: Ponnuthurai Nagaratnam Suganthan
Format: Thesis-Master by Coursework
Language: English
Published: Nanyang Technological University 2023
Subjects: Engineering::Electrical and electronic engineering
Online Access:https://hdl.handle.net/10356/164770
Institution: Nanyang Technological University
Record Details
Record ID: sg-ntu-dr.10356-164770
Thesis Advisor: Ponnuthurai Nagaratnam Suganthan (EPNSugan@ntu.edu.sg)
School: School of Electrical and Electronic Engineering
Funding Agency: Agency for Science, Technology and Research (A*STAR)
Subject: Engineering::Electrical and electronic engineering
Degree: Master of Science (Computer Control and Automation)
Publisher: Nanyang Technological University
Deposited: 2023-02-14
Citation: Liu, G. (2023). Unsupervised generative variational continual learning. Master's thesis, Nanyang Technological University, Singapore. https://hdl.handle.net/10356/164770
DOI: 10.1109/ICIP46576.2022.9897538
Collection: DR-NTU (NTU Library, Singapore)