Continual learning, fast and slow

According to the Complementary Learning Systems (CLS) theory (McClelland et al. 1995) in neuroscience, humans achieve effective continual learning through two complementary systems: a fast learning system centered on the hippocampus for rapid learning of the specifics of individual experiences, and a slow learning system located in the neocortex for the gradual acquisition of structured knowledge about the environment. Motivated by this theory, we propose DualNets (for Dual Networks), a general continual learning framework comprising a fast learning system for supervised learning of pattern-separated representations from specific tasks, and a slow learning system for learning task-agnostic, general representations via Self-Supervised Learning (SSL). DualNets can seamlessly incorporate both representation types into a holistic framework to facilitate better continual learning in deep neural networks. Via extensive experiments, we demonstrate the promising results of DualNets on a wide range of continual learning protocols, ranging from the standard offline, task-aware setting to the challenging online, task-free scenario. Notably, on the CTrL (Veniat et al. 2020) benchmark, which has unrelated tasks with vastly different visual images, DualNets achieves performance competitive with existing state-of-the-art dynamic-architecture strategies (Ostapenko et al. 2021). Furthermore, we conduct comprehensive ablation studies to validate DualNets' efficacy, robustness, and scalability.
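The fast/slow dual-system design described in the abstract can be sketched in a few lines: a slow, task-agnostic encoder updated with a self-supervised objective, and a fast, task-specific head updated with supervision on top of the slow features. This is only a toy illustration under assumed details (a linear encoder, a simple view-matching SSL loss, a softmax head); the names `W_slow`, `W_fast`, `ssl_step`, and `supervised_step` are hypothetical and do not reflect the paper's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

D, H, C = 8, 4, 3                             # input dim, feature dim, classes
W_slow = rng.normal(scale=0.1, size=(H, D))   # slow, task-agnostic encoder
W_fast = rng.normal(scale=0.1, size=(C, H))   # fast, task-specific head

def encode(x):
    return W_slow @ x

def ssl_step(x, lr=0.01):
    """Slow system: pull together features of two noisy views of x
    (gradient of 0.5*||z1 - z2||^2 with respect to W_slow)."""
    global W_slow
    v1 = x + rng.normal(scale=0.1, size=D)
    v2 = x + rng.normal(scale=0.1, size=D)
    z1, z2 = encode(v1), encode(v2)
    W_slow -= lr * np.outer(z1 - z2, v1 - v2)

def supervised_step(x, y, lr=0.1):
    """Fast system: one softmax-regression step on top of slow features."""
    global W_fast
    z = encode(x)
    logits = W_fast @ z
    p = np.exp(logits - logits.max())
    p /= p.sum()
    p[y] -= 1.0                               # softmax cross-entropy gradient
    W_fast -= lr * np.outer(p, z)

# Interleave slow (self-supervised) and fast (supervised) updates
for _ in range(200):
    x = rng.normal(size=D)
    y = int(x[0] > 0)                         # toy label
    ssl_step(x)
    supervised_step(x, y)
```

The point of the sketch is only the division of labor: the slow weights never see labels, while the fast weights adapt quickly to the current task using the slow system's representation.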


Saved in:
Bibliographic Details
Main Authors: PHAM, Quang Anh, LIU, Chenghao, HOI, Steven C. H.
Format: text
Language:English
Published: Institutional Knowledge at Singapore Management University 2024
Subjects: Continual learning; fast and slow learning; Artificial Intelligence and Robotics; Theory and Algorithms
Online Access:https://ink.library.smu.edu.sg/sis_research/8619
https://ink.library.smu.edu.sg/context/sis_research/article/9622/viewcontent/2209.02370_av.pdf
Institution: Singapore Management University
Record ID: sg-smu-ink.sis_research-9622
Record format: dspace
Date available: 2024-01-01
DOI: 10.1109/TPAMI.2023.3324203
License: http://creativecommons.org/licenses/by-nc-nd/4.0/
Collection: Research Collection School Of Computing and Information Systems (InK@SMU, SMU Libraries)
Subjects: Continual learning; fast and slow learning; Artificial Intelligence and Robotics; Theory and Algorithms