Class-incremental learning on multivariate time series via shape-aligned temporal distillation

Class-incremental learning (CIL) on multivariate time series (MTS) is an important yet understudied problem. Motivated by practical privacy-sensitive settings, we propose a novel distillation-based strategy that uses a single-headed classifier without storing historical samples. We exploit Soft Dynamic Time Warping (Soft-DTW) for knowledge distillation, aligning feature maps along the temporal dimension before computing their discrepancy. Compared with the Euclidean distance, Soft-DTW better overcomes catastrophic forgetting and balances the stability-plasticity dilemma. We construct two novel MTS-CIL benchmarks for comprehensive experiments. Combined with a prototype augmentation strategy, our framework significantly outperforms other prominent exemplar-free algorithms.

Bibliographic Details
Main Authors: Qiao, Zhongzheng, Hu, Minghui, Jiang, Xudong, Suganthan, Ponnuthurai Nagaratnam, Savitha, Ramasamy
Other Authors: School of Electrical and Electronic Engineering
Format: Conference or Workshop Item
Language:English
Published: 2023
Subjects:
Online Access:https://hdl.handle.net/10356/165392
Institution: Nanyang Technological University
id: sg-ntu-dr.10356-165392
record_format: dspace
Conference: 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2023)
Affiliations: School of Electrical and Electronic Engineering; Interdisciplinary Graduate School (IGS); Institute for Infocomm Research, A*STAR; CNRS@CREATE LTD, Singapore; Energy Research Institute @ NTU (ERI@N)
Subjects: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence; Continual Learning; Multivariate Time Series Classification; Knowledge Distillation; Dynamic Time Warping
Funding: National Research Foundation (NRF). This research is part of the programme DesCartes and is supported by the National Research Foundation, Prime Minister's Office, Singapore under its Campus for Research Excellence and Technological Enterprise (CREATE) programme.
Version: Submitted/Accepted version
Deposited: 2023-05-23
Published: 2023
Citation: Qiao, Z., Hu, M., Jiang, X., Suganthan, P. N. & Savitha, R. (2023). Class-incremental learning on multivariate time series via shape-aligned temporal distillation. 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP 2023). https://dx.doi.org/10.1109/ICASSP49357.2023.10094960
ISBN: 978-1-7281-6327-7
DOI: 10.1109/ICASSP49357.2023.10094960
Handle: https://hdl.handle.net/10356/165392
Rights: © 2023 IEEE. Personal use of this material is permitted. Permission from IEEE must be obtained for all other uses, in any current or future media, including reprinting/republishing this material for advertising or promotional purposes, creating new collective works, for resale or redistribution to servers or lists, or reuse of any copyrighted component of this work in other works. The published version is available at: https://doi.org/10.1109/ICASSP49357.2023.10094960.
Format: application/pdf
institution: Nanyang Technological University
building: NTU Library
continent: Asia
country: Singapore
content_provider: NTU Library
collection: DR-NTU
language: English
topic: Engineering::Computer science and engineering::Computing methodologies::Artificial intelligence; Continual Learning; Multivariate Time Series Classification; Knowledge Distillation; Dynamic Time Warping
description Class-incremental learning (CIL) on multivariate time series (MTS) is an important yet understudied problem. Motivated by practical privacy-sensitive settings, we propose a novel distillation-based strategy that uses a single-headed classifier without storing historical samples. We exploit Soft Dynamic Time Warping (Soft-DTW) for knowledge distillation, aligning feature maps along the temporal dimension before computing their discrepancy. Compared with the Euclidean distance, Soft-DTW better overcomes catastrophic forgetting and balances the stability-plasticity dilemma. We construct two novel MTS-CIL benchmarks for comprehensive experiments. Combined with a prototype augmentation strategy, our framework significantly outperforms other prominent exemplar-free algorithms.
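The abstract's key mechanism, distilling through a Soft-DTW discrepancy that aligns feature maps in time before measuring their difference, can be illustrated as follows. This is a minimal NumPy sketch of the Soft-DTW recursion (the gamma-smoothed minimum over alignment paths), not the authors' implementation; the feature maps, `gamma` value, and variable names are made up for the demo.

```python
# Minimal Soft-DTW discrepancy between two feature maps, sketching the
# shape-aligned distillation idea from the abstract. Names like old_feat and
# new_feat are illustrative only, not from the paper's code.
import numpy as np

def soft_min(values, gamma):
    """Smoothed minimum: -gamma * log(sum(exp(-v / gamma))), computed stably."""
    v = np.asarray(values) / -gamma
    m = v.max()
    return -gamma * (m + np.log(np.exp(v - m).sum()))

def soft_dtw(x, y, gamma=0.1):
    """Soft-DTW discrepancy between sequences x of shape (n, d) and y of (m, d)."""
    n, m = len(x), len(y)
    # Pairwise squared-Euclidean cost between time steps.
    cost = ((x[:, None, :] - y[None, :, :]) ** 2).sum(-1)
    R = np.full((n + 1, m + 1), np.inf)
    R[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            R[i, j] = cost[i - 1, j - 1] + soft_min(
                [R[i - 1, j], R[i, j - 1], R[i - 1, j - 1]], gamma)
    return R[n, m]

# Two feature maps that differ mainly by a temporal shift: Soft-DTW stays small
# where a frame-wise Euclidean distance penalizes the misalignment heavily.
t = np.linspace(0, 2 * np.pi, 50)
old_feat = np.stack([np.sin(t), np.cos(t)], axis=1)
new_feat = np.stack([np.sin(t + 0.3), np.cos(t + 0.3)], axis=1)
euclid = ((old_feat - new_feat) ** 2).sum()
print(soft_dtw(old_feat, new_feat), euclid)
```

The alignment step is what distinguishes this distillation target from a plain Euclidean feature-matching loss: a small temporal drift in the new model's feature maps is absorbed by the warping path instead of being penalized as forgetting.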