Learning elastic memory online for fast time series forecasting

It is well known that any kind of time series algorithm requires past information to model the inherent temporal relationship between past and future. This temporal dependency (i.e., the number of past samples required for a good prediction) is generally addressed by feeding a number of past instances to the model in an empirical manner. Conventional approaches mostly rely on offline models, making them impractical to adopt in the online or streaming context. Hence, a novel method of online temporality analysis is proposed in this paper. The estimated temporality is then employed to form an Adaptive Temporal Neural Network (ATNN) with an elastic memory capable of automatically selecting the number of past samples to be used. Temporality change, or drift, can be a common occurrence in data streams; hence, a drift detection mechanism is also proposed. Once such drift is detected, a drift handling mechanism kicks in which utilizes the rate of drift, making the solution truly autonomous. The entire mechanism is termed LEMON: Learning Elastic Memory Online. LEMON, although not a time series model in itself, can work with any predictive model to improve its performance. Synthetic datasets are used as proof of correct temporality estimation and drift detection, whereas real-world datasets are employed to demonstrate how LEMON improves the predictive performance and speed of an existing model with the knowledge of temporality and drift.
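The general idea the abstract describes — estimate how many past samples ("temporality") a series needs, feed that many lags to a predictor, and re-estimate when drift is detected — can be illustrated with a small sketch. Everything below is a hypothetical illustration of that idea, not the authors' LEMON or ATNN algorithm: the autocorrelation-based temporality estimate, the error-ratio drift test, and the names `estimate_temporality` and `ElasticMemoryForecaster` are all assumptions made for this example.

```python
# Hypothetical sketch of an elastic-memory forecaster (not the paper's method):
# pick a lag window from the autocorrelation function, fit a simple linear
# AR predictor on that window, and re-estimate the window when the recent
# prediction error drifts far above its long-run level.
import numpy as np

def estimate_temporality(series, max_lag=20, threshold=0.2):
    """Return the largest lag whose autocorrelation still exceeds `threshold`."""
    x = np.asarray(series, dtype=float)
    x = x - x.mean()
    denom = np.dot(x, x)
    best = 1
    for lag in range(1, min(max_lag, len(x) - 1) + 1):
        acf = np.dot(x[:-lag], x[lag:]) / denom
        if abs(acf) > threshold:
            best = lag
    return best

class ElasticMemoryForecaster:
    """One-step-ahead linear AR predictor with an adaptive lag window."""
    def __init__(self, warmup):
        self.history = list(warmup)
        self.p = estimate_temporality(self.history)  # current memory length
        self.errors = []                             # recent absolute errors
        self._fit()

    def _fit(self):
        # Least-squares AR(p) fit on the buffered history.
        x = np.asarray(self.history)
        X = np.array([x[i:i + self.p] for i in range(len(x) - self.p)])
        y = x[self.p:]
        self.w, *_ = np.linalg.lstsq(X, y, rcond=None)

    def predict(self):
        return float(np.dot(self.w, self.history[-self.p:]))

    def update(self, observed):
        self.errors.append(abs(self.predict() - observed))
        self.history.append(observed)
        # Crude drift test: recent error far above the long-run average.
        if len(self.errors) > 20:
            recent = np.mean(self.errors[-5:])
            base = np.mean(self.errors[:-5])
            if recent > 3 * base:                    # drift detected
                self.p = estimate_temporality(self.history[-100:])
                self.errors = []
        self._fit()
```

In this sketch the "elastic memory" is simply the lag count `p`: it is chosen from the data rather than fixed empirically, and re-chosen after a detected drift, which is the behavior the abstract attributes to LEMON at a much higher level of sophistication.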

Bibliographic Details
Main Authors: Samanta, Subhrajit, Pratama, Mahardhika, Sundaram, Suresh, Srikanth, Narasimalu
Other Authors: School of Computer Science and Engineering
Format: Article
Language:English
Published: 2022
Subjects: Engineering::Computer science and engineering; Time Series Forecasting; Temporality Determination
Online Access:https://hdl.handle.net/10356/160973
Institution: Nanyang Technological University
Record ID: sg-ntu-dr.10356-160973
Affiliations: School of Computer Science and Engineering; Interdisciplinary Graduate School (IGS); Energy Research Institute @ NTU (ERI@N)
Citation: Samanta, S., Pratama, M., Sundaram, S. & Srikanth, N. (2020). Learning elastic memory online for fast time series forecasting. Neurocomputing, 390, 315-326.
DOI: 10.1016/j.neucom.2019.07.105
ISSN: 0925-2312
Scopus ID: 2-s2.0-85074473681
Rights: © 2019 Elsevier B.V. All rights reserved.
Funding: We would like to sincerely thank the Energy Research Institute at Nanyang Technological University (ERI@N), Singapore, for their continued support. The work was funded by their SMES project.
Collection: DR-NTU (NTU Library, Nanyang Technological University, Singapore)