Self-consistent learning of neural dynamical systems from noisy time series

We introduce a new method that, for a single noisy time series, provides unsupervised filtering, state space reconstruction, efficient learning of the unknown governing multivariate dynamical system, and deterministic forecasting. We construct both the underlying trajectories and a latent dynamical system using deep neural networks. Under the assumption that the trajectories follow the latent dynamical system, we determine the unknowns of the dynamical system and filter out stochastic outliers in the measurements. In this sense the method is self-consistent. The embedding dimension is determined iteratively during training using the false-nearest-neighbors algorithm and is implemented as an attention map on the state vector. This allows for a state space reconstruction without a priori information on the signal. By exploiting the differentiability of the neural solution trajectory, we can define the neural dynamical system locally at each time, mitigating the need for the forward and backward passes through the numerical solvers of the canonical adjoint method. On a chaotic time series masked by additive Gaussian noise, we demonstrate that the denoising ability and the predictive power of the proposed method are due mainly to the self-consistency and are insensitive to the method used for the state space reconstruction.
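To make the self-consistency idea concrete, the sketch below (a minimal illustration, not the authors' implementation) trains a neural trajectory network together with a neural vector-field network so that the trajectory fits the noisy observations while its time derivative, obtained by automatic differentiation rather than an ODE solver, matches the vector field along the trajectory. All names (traj_net, field_net), the fixed embedding dimension, the loss weight, and the toy sine data are illustrative assumptions; the paper itself selects the embedding dimension with the false-nearest-neighbors algorithm and applies it as an attention map.

# Minimal sketch of the self-consistency idea (hypothetical names, not the authors' code).
# A trajectory network x_theta: t -> R^d and a vector-field network f_phi: R^d -> R^d are
# trained jointly so that (i) the first state component fits the noisy series and
# (ii) dx/dt, taken by autograd instead of a numerical solver, matches f_phi(x).
import torch
import torch.nn as nn

d = 3  # assumed embedding dimension (the paper determines it via false nearest neighbors)

traj_net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, d))   # t -> x(t)
field_net = nn.Sequential(nn.Linear(d, 64), nn.Tanh(), nn.Linear(64, d))  # x -> dx/dt

def trajectory_and_derivative(t):
    # dx/dt of the neural trajectory via automatic differentiation (no ODE solver).
    t = t.requires_grad_(True)
    x = traj_net(t)                                                        # (N, d)
    cols = [torch.autograd.grad(x[:, j].sum(), t, create_graph=True)[0]
            for j in range(d)]                                             # each (N, 1)
    return x, torch.cat(cols, dim=1)                                       # (N, d), (N, d)

def loss_fn(t, y, lam=1.0):
    x, dxdt = trajectory_and_derivative(t)
    data_fit = ((x[:, :1] - y) ** 2).mean()            # trajectory reproduces the noisy data
    consistency = ((dxdt - field_net(x)) ** 2).mean()  # trajectory obeys the latent dynamics
    return data_fit + lam * consistency

# Toy noisy series standing in for a measured signal.
t_obs = torch.linspace(0.0, 10.0, 500).unsqueeze(1)
y_obs = torch.sin(t_obs) + 0.1 * torch.randn_like(t_obs)
opt = torch.optim.Adam(list(traj_net.parameters()) + list(field_net.parameters()), lr=1e-3)
for step in range(2000):
    opt.zero_grad()
    loss = loss_fn(t_obs, y_obs)
    loss.backward()
    opt.step()

After training, the filtered signal is the learned trajectory evaluated at the observation times, and forecasting could proceed by integrating field_net forward from the last reconstructed state.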

Bibliographic Details
Main Authors: Wang, Zhe, Guet, Claude
Other Authors: School of Physical and Mathematical Sciences; Energy Research Institute @ NTU (ERI@N)
Format: Article
Language: English
Published: 2022
Subjects: Engineering::Computer science and engineering; Time Series Analysis; Dynamical Systems
Online Access: https://hdl.handle.net/10356/162829
Institution: Nanyang Technological University
Citation: Wang, Z. & Guet, C. (2022). Self-consistent learning of neural dynamical systems from noisy time series. IEEE Transactions on Emerging Topics in Computational Intelligence, 6(5), 1103-1112. https://dx.doi.org/10.1109/TETCI.2022.3146332
Journal: IEEE Transactions on Emerging Topics in Computational Intelligence
ISSN: 2471-285X
DOI: 10.1109/TETCI.2022.3146332
Scopus ID: 2-s2.0-85124820022
Funding Agencies: Nanyang Technological University; National Research Foundation (NRF)
Funding: The work of Zhe Wang was supported by the Energy Research Institute @ NTU (ERI@N), Nanyang Technological University, where most of this work was performed. This work was supported by the National Research Foundation, Prime Minister's Office, Singapore, under its Campus for Research Excellence and Technological Enterprise (CREATE) programme.
Rights: © 2022 IEEE. All rights reserved.
Collection: DR-NTU (NTU Library)