Dimension reduction in recurrent networks by canonicalization
Many recurrent neural network machine learning paradigms can be formulated using state-space representations. The classical notion of canonical state-space realization is adapted in this paper to accommodate semi-infinite inputs so that it can be used as a dimension reduction tool in the recurrent n...
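The abstract refers to state-space representations of recurrent networks. As a purely illustrative sketch (not the paper's construction), a discrete-time state-space recurrence can be written as x_t = f(A x_{t-1} + B u_t) with readout y_t = C x_t; the hypothetical helper below iterates such a system:

```python
import numpy as np

def run_state_space(A, B, C, inputs, x0=None):
    """Iterate the recurrence x_t = tanh(A x_{t-1} + B u_t) with
    linear readout y_t = C x_t, a simple echo-state-style system.
    `inputs` is an array of input vectors u_1, ..., u_T."""
    x = np.zeros(A.shape[0]) if x0 is None else x0
    outputs = []
    for u in inputs:
        x = np.tanh(A @ x + B @ u)   # state update
        outputs.append(C @ x)        # readout
    return np.array(outputs)
```

In this toy setting, the state dimension is the number of rows of A; canonical (minimal) realizations, as studied in the article, replace such a system with one whose state dimension cannot be reduced without changing the input/output behavior.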
Main Authors: Grigoryeva, Lyudmila; Ortega, Juan-Pablo
Other Authors: School of Physical and Mathematical Sciences
Format: Article
Language: English
Published: 2022
Online Access: https://hdl.handle.net/10356/161577
Institution: Nanyang Technological University
Similar Items
- Infinite-dimensional reservoir computing
  by: Gonon, Lukas, et al. Published: (2024)
- Learning strange attractors with reservoir systems
  by: Grigoryeva, Lyudmila, et al. Published: (2023)
- Training issues and learning algorithms for feedforward and recurrent neural networks
  by: TEOH EU JIN. Published: (2010)
- Temporal Spiking Recurrent Neural Network for Action Recognition
  by: Wang, W., et al. Published: (2022)
- Transport in reservoir computing
  by: Manjunath, G., et al. Published: (2023)