Infinite-dimensional reservoir computing
Reservoir computing approximation and generalization bounds are proved for a new concept class of input/output systems that extends the so-called generalized Barron functionals to a dynamic context. This new class is characterized by readouts with a certain integral representation built on infinite-dimensional state-space systems. It is shown that this class is very rich and possesses useful features and universal approximation properties. The reservoir architectures used for the approximation and estimation of elements in the new class are randomly generated echo state networks with either linear or ReLU activation functions. Their readouts are built using randomly generated neural networks in which only the output layer is trained (extreme learning machines or random feature neural networks). The results in the paper yield a recurrent neural network-based learning algorithm with provable convergence guarantees that do not suffer from the curse of dimensionality when learning input/output systems in the class of generalized Barron functionals and measuring the error in a mean-squared sense.
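The architecture summarized in the abstract admits a compact numerical illustration. The sketch below is not code from the paper: it builds a randomly generated echo state network with ReLU activation, passes its states through a fixed random feature layer, and trains only the final linear output layer by ridge regression. All dimensions, scaling constants, the ridge penalty, the function names, and the toy filtering task are illustrative assumptions.

```python
# Minimal sketch of the reservoir architecture described in the abstract:
# a randomly generated echo state network (fixed recurrent and input weights)
# followed by a random feature readout in which only the output layer is trained.
# Dimensions, scalings, and the ridge penalty are illustrative, not values from the paper.
import numpy as np

rng = np.random.default_rng(0)

def run_reservoir(inputs, n_states=200, spectral_scale=0.9, input_scale=0.5):
    """Iterate x_t = ReLU(A x_{t-1} + C z_t + b) with fixed random A, C, b."""
    n_in = inputs.shape[1]
    A = rng.normal(size=(n_states, n_states))
    A *= spectral_scale / np.max(np.abs(np.linalg.eigvals(A)))  # keep the recurrence contractive
    C = rng.normal(size=(n_states, n_in)) * input_scale
    b = 0.1 * rng.normal(size=n_states)
    x = np.zeros(n_states)
    states = []
    for z in inputs:
        x = np.maximum(A @ x + C @ z + b, 0.0)  # ReLU activation, fixed random weights
        states.append(x.copy())
    return np.asarray(states)

def random_features(states, n_features=300):
    """Extreme-learning-machine layer: fixed random hidden weights, never trained."""
    d = states.shape[1]
    B = rng.normal(size=(n_features, d)) / np.sqrt(d)
    c = rng.uniform(-1.0, 1.0, size=n_features)
    return np.maximum(states @ B.T + c, 0.0)

def train_readout(features, targets, ridge=1e-4):
    """The only trained component: a linear output layer fitted by ridge regression."""
    k = features.shape[1]
    return np.linalg.solve(features.T @ features + ridge * np.eye(k), features.T @ targets)

# Toy usage: learn a short linear filter of the input signal (hypothetical target).
z = rng.normal(size=(500, 1))
y = 0.6 * z + 0.3 * np.vstack([np.zeros((1, 1)), z[:-1]])
H = random_features(run_reservoir(z))
W = train_readout(H, y)
print("in-sample MSE:", np.mean((H @ W - y) ** 2))
```

Only `W` is fitted; the reservoir matrices and the hidden feature weights stay at their random draws, so training reduces to a single linear regression rather than back-propagation through time, in line with the abstract's description of the trained output layer.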
Main Authors: | Gonon, Lukas; Grigoryeva, Lyudmila; Ortega, Juan-Pablo |
---|---|
Other Authors: | School of Physical and Mathematical Sciences |
Format: | Article |
Language: | English |
Published: | 2024 |
Subjects: | Mathematical Sciences; Recurrent neural network; Reservoir computing |
Online Access: | https://hdl.handle.net/10356/180739 |
Institution: | Nanyang Technological University |
id |
sg-ntu-dr.10356-180739 |
record_format |
dspace |
spelling |
sg-ntu-dr.10356-180739, 2024-10-22T06:18:40Z. Infinite-dimensional reservoir computing. Gonon, Lukas; Grigoryeva, Lyudmila; Ortega, Juan-Pablo. School of Physical and Mathematical Sciences. Subjects: Mathematical Sciences; Recurrent neural network; Reservoir computing. The authors acknowledge partial financial support coming from the Swiss National Science Foundation (grant number 200021_175801/1). Accessioned and available: 2024-10-22T06:18:40Z; issued: 2024. Journal Article. Citation: Gonon, L., Grigoryeva, L. & Ortega, J. (2024). Infinite-dimensional reservoir computing. Neural Networks, 179, 106486-. https://dx.doi.org/10.1016/j.neunet.2024.106486. ISSN: 0893-6080. Handle: https://hdl.handle.net/10356/180739. DOI: 10.1016/j.neunet.2024.106486. PMID: 38986185. Scopus: 2-s2.0-85197740765. Volume 179, article 106486. Language: en. Journal: Neural Networks. © 2024 Elsevier Ltd. All rights are reserved, including those for text and data mining, AI training, and similar technologies. |
institution |
Nanyang Technological University |
building |
NTU Library |
continent |
Asia |
country |
Singapore |
content_provider |
NTU Library |
collection |
DR-NTU |
language |
English |
topic |
Mathematical Sciences; Recurrent neural network; Reservoir computing |
description |
Reservoir computing approximation and generalization bounds are proved for a new concept class of input/output systems that extends the so-called generalized Barron functionals to a dynamic context. This new class is characterized by the readouts with a certain integral representation built on infinite-dimensional state-space systems. It is shown that this class is very rich and possesses useful features and universal approximation properties. The reservoir architectures used for the approximation and estimation of elements in the new class are randomly generated echo state networks with either linear or ReLU activation functions. Their readouts are built using randomly generated neural networks in which only the output layer is trained (extreme learning machines or random feature neural networks). The results in the paper yield a recurrent neural network-based learning algorithm with provable convergence guarantees that do not suffer from the curse of dimensionality when learning input/output systems in the class of generalized Barron functionals and measuring the error in a mean-squared sense. |
author2 |
School of Physical and Mathematical Sciences |
format |
Article |
author |
Gonon, Lukas; Grigoryeva, Lyudmila; Ortega, Juan-Pablo |
author_sort |
Gonon, Lukas |
title |
Infinite-dimensional reservoir computing |
publishDate |
2024 |
url |
https://hdl.handle.net/10356/180739 |
_version_ |
1814777759376867328 |