Existence of reservoir with finite-dimensional output for universal reservoir computing.
Keywords
Machine learning
Neural network
Nonlinear dynamical system
Reservoir computing
Journal
Scientific Reports
ISSN: 2045-2322
Abbreviated title: Sci Rep
Country: England
NLM ID: 101563288
Publication information
Publication date: 11 Apr 2024
History:
received: 04 Sep 2023
accepted: 11 Mar 2024
medline: 11 Apr 2024
pubmed: 11 Apr 2024
entrez: 10 Apr 2024
Status: epublish
Abstract
In this paper, we prove the existence of a reservoir that has a finite-dimensional output and makes the reservoir computing model universal. Reservoir computing is a method for dynamical system approximation that trains the static part of a model but fixes the dynamical part, called the reservoir. Hence, reservoir computing has the advantage of training models at a low computational cost. Moreover, fixed reservoirs can be implemented as physical systems; such reservoirs have attracted attention for their computation speed and energy consumption. The universality of a reservoir computing model is its ability to approximate an arbitrary system with arbitrary accuracy. Two sufficient conditions on the reservoir for the model to be universal have been proposed. The first is the combination of fading memory and the separation property. The second is the neighborhood separation property, which we proposed recently. To date, it has been unknown whether a reservoir with a finite-dimensional output can satisfy either condition. In this study, we prove that no reservoir with a finite-dimensional output satisfies the former condition. By contrast, we propose a single-output reservoir that satisfies the latter condition. This implies that, for any specified dimension, there exists a reservoir with an output of that dimension that makes the model universal. These results clarify the practical importance of our proposed conditions.
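The abstract's central mechanism, training only the static readout while the dynamical reservoir stays fixed, can be made concrete with a minimal echo state network sketch. This is an illustrative example, not the paper's finite-dimensional-output construction; the reservoir size, spectral radius, ridge parameter, and toy prediction task are all assumptions made for the demonstration.

```python
# Minimal echo state network: the recurrent "reservoir" is drawn once and
# frozen; training touches only the linear readout (a ridge-regression solve).
import numpy as np

rng = np.random.default_rng(0)
n_in, n_res = 1, 100                        # input / reservoir dimensions (assumed)

W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.normal(0.0, 1.0, (n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))   # scale spectral radius below 1

def run_reservoir(u):
    """Drive the fixed (untrained) reservoir with an input sequence u of shape (T, n_in)."""
    x = np.zeros(n_res)
    states = []
    for u_t in u:
        x = np.tanh(W @ x + W_in @ u_t)     # dynamics are never updated by training
        states.append(x.copy())
    return np.array(states)                 # shape (T, n_res)

# Toy task: one-step-ahead prediction of a noisy sine wave.
t = np.linspace(0, 40 * np.pi, 4000)
u = np.sin(t).reshape(-1, 1) + 0.01 * rng.normal(size=(4000, 1))
X, Y = run_reservoir(u[:-1]), u[1:]

# The static part: a single least-squares solve fits the readout W_out.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ Y)
print("train MSE:", np.mean((X @ W_out - Y) ** 2))
```

Because only W_out is learned, training reduces to one linear solve; this is the low computational cost the abstract refers to, and it is also why the fixed reservoir can in principle be replaced by a physical system.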
Identifiers
pubmed: 38600157
doi: 10.1038/s41598-024-56742-7
pii: 10.1038/s41598-024-56742-7
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Pagination
8448
Grants
Agency: Japan Society for the Promotion of Science
ID: JP22K04027
Agency: JST FOREST Program
ID: JPMJFR2123
Copyright information
© 2024. The Author(s).
References
Jaeger, H. The “echo state” approach to analysing and training recurrent neural networks—with an erratum note. German National Research Center for Information Technology GMD Technical Report, 148.34 (2001).
Maass, W., Natschläger, T. & Markram, H. Real-time computing without stable states: A new framework for neural computation based on perturbations. Neural Comput. 14(11), 2531–2560 (2002).
doi: 10.1162/089976602760407955
pubmed: 12433288
Steil, J. J. Backpropagation-decorrelation: Online recurrent learning with O(N) complexity. In 2004 IEEE International Joint Conference on Neural Networks 843–848 (2004).
Verstraeten, D., Schrauwen, B., D’Haene, M. & Stroobandt, D. An experimental unification of reservoir computing methods. Neural Netw. 20(3), 391–403 (2007).
doi: 10.1016/j.neunet.2007.04.003
pubmed: 17517492
Lukoševičius, M. & Jaeger, H. Reservoir computing approaches to recurrent neural network training. Comput. Sci. Rev. 3(3), 127–149 (2009).
doi: 10.1016/j.cosrev.2009.03.005
Williams, R. J. & Zipser, D. A learning algorithm for continually running fully recurrent neural networks. Neural Comput. 1(2), 270–280 (1989).
doi: 10.1162/neco.1989.1.2.270
Werbos, P. J. Backpropagation through time: What it does and how to do it. Proc. IEEE 78(10), 1550–1560 (1990).
doi: 10.1109/5.58337
Tanaka, G. et al. Recent advances in physical reservoir computing: A review. Neural Netw. 115, 100–123 (2019).
doi: 10.1016/j.neunet.2019.03.005
pubmed: 30981085
Friedman, J. S. Unsupervised learning & reservoir computing leveraging analog spintronic phenomena. In 2021 IEEE 16th Nanotechnology Materials and Devices Conference 1–2 (2021).
Stelzer, F., Röhm, A., Lüdge, K. & Yanchuk, S. Performance boost of time-delay reservoir computing by non-resonant clock cycle. Neural Netw. 124, 158–169 (2020).
doi: 10.1016/j.neunet.2020.01.010
pubmed: 32006747
Soriano, M. C. et al. Delay-based reservoir computing: Noise effects in a combined analog and digital implementation. IEEE Trans. Neural Netw. Learn. Syst. 26(2), 388–393 (2015).
doi: 10.1109/TNNLS.2014.2311855
Dong, J., Rafayelyan, M., Krzakala, F. & Gigan, S. Optical reservoir computing using multiple light scattering for chaotic systems prediction. IEEE J. Sel. Top. Quantum Electron. 26(1), 1–12 (2020).
doi: 10.1109/JSTQE.2019.2936281
Nakajima, K. & Fischer, I. Reservoir Computing (Springer, 2021).
doi: 10.1007/978-981-13-1687-6
Grigoryeva, L. & Ortega, J. P. Echo state networks are universal. Neural Netw. 108, 495–508 (2018).
doi: 10.1016/j.neunet.2018.08.025
pubmed: 30317134
Gonon, L. & Ortega, J. P. Reservoir computing universality with stochastic inputs. IEEE Trans. Neural Netw. Learn. Syst. 31(1), 100–112 (2020).
doi: 10.1109/TNNLS.2019.2899649
pubmed: 30892244
Sugiura, S., Ariizumi, R., Asai, T. & Azuma, S. Nonessentiality of reservoir's fading memory for universality of reservoir computing. IEEE Trans. Neural Netw. Learn. Syst. https://doi.org/10.1109/TNNLS.2023.3298013 (2023).
Fernando, C. & Sojakka, S. Pattern recognition in a bucket. In European Conference on Artificial Life 588–597 (Springer, 2003).
Boyd, S. & Chua, L. O. Fading memory and the problem of approximating nonlinear operators with Volterra series. IEEE Trans. Circuits Syst. 32(11), 1150–1161 (1985).
doi: 10.1109/TCS.1985.1085649
Engelking, R. Dimension Theory (North-Holland Publishing Company, 1978).
Hocking, J. G. & Young, G. S. Topology (Addison-Wesley Publishing Company, 1961).
Jensen, J. H. & Tufte, G. Reservoir computing with a chaotic circuit. In Artificial Life Conference Proceedings 222–229 (MIT Press, 2017).
Choi, J. & Kim, P. Reservoir computing based on quenched chaos. Chaos Solitons Fractals 140, 110131 (2020).
doi: 10.1016/j.chaos.2020.110131