Rotating neurons for all-analog implementation of cyclic reservoir computing.
Journal
Nature Communications
ISSN: 2041-1723
Abbreviated title: Nat Commun
Country: England
NLM ID: 101528555
Publication information
Date of publication: 2022-03-23
History:
received: 2021-08-28
accepted: 2022-02-28
entrez: 2022-03-24
pubmed: 2022-03-25
medline: 2022-04-13
Status: epublish
Abstract
Hardware implementation in resource-efficient reservoir computing is of great interest for neuromorphic engineering. Recently, various devices have been explored to implement hardware-based reservoirs. However, most studies were mainly focused on the reservoir layer, whereas an end-to-end reservoir architecture has yet to be developed. Here, we propose a versatile method for implementing cyclic reservoirs using rotating elements integrated with signal-driven dynamic neurons, whose equivalence to standard cyclic reservoir algorithm is mathematically proven. Simulations show that the rotating neuron reservoir achieves record-low errors in a nonlinear system approximation benchmark. Furthermore, a hardware prototype was developed for near-sensor computing, chaotic time-series prediction and handwriting classification. By integrating a memristor array as a fully-connected output layer, the all-analog reservoir computing system achieves 94.0% accuracy, while simulation shows >1000× lower system-level power than prior works. Therefore, our work demonstrates an elegant rotation-based architecture that explores hardware physics as computational resources for high-performance reservoir computing.
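For readers unfamiliar with the cyclic reservoir formulation that the rotating-neuron hardware is proven equivalent to, the sketch below shows a minimal software cyclic (ring-topology) echo state reservoir with a linear readout trained by ridge regression, in the role the memristor output layer plays in hardware. It is only an illustrative reference implementation under assumed parameters (reservoir size N, ring weight r, input weight v, and a toy sine-prediction task), not the paper's actual rotating-neuron or memristor implementation.

    # Minimal sketch of a cyclic (ring-topology) reservoir with a linear readout.
    # Assumptions: N, r, v and the toy task are illustrative, not from the paper.
    import numpy as np

    rng = np.random.default_rng(0)

    N = 100   # number of reservoir neurons (assumed)
    r = 0.9   # ring (cycle) weight, sets the effective spectral radius
    v = 0.5   # input weight magnitude

    # Cyclic reservoir: each neuron feeds only its neighbour on a ring.
    W = r * np.roll(np.eye(N), 1, axis=0)
    # Input weights with fixed magnitude and pseudo-random signs.
    W_in = v * rng.choice([-1.0, 1.0], size=N)

    def run_reservoir(u):
        """Collect reservoir states for a 1-D input sequence u."""
        x = np.zeros(N)
        states = []
        for u_t in u:
            x = np.tanh(W @ x + W_in * u_t)
            states.append(x.copy())
        return np.array(states)

    # Toy task: one-step-ahead prediction of a noisy sine wave.
    u = np.sin(0.2 * np.arange(2000)) + 0.01 * rng.standard_normal(2000)
    X = run_reservoir(u[:-1])
    y = u[1:]

    # Linear readout trained by ridge regression (the fully-connected output
    # layer that the paper realizes with a memristor array in hardware).
    lam = 1e-6
    W_out = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)
    print("train NRMSE:", np.sqrt(np.mean((X @ W_out - y) ** 2)) / np.std(y))

The deterministic ring topology is what makes this reservoir attractive for hardware: the state update is a fixed cyclic shift plus an input injection, so only the readout weights need to be trained.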
Identifiers
pubmed: 35322037
doi: 10.1038/s41467-022-29260-1
pii: 10.1038/s41467-022-29260-1
pmc: PMC8943160
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination
1549
Copyright information
© 2022. The Author(s).