Synaptic Plasticity Dynamics for Deep Continuous Local Learning (DECOLLE).
backpropagation
embedded learning
neuromorphic hardware
spiking neural network
surrogate gradient algorithm
Journal
Frontiers in Neuroscience
ISSN: 1662-4548
Abbreviated title: Front Neurosci
Country: Switzerland
NLM ID: 101478481
Publication information
Publication date: 2020
History:
received: 2019/11/27
accepted: 2020/04/07
entrez: 2020/06/02
pubmed: 2020/06/02
medline: 2020/06/02
Status: epublish
Abstract
A growing body of work underlines striking similarities between biological neural networks and recurrent, binary neural networks. A relatively smaller body of work, however, addresses the similarities between learning dynamics employed in deep artificial neural networks and synaptic plasticity in spiking neural networks. This challenge stems largely from the discrepancy between the dynamical properties of synaptic plasticity and the requirements of gradient backpropagation. Learning algorithms that approximate gradient backpropagation using local error functions can overcome this challenge. Here, we introduce Deep Continuous Local Learning (DECOLLE), a spiking neural network equipped with local error functions for online learning with no memory overhead for computing gradients. DECOLLE is capable of learning deep spatio-temporal representations from spikes relying solely on local information, making it compatible with neurobiology and neuromorphic hardware. Synaptic plasticity rules are derived systematically from user-defined cost functions and neural dynamics by leveraging existing autodifferentiation methods of machine learning frameworks. We benchmark our approach on the event-based neuromorphic datasets N-MNIST and DvsGesture, on which DECOLLE performs comparably to the state-of-the-art. DECOLLE networks provide continuously learning machines that are relevant to biology and supportive of event-based, low-power computer vision architectures, matching the accuracies of conventional computers on tasks where temporal precision and speed are essential.
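The abstract's core idea can be sketched in code: a leaky integrate-and-fire layer whose weights are trained by a three-factor rule combining a local error (computed through a fixed random readout, not through backpropagated gradients), a surrogate gradient evaluated at the membrane potential, and a presynaptic eligibility trace. The sketch below is a minimal illustration, not the authors' implementation; all sizes, decay constants, the fast-sigmoid surrogate, and the constant pseudo-target are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and constants, chosen only for illustration.
n_in, n_hid, n_out = 8, 16, 4
T = 50                        # number of time steps
alpha, beta = 0.9, 0.85       # membrane / synaptic trace decay
theta = 1.0                   # firing threshold
lr = 1e-2                     # learning rate

W = rng.normal(0.0, 0.3, (n_hid, n_in))   # plastic feedforward weights
G = rng.normal(0.0, 0.3, (n_out, n_hid))  # fixed random readout (never trained)

def surrogate_grad(u):
    # Fast-sigmoid-style pseudo-derivative of the non-differentiable
    # spike nonlinearity, evaluated at the membrane potential.
    return 1.0 / (1.0 + np.abs(u - theta)) ** 2

x_spikes = (rng.random((T, n_in)) < 0.3).astype(float)  # random input spikes
target = np.zeros(n_out)
target[1] = 1.0               # constant pseudo-target for the local readout

u = np.zeros(n_hid)           # membrane potentials
p = np.zeros(n_in)            # presynaptic eligibility trace

for t in range(T):
    p = beta * p + x_spikes[t]          # low-pass filtered input
    u = alpha * u + W @ x_spikes[t]     # leaky integration
    s = (u >= theta).astype(float)      # spike emission
    u = u - s * theta                   # soft reset after a spike

    y = G @ s                           # local readout of the layer's spikes
    err = y - target                    # local error signal

    # Three-factor, online update: error fed back through the fixed
    # readout, gated by the surrogate gradient, outer product with the
    # presynaptic trace. No state is stored across time for gradients.
    W -= lr * np.outer((G.T @ err) * surrogate_grad(u), p)
```

Because the readout matrix is fixed and random, the error signal is purely local to the layer, and the update uses only quantities available at the current time step, which is what lets the rule run online with no memory overhead.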
Identifiers
pubmed: 32477050
doi: 10.3389/fnins.2020.00424
pmc: PMC7235446
Publication types
Journal Article
Languages
eng
Pagination
424
Copyright information
Copyright © 2020 Kaiser, Mostafa and Neftci.
References
Front Neurosci. 2017 Jun 21;11:324
pubmed: 28680387
Nat Neurosci. 2010 Mar;13(3):344-52
pubmed: 20098420
Neuron. 2009 Aug 27;63(4):544-57
pubmed: 19709635
Front Neurosci. 2016 Nov 08;10:508
pubmed: 27877107
Neural Comput. 2018 Jun;30(6):1514-1541
pubmed: 29652587
Neural Comput. 2002 Nov;14(11):2531-60
pubmed: 12433288
Neural Comput. 2007 Nov;19(11):2881-912
pubmed: 17883345
Nat Neurosci. 2006 Mar;9(3):420-8
pubmed: 16474393
Nat Commun. 2016 Nov 08;7:13276
pubmed: 27824044
Science. 2014 Aug 8;345(6197):668-73
pubmed: 25104385
Front Neurosci. 2018 Aug 31;12:608
pubmed: 30233295
Neural Comput. 2006 Jun;18(6):1318-48
pubmed: 16764506
Neural Netw. 2017 Nov;95:110-133
pubmed: 28938130
Proc Natl Acad Sci U S A. 2016 Oct 11;113(41):11441-11446
pubmed: 27651489
Neuron. 2014 Feb 5;81(3):521-8
pubmed: 24507189
Neural Comput. 2011 Oct;23(10):2457-97
pubmed: 21732859
J Comput Neurosci. 2007 Dec;23(3):349-98
pubmed: 17629781
Front Neurosci. 2015 Nov 16;9:437
pubmed: 26635513