On Neural Network Kernels and the Storage Capacity Problem.
Journal
Neural Computation
ISSN: 1530-888X
Abbreviated title: Neural Comput
Country: United States
NLM ID: 9426182
Publication information
Publication date: 2022-04-15
History:
received: 2021-11-14
accepted: 2022-01-13
pubmed: 2022-03-29
medline: 2022-04-27
entrez: 2022-03-28
Status: ppublish
Abstract
In this short note, we reify the connection between work on the storage capacity problem in wide two-layer treelike neural networks and the rapidly growing body of literature on kernel limits of wide neural networks. Concretely, we observe that the "effective order parameter" studied in the statistical mechanics literature is exactly equivalent to the infinite-width neural network Gaussian process kernel. This correspondence connects the expressivity and trainability of wide two-layer neural networks.
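As an illustration of the object the abstract refers to, the following is a minimal sketch of the infinite-width neural network Gaussian process (NNGP) kernel for a single hidden layer, K(x, x') = E_{w ~ N(0, I)}[phi(w·x) phi(w·x')]. The choice of a ReLU activation, the unit-variance weight prior, and the function names are illustrative assumptions for this sketch and are not taken from the paper; for ReLU the expectation has a known closed form (the degree-1 arc-cosine kernel of Cho & Saul, 2009), which the Monte Carlo estimate can be checked against.

import numpy as np

def nngp_kernel_mc(x1, x2, n_samples=200_000, seed=None):
    # Monte Carlo estimate of the single-hidden-layer NNGP kernel
    # K(x1, x2) = E_{w ~ N(0, I)}[relu(w @ x1) * relu(w @ x2)]
    # (ReLU and unit weight variance are illustrative assumptions).
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((n_samples, x1.size))
    pre1, pre2 = w @ x1, w @ x2
    return np.mean(np.maximum(pre1, 0.0) * np.maximum(pre2, 0.0))

def nngp_kernel_relu_exact(x1, x2):
    # Closed form for the ReLU NNGP kernel (degree-1 arc-cosine kernel):
    # K = |x1| |x2| / (2*pi) * (sin(theta) + (pi - theta) * cos(theta)),
    # where theta is the angle between x1 and x2.
    n1, n2 = np.linalg.norm(x1), np.linalg.norm(x2)
    cos_theta = np.clip(x1 @ x2 / (n1 * n2), -1.0, 1.0)
    theta = np.arccos(cos_theta)
    return n1 * n2 / (2 * np.pi) * (np.sin(theta) + (np.pi - theta) * np.cos(theta))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x1, x2 = rng.standard_normal(10), rng.standard_normal(10)
    print("Monte Carlo estimate:", nngp_kernel_mc(x1, x2, seed=1))
    print("Closed form        :", nngp_kernel_relu_exact(x1, x2))

The two printed values should agree up to Monte Carlo error; in the infinite-width limit this kernel fully characterizes the network's function prior, which is the sense in which it governs both expressivity and trainability.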
Identifiers
pubmed: 35344992
pii: 110043
doi: 10.1162/neco_a_01494
Publication types
Journal Article
Research Support, Non-U.S. Gov't
Languages
eng
Citation subsets
IM
Pagination
1136-1142
Copyright information
© 2022 Massachusetts Institute of Technology.