Lower and upper bounds for numbers of linear regions of graph convolutional networks.
Keywords
Expressivity
GCNs
Graph convolutional networks
Graph neural networks
Linear regions
ReLU
Journal
Neural networks: the official journal of the International Neural Network Society
ISSN: 1879-2782
Abbreviated title: Neural Netw
Country: United States
NLM ID: 8805018
Publication information
Date of publication: Nov 2023
History:
Received: 5 Apr 2023
Revised: 30 Jul 2023
Accepted: 13 Sep 2023
MEDLINE: 13 Nov 2023
PubMed: 8 Oct 2023
Entrez: 7 Oct 2023
Status: ppublish
Abstract
Graph neural networks (GNNs) have become a popular choice for analyzing graph data in recent years, and characterizing their expressiveness has become an active area of research. One popular measure of expressiveness is the number of linear regions of a neural network with piecewise linear activations. In this paper, we estimate the number of linear regions of classic graph convolutional networks (GCNs) with ReLU activation, in both the one-layer and multi-layer settings. We derive an optimal upper bound on the maximum number of linear regions for one-layer GCNs, and upper and lower bounds for multi-layer GCNs. Our simulation results suggest that the true maximum number of linear regions is likely closer to our lower bound. Taken together, these results indicate that, per parameter, multi-layer GCNs have exponentially greater expressivity than one-layer GCNs, implying that deeper GCNs are more expressive than their shallow counterparts.
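To make the notion of linear regions concrete, the following is a minimal sketch (not from the paper; the sizes, the adjacency matrix A_hat, and the sampling scheme are illustrative assumptions). It empirically counts distinct ReLU activation patterns of a toy one-layer GCN, Z = ReLU(A_hat X W), over randomly sampled node features; each distinct sign pattern of the pre-activation corresponds to a convex region on which the map is affine, so the sampled count is a lower estimate of the true number of linear regions.

```python
import numpy as np

# Minimal, illustrative setup (sizes and A_hat are assumptions, not from
# the paper): a one-layer GCN computes Z = ReLU(A_hat @ X @ W), where
# A_hat is a fixed normalized adjacency matrix, X holds node features,
# and W holds the layer weights.
rng = np.random.default_rng(0)

n_nodes, d_in, d_out = 3, 2, 4
A_hat = np.array([[0.5, 0.5, 0.0],
                  [0.5, 0.0, 0.5],
                  [0.0, 0.5, 0.5]])      # hypothetical normalized adjacency
W = rng.standard_normal((d_in, d_out))   # random layer weights

def activation_pattern(X):
    """Sign pattern of the pre-activation A_hat @ X @ W.

    Inputs sharing a pattern lie in the same linear region of the
    piecewise linear map X -> ReLU(A_hat @ X @ W), so the number of
    distinct patterns seen over sampled inputs lower-bounds the number
    of linear regions.
    """
    pre = A_hat @ X @ W
    return tuple((pre > 0).ravel())

# Sample random feature matrices and count the distinct patterns found.
patterns = {activation_pattern(rng.standard_normal((n_nodes, d_in)))
            for _ in range(100_000)}
print("distinct activation patterns (empirical lower bound):", len(patterns))
print("trivial upper bound 2^(n_nodes * d_out):", 2 ** (n_nodes * d_out))
```

Because sampling can only find regions of positive measure, this count never overestimates; pinning down the exact maximum quickly becomes intractable as the number of nodes and features grows, which is what the paper's analytical upper and lower bounds address.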
Identifiers
pubmed: 37804743
pii: S0893-6080(23)00519-1
doi: 10.1016/j.neunet.2023.09.025
Publication types
Journal Article
Languages
eng
Citation subsets
IM
Pagination
394-404
Comments and corrections
Type: Erratum in
Copyright information
Copyright © 2023 Elsevier Ltd. All rights reserved.
Conflict of interest statement
Declaration of competing interest The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.