Graph neural networks (GNNs) have become a popular choice for analyzing graph data in recent years, and characterizing their expressiveness is an active area of research. One widely used measure of expressiveness is the number of linear regions of a neural network with piecewise linear activations. In this paper, we estimate the number of linear regions of classic graph convolutional networks (GCNs) with ReLU activations in both the one-layer and multi-layer settings. We derive a tight upper bound on the maximum number of linear regions for one-layer GCNs, and upper and lower bounds for multi-layer GCNs. Our simulations suggest that the true maximum number of linear regions is likely closer to our lower bound. These findings indicate that, per parameter, multi-layer GCNs are exponentially more expressive than one-layer GCNs, implying that deeper GCNs are more expressive than their shallow counterparts.
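To make the counted quantity concrete, here is a minimal sketch (not the paper's code; the toy graph, dimensions, and random weights are hypothetical) that lower-bounds the number of linear regions of a one-layer GCN by sampling inputs and counting distinct ReLU activation patterns, since all inputs within one linear region share the same pattern of active units.

```python
import numpy as np

# Minimal sketch: estimate a LOWER bound on the number of linear regions
# of a one-layer GCN, X |-> ReLU(A_hat X W), by sampling random inputs
# and counting the distinct ReLU activation patterns they induce.
# Each active/inactive pattern corresponds to one linear region.

rng = np.random.default_rng(0)

# Hypothetical toy graph: 4 nodes, symmetric adjacency with self-loops.
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
deg = A.sum(axis=1)
A_hat = A / np.sqrt(np.outer(deg, deg))  # D^{-1/2} A D^{-1/2} normalization

n_nodes, in_dim, out_dim = 4, 3, 5
W = rng.standard_normal((in_dim, out_dim))  # hypothetical weights

def activation_pattern(X):
    """Sign pattern of the pre-activations A_hat @ X @ W."""
    pre = A_hat @ X @ W
    return tuple((pre > 0).ravel())

# Sample random node-feature matrices and collect distinct patterns.
patterns = {activation_pattern(rng.standard_normal((n_nodes, in_dim)))
            for _ in range(50_000)}
print(f"distinct activation patterns found: {len(patterns)}")
```

Because sampling can only discover regions it happens to hit, the printed count is a lower bound on the true number of linear regions; an exact count would require enumerating the regions of the hyperplane arrangement defined by the pre-activation units.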

Source
http://dx.doi.org/10.1016/j.neunet.2023.09.025

