POOLING STRATEGIES FOR SIMPLICIAL CONVOLUTIONAL NETWORKS
Domenico Mattia Cinque, Claudio Battiloro, Paolo Di Lorenzo
DIET Department, Sapienza University of Rome, Via Eudossiana 18, 00184, Rome, Italy
E-mail: domenico.cinque98@gmail.com, {claudio.battiloro, paolo.dilorenzo}@uniroma1.it
ABSTRACT
The goal of this paper is to introduce pooling strategies for sim-
plicial convolutional neural networks. Inspired by graph pool-
ing methods, we introduce a general formulation for a simpli-
cial pooling layer that performs: i) local aggregation of sim-
plicial signals; ii) principled selection of sampling sets; iii)
downsampling and simplicial topology adaptation. The gen-
eral layer is then customized to design four different pooling
strategies (i.e., max, top-k, self-attention, and separated top-k)
grounded in the theory of topological signal processing. Also,
we leverage the proposed layers in a hierarchical architecture
that reduces complexity while representing data at different resolutions. Numerical results on real-data benchmarks (i.e., flow
and graph classification) illustrate the advantage of the pro-
posed methods with respect to the state of the art.
Index Terms—Topological signal processing, topological
deep learning, simplicial neural networks, pooling.
1. INTRODUCTION
In recent years, Graph Neural Networks (GNNs) [1–3] have
shown remarkable results in learning tasks involving data de-
fined on irregular domains (e.g., graphs), such as social net-
works, recommender systems, cybersecurity, natural language
processing, genomics, and many more [3, 4]. However, GNNs
are designed to work with graphs, which consider only pairwise
relationships between data. On the contrary, many real-world
phenomena involve multi-way relationships, as in, e.g., biological or social networks. Some recent works in topological signal processing [5, 6] have shown that multi-way relationships can be described using simplicial complexes, which are specific instances of hypergraphs endowed with a powerful algebraic representation able to model higher-order interactions among nodes. Consequently, there has also been a rising interest in the development of (deep) neural network architectures able to handle data defined on topological spaces, as summarized in the sequel.
Related works. Despite its recent emergence, the field of simplicial deep learning has already received many contributions. In
[7], the authors introduced a basic simplicial neural network (SNN) architecture that performs convolution via high-order Laplacians, without independently exploiting upper and lower neighbourhoods. In [8], message passing neural networks (MPNNs) are adapted to simplicial complexes, with the
aggregation and updating functions taking into account data de-
fined on adjacent simplices, enabling message exchange even
among signals of different orders. The work in [9] exploits
the simplicial filters introduced in [10] to design flexible and low-complexity simplicial convolutional networks (SCNs) with
spectral interpretability. Finally, in [11, 12], simplicial atten-
tional architectures are introduced.
Motivated by the fact that, in both convolutional neural networks (CNNs) and GNNs, the introduction of pooling layers has proved useful for reducing the number of model parameters while improving learning performance, in this work we aim to endow SCNs with pooling strategies. However, while for CNNs the pooling operation relies on aggregation based on the natural local neighbourhood provided by the
regular grid domain, even on simpler graph domains the defini-
tion of local patches is not straightforward. Early works tried to
overcome this issue by using graph clustering algorithms such
as GraClus [13] or spectral methods [14] to produce a node
assignment that generalizes the notion of locality present in
regular domains. The most recent trends are instead focused
on differentiable learnable operators that can learn a node as-
signment [15], or simply keep some nodes while discarding the
others [16, 17]. Other works [18] discuss the class of global
pooling methods that reduce the graph to a single vector, ignor-
ing topological information. To the best of our knowledge, no
previous works tackled the problem of pooling for SCNs.
Contribution. The goal of this work is to introduce pooling
strategies for SCNs. Taking inspiration from the select-reduce-
connect (SRC) paradigm [19], we introduce a general simpli-
cial pooling layer that comprises three steps: i) a local aggre-
gation step responsible for providing a meaningful summary of
the input signals; ii) a selection step responsible for selecting
a proper subset of simplices; finally, iii) a reduction step that
downsamples the input complex and the aggregated signals of
step i) based on the simplices selected in step ii). By tailoring
steps ii) and iii), we introduce four different simplicial pool-
ing layers that generalize the well-known graph pooling strate-
gies. Also, we exploit the proposed simplicial pooling lay-
ers in a jumping knowledge (JK) hierarchical architecture [20],
which aggregates the intermediate embeddings produced by the
simplicial pooling layers to produce the final output. Finally,
we assess the performance of the proposed methods on real-
world graph and trajectory classification tasks, showing favor-
able comparisons with respect to other techniques in terms of
performance and robustness to compression.
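The three-step layer outlined above (local aggregation, selection of a subset of simplices, reduction of signals and topology) can be sketched as follows. This is a hypothetical NumPy illustration of the generic pipeline, not the authors' implementation: the one-step Laplacian aggregation, the norm-based top-k score, and all function and variable names are illustrative assumptions.

```python
# Hypothetical sketch of the generic simplicial pooling layer
# (aggregate -> select -> reduce) on signals of a fixed order k.
import numpy as np

def simplicial_pool(X, L, ratio=0.5):
    """X: (N, F) signals on N k-simplices; L: (N, N) k-th order Laplacian."""
    # i) local aggregation: each simplex summarizes its neighbourhood
    #    (one Laplacian filtering step as an illustrative stand-in).
    H = X + L @ X
    # ii) selection: rank simplices by a simple score (L2 norm of the
    #     aggregated features) and keep the top fraction `ratio`.
    scores = np.linalg.norm(H, axis=1)
    k = max(1, int(ratio * X.shape[0]))
    keep = np.sort(np.argsort(scores)[-k:])
    # iii) reduction: downsample the aggregated signals and restrict
    #      the Laplacian to the kept simplices (topology adaptation).
    return H[keep], L[np.ix_(keep, keep)], keep

# Toy usage: 4 simplices, 2 features, identity Laplacian for simplicity.
X = np.arange(8, dtype=float).reshape(4, 2)
L = np.eye(4)
Xp, Lp, idx = simplicial_pool(X, L, ratio=0.5)
```

In an actual layer, the score of step ii) would be learnable (e.g., attention-based), and step iii) would also rebuild the incidence matrices of the coarsened complex rather than merely restricting the Laplacian.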
2. BACKGROUND
Simplicial complex and signals. Given a finite set of vertices V, a k-simplex H_k is a subset of V with cardinality k + 1. A face of H_k is a subset with cardinality k and thus a k-simplex
arXiv:2210.05490v1 [eess.SP] 11 Oct 2022