Channel Joint Attention (TCJA) to concurrently process input in both temporal and spatial dimensions, a significant advance in SNNs’ spatio-temporal feature extraction.
These studies effectively improve the performance of SNNs
by transplanting established ANNs’ modules and method-
ologies. However, applying these computational modules to SNNs from a deep-learning standpoint dilutes their fundamental biological interpretability, bringing SNNs closer to a mix of existing machine-learning concepts such as recurrent neural networks (RNNs), binary neural networks (BNNs), and quantization networks.
From a biological standpoint, some works focus on synapse models, investigating the potential of SNNs with respect to connection modes and information transmission.
(Shrestha and Orchard 2018; Fang et al. 2020a; Yu et al.
2022) integrate impulse response models with synaptic dy-
namics, hence enhancing the temporal information represen-
tation of SNNs; (Cheng et al. 2020) implements intra-layer
lateral inhibitory connections to improve the noise tolerance
of SNNs; from the standpoint of synaptic plasticity, (Bel-
lec et al. 2020; Zhang and Li 2019) introduce bio-plausible
training algorithms as an alternative to back-propagation
(BP), allowing for lower-power training. Experiments revealed that the synaptic models of SNNs leave considerable room for modification and refinement to better handle spatio-temporal data (Fang et al. 2020a). Motivated by this, we propose a Spatio-Temporal Synaptic Connection (STSC) module.
Based on the notion of spatio-temporal receptive fields,
the structural features of dendritic branches (Letellier et al.
2019) and feedforward lateral inhibition (Luo 2021) moti-
vate this study. By merging ANNs’ computational modules (temporal convolutions and attention mechanisms) with SNNs, we propose the STSC module, consisting of a Temporal Response Filter (TRF) module and a Feedforward Lateral Inhibition (FLI) module. As shown in Fig. 1, the STSC
can be attached to spatial operations to expand the spatio-
temporal receptive fields of synaptic connections, hence
facilitating the extraction of spatio-temporal features. The
main contributions of this work are summarized as follows:
• We propose STSC-SNN to implement synaptic connec-
tions with extra temporal dependencies and enhance the
SNNs’ capacity to handle temporal information. To the
best of our knowledge, this study is the first to propose
the idea of synaptic connections with spatio-temporal re-
ceptive fields in SNNs and to investigate the influence of
synaptic temporal dependencies in SNNs.
• Inspired by biological synapses, we propose two plug-
and-play blocks: Temporal Response Filter (TRF) and
Feedforward Lateral Inhibition (FLI), which perform
temporal convolution and attention operations and can
be simply implemented into deep learning frameworks
for performance improvements.
• On the neuromorphic datasets DVS128 Gesture, SHD, N-MNIST, and CIFAR10-DVS, we obtain strong results. Specifically, we achieve 92.36% accuracy on SHD with a simple fully-connected structure, a clear improvement over the 91.08% obtained with a recurrent structure, reaching performance comparable to ANNs.
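The two proposed blocks admit a compact illustration. The sketch below is a simplified NumPy rendering under stated assumptions (a shared per-channel causal kernel for TRF, and a sigmoid gate parameterized by a single hypothetical weight matrix for FLI); it is not the paper’s actual implementation, which operates inside a trained SNN:

```python
import numpy as np

def temporal_response_filter(x, kernel):
    """TRF sketch: a causal temporal convolution over the input currents.

    x: (T, C) array of synaptic currents over T timesteps;
    kernel: (K,) filter shared across channels (an assumption here).
    """
    T, _ = x.shape
    y = np.zeros_like(x)
    for t in range(T):
        for k in range(len(kernel)):
            if t - k >= 0:
                y[t] += kernel[k] * x[t - k]  # accumulate responses to past inputs
    return y

def feedforward_lateral_inhibition(x, w):
    """FLI sketch: a sigmoid gate computed from the same input
    multiplicatively modulates the current (attention-style gating).

    w: (C, C) hypothetical gating weights.
    """
    gate = 1.0 / (1.0 + np.exp(-(x @ w)))  # sigmoid gate in (0, 1)
    return x * gate
```

With a length-1 identity kernel the TRF passes currents through unchanged; longer kernels let each output depend on a window of past inputs, which is what expands the temporal receptive field of the connection.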
Related Work
Learning algorithms for SNNs
In recent years, many works have explored the learning al-
gorithms of SNNs, which can be generally categorized as bi-
ologically inspired approaches (Diehl and Cook 2015; Bel-
lec et al. 2020; Zhang and Li 2019), ANN-to-SNN conver-
sion methods (Orchard et al. 2015; Sengupta et al. 2019;
Han, Srinivasan, and Roy 2020), and surrogate-based di-
rect training methods (Wu et al. 2018; Neftci, Mostafa,
and Zenke 2019; Fang et al. 2021b). Direct training meth-
ods utilize surrogate gradients to tackle the issue of non-
differentiable spike activity (Wu et al. 2018), allowing error back-propagation (BP) through time to apply gradient descent directly to SNNs for training. These BP-based methods show strong potential to achieve high accuracy in a few timesteps by making full use of spatio-temporal information (Wu et al. 2019; Fang et al. 2021b). However, more research is required on how to better extract spatio-temporal features from spatio-temporal data; this is the gap our work addresses.
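A minimal sketch of the surrogate-gradient idea, assuming a sigmoid-shaped surrogate with sharpness `alpha` (the cited works use various surrogate shapes): the forward pass keeps the non-differentiable step, while the backward pass substitutes a smooth derivative.

```python
import numpy as np

def spike_forward(v, v_th=1.0):
    """Forward pass: non-differentiable Heaviside step on membrane potential."""
    return (v >= v_th).astype(np.float64)

def spike_surrogate_grad(v, v_th=1.0, alpha=1.0):
    """Backward pass: the step's zero-almost-everywhere derivative is
    replaced by the derivative of a sigmoid centered at the threshold,
    so gradients can flow through spike times during BP."""
    s = 1.0 / (1.0 + np.exp(-alpha * (v - v_th)))
    return alpha * s * (1.0 - s)
```

The surrogate peaks at the threshold (with value alpha/4) and decays away from it, so neurons close to firing receive the largest gradient signal.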
Attention Modules in SNNs
The attention mechanism distributes attention preferentially
to the most informative input components, which could
be interpreted as the sensitivity of various inputs. The SE
block (Hu, Shen, and Sun 2018) offers an efficient atten-
tion approach to improve representations in ANNs. (Xie
et al. 2016; Kundu et al. 2021) introduced spatial-wise at-
tention in SNNs; then, TA-SNN (Yao et al. 2021) devel-
oped a temporal-wise attention mechanism in SNNs by as-
signing attention factors to each input frame; more recently, TCJA (Zhu et al. 2022) added a channel-wise attention module and proposed temporal-channel joint attention.
These studies demonstrate the usefulness of attention mech-
anisms in SNNs by achieving state-of-the-art results on var-
ious datasets. Moreover, based on these investigations, it is
desirable to study other correlations between the attention
mechanism and the biological nature of SNNs, which is the
objective of our research. We employ the attention module as a feedforward lateral inhibitory connection (Luo 2021), which provides a gating mechanism for the synapse model and enables nonlinear computation at the synapse.
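The frame-wise attention idea of TA-SNN can be sketched as a squeeze-and-excitation pattern along the time axis; the layer sizes and the mean-pooling squeeze below are illustrative assumptions, not the cited architecture:

```python
import numpy as np

def temporal_attention(x, w1, w2):
    """Temporal-wise attention sketch: score each input frame and
    rescale it, so informative frames dominate downstream processing.

    x: (T, C) frames; w1: (T, H) and w2: (H, T) hypothetical weights.
    """
    s = x.mean(axis=1)                        # squeeze: one statistic per frame
    h = np.maximum(s @ w1, 0.0)               # excitation: hidden layer + ReLU
    scores = 1.0 / (1.0 + np.exp(-(h @ w2)))  # sigmoid attention factor per frame
    return x * scores[:, None]                # reweight each input frame
```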
Synaptic Models in SNNs
As one of the fundamental components of SNNs, the synaptic model has drawn the interest of several researchers.
(Shrestha and Orchard 2018; Fang et al. 2020a; Yu et al.
2022) established temporal relationships between response
post-synaptic currents and input pre-synaptic spikes, thereby improving temporal expressiveness. These temporal relationships extend fully-connected synapses, which assume only a single connection between two neurons. Nevertheless, synaptic
connections are often complex, and there are typically many
paths connecting the axons and dendrites of neurons (Luo