1 Introduction
With experimental advances in the simultaneous recording of large populations of neurons
[1, 2], the mathematical understanding of high dimensional nonlinear neural networks is
of increasing interest [3, 4]. Ideally one would like to be able to relate the dynamical to
the structural properties of the network and vice versa. However, these relations are easily
obscured by the complexity present in many biologically realistic network models. Models
that are tractable, simple and yet rich in their dynamical scope offer frameworks in which
the structure-to-function interdependence can be exactly formulated. They allow us to
understand the limits and possibilities of network dynamics in more general systems.
In this article we focus on the existence of stable equilibrium solutions and the associ-
ated constraints on the structure of network connectivity. The presence of multiple stable
equilibria, which can be thought of as stored memory patterns, critically depends on the
nonlinear activation function which maps neuronal inputs to outputs [5–8]. In our recurrent
neural circuit model we simplify the analysis by setting the neural activation function to be
a Heaviside step function. In this infinite gain limit, the continuous-time Hopfield model [9]
and related circuit models [10–12] become what are known as Glass networks [13]. These
were originally developed by Glass [14] to study Boolean networks with discrete switching
dynamics, and in later years have been used to model both networks of neurons [15,16] and
genes [17]. These studies have shown that, even though the nonlinear dynamics is restricted
to switching manifolds in phase space, the system exhibits complex dynamics such as steady
states, limit cycles and chaos [12, 16, 18]. This suggests that the model offers a computational advantage: the nonlinear effects can be studied in a discrete and local manner without sacrificing dynamical richness.
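To make the switching dynamics concrete, the following minimal sketch simulates a Glass network of the common form ẋ = −x + W H(x − θ), where H is the Heaviside step. The two-unit network, its weights and its threshold are our own illustrative choices, not a model from the cited works; with mutual excitation the system exhibits two stable equilibria, one per linear region.

```python
import numpy as np

def simulate_glass(W, theta, x0, dt=0.01, steps=2000):
    """Euler-integrate dx/dt = -x + W @ H(x - theta),
    with H the Heaviside step (the infinite-gain limit of
    a sigmoidal activation)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        h = (x > theta).astype(float)  # Boolean unit outputs
        x += dt * (-x + W @ h)         # linear flow within each region
    return x

# Illustrative mutually excitatory pair with threshold 0.5.
W = np.array([[0.0, 1.2],
              [1.2, 0.0]])
theta = 0.5

high = simulate_glass(W, theta, [1.0, 1.0])  # settles near [1.2, 1.2]
low = simulate_glass(W, theta, [0.1, 0.1])   # settles near [0.0, 0.0]
```

Within each region of phase space the flow is linear, so the candidate equilibrium of a region is computed in closed form (here W applied to the region's Boolean pattern); an equilibrium is admissible only if it lies inside the region that produced it, which is the kind of self-consistency condition the connectivity must satisfy.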
Although the step function is a natural limit of the smooth sigmoidal activation present in many models, the nonsmooth activation functions that have predominantly been studied in the context of stability are linear with rectification [19], and hence lack any saturation. In this case, conditions for multistability and for global stability, expressed as constraints on the symmetric weight matrix, were derived by Hahnloser et al. [20]. These were
expanded upon by many others, leading to exact results on the perturbative, topological and
dynamical properties of threshold-linear networks (TLNs) [21–28]. Recently the structure
to function relation of the network was also explored by geometric analysis [29,30].
Nevertheless, Glass networks are a fundamental class of threshold-activated networks, in the sense that all the nonlinear switching dynamics of this model class is fully captured by the Glass network. They are the simplest choice for studying the nonlinear properties of
continuous-time neural network dynamics while keeping the benefit of local linear behaviour.
In our work we study the impact of the network connectivity on the multistable character
of network solutions in Glass networks. Apart from a non-vanishing output constraint, no a
priori assumptions are made on the connection weights of the network. Our aim is to keep
the discussion as general as possible.