A Step Towards Uncovering The Structure of
Multistable Neural Networks
Magnus Tournoy¹,² and Brent Doiron¹,²
¹Departments of Neurobiology and Statistics, University of Chicago, Chicago, IL, USA
²Grossman Center for Quantitative Biology and Human Behavior, University of Chicago, Chicago, IL, USA
Abstract
We study how the connectivity within a recurrent neural network determines and is determined by the multistable solutions of network activity. To gain analytic tractability we let neural activation be a non-smooth Heaviside step function. This nonlinearity partitions the phase space into regions with different, yet linear dynamics. In each region either a stable equilibrium state exists, or network activity flows out of the region. The stable states are identified by the semipositivity constraints they impose on the synaptic weight matrix. These restrictions can be separated into those acting on the signs and those acting on the strengths of the connections. Exact results on network topology, sign stability, weight matrix factorization, pattern completion and pattern coupling are derived and proven. Our work may lay the foundation for the study of multistability in more complex recurrent neural networks.
e-mails: tournoy@uchicago.edu, bdoiron@uchicago.edu
arXiv:2210.03241v2 [cs.NE] 7 Mar 2023
1 Introduction
With experimental advances in the simultaneous recording of large populations of neurons
[1, 2], the mathematical understanding of high dimensional nonlinear neural networks is
of increasing interest [3, 4]. Ideally one would like to relate the dynamical properties of the network to its structural properties and vice versa. However, these relations are easily obscured by the complexity present in many biologically realistic network models. Models that are tractable, simple and yet rich in their dynamical scope offer frameworks in which the structure-to-function interdependence can be exactly formulated. They allow us to understand the limits and possibilities of network dynamics in more general systems.
In this article we focus on the existence of stable equilibrium solutions and the associated constraints on the structure of network connectivity. The presence of multiple stable
equilibria, which can be thought of as stored/memorized patterns, critically depends on the
nonlinear activation function which maps neuronal inputs to outputs [5–8]. In our recurrent
neural circuit model we simplify the analysis by setting the neural activation function to be
a Heaviside step function. In this infinite gain limit, the continuous-time Hopfield model [9]
and related circuit models [10–12] become what are known as Glass networks [13]. These
were originally developed by Glass [14] to study Boolean networks with discrete switching
dynamics, and in later years have been used to model both networks of neurons [15,16] and
genes [17]. These studies have shown that, even though the nonlinear dynamics is restricted
to switching manifolds in phase space, the system exhibits complex dynamics such as steady
states, limit cycles and chaos [12, 16, 18]. This suggests that the model offers a computational advantage: it allows us to study nonlinear effects in a discrete and local manner without sacrificing dynamical richness or computational behavior.
Although the step function is a natural limit of the smooth sigmoidal activation present in many models, the nonsmooth activation functions that have been predominantly studied in the context of stability are linear with a rectification [19], and hence lack any saturation. In this case, the conditions for multistability and global stability, in terms of constraints on the symmetric weight matrix, were derived by Hahnloser et al. [20]. These were expanded upon by many others, leading to exact results on the perturbative, topological and dynamical properties of threshold-linear networks (TLNs) [21–28]. Recently the structure-to-function relation of the network was also explored through geometric analysis [29, 30].
Nevertheless, Glass networks are a fundamental class of threshold-activated networks, in the sense that all the nonlinear switching dynamics of this model class is fully captured by the Glass network. They are the simplest choice for studying the nonlinear properties of continuous-time neural network dynamics while keeping the benefit of local linear behaviour. In our work we study the impact of network connectivity on the multistable character of network solutions in Glass networks. Apart from a non-vanishing output constraint, no a priori assumptions are made on the connection weights of the network. Our aim is to keep the discussion as general as possible.
The article is organized as follows:
In Section 2 we define the Glass network model and the relevant mathematical objects. Next come the main theorems, which are organized into three parts.
In Section 3 we are concerned with the existence of stable steady states and the associated constraints on the weight matrix. We show that the presence of stable states imposes
semipositivity constraints on the synaptic weight matrix. The consequences hereof can be
divided into those that result from restrictions on the signs of the connections, i.e. the configurations/topology of the network, and those that restrict the weights, i.e. the competition
vs. cooperation between neurons.
In Section 4 we focus on the consequences of the stable state condition on the configurations of the network. The matrix sign pattern classes that are necessary or sufficient for
the existence of stable states are given and their consequences for network configurations
are derived. The existence of sign stable states, i.e. states which are required to be stable by the sign pattern, exemplifies how neural networks can achieve stability independent of synaptic strengths and hence solely through their topology. Within the context of our network we prove that sign stable states are always minimally stable, i.e. have no stable substates.
In Section 5 we cover three distinct algebraic properties of Glass networks with multiple
stable steady states: weight matrix factorization, stable state (de)composition and stable
state coupling.
In Subsection 5.1 we formulate how weight matrices of networks with stable states
are identified by a unique semipositive matrix factorization. This factorization is a
consequence of the geometrical properties of semipositive matrices, i.e. they are maps
between proper polyhedral cones [31].
In Subsection 5.2 we give the necessary and sufficient conditions the connection strengths
must satisfy in order for stable states to be (de)composable into stable (micro)macrostates.
These results are of importance for the pattern completing capabilities of the network.
In Subsection 5.3 we show that the (de)composition theorems turn out to be derivable
from a more general state coupling theorem that is a consequence of the Boolean logical
structure of the Glass network. The theorem is of importance for the storage capacities
of the network.
In Section 6 we conclude.
2 Glass Networks
We define the nonlinear neural network¹

$$\dot{x}^i = -x^i + W^i{}_j\,\theta(x^j). \qquad (2.1)$$

The index $i = 1, \ldots, n$ runs over neural units, e.g. individual neurons or neuronal assemblies, that are parametrized by the variables $x^i \in \mathbb{R}$. The synaptic weights $w_{ij}$ are quantified by the matrix operator $W^i{}_j$. Throughout the paper we use Einstein notation, meaning that indices appearing both "up" and "down" imply a summation of the corresponding components. The Heaviside step function

$$\theta(x^i) = \begin{cases} 1 & \text{for } x^i > 0 \\ 0 & \text{for } x^i \leq 0 \end{cases} \qquad (2.2)$$
partitions the dynamics for every $i \in [n]$ into two regions. The whole phase space therefore becomes a set of $2^n$ dynamically distinct orthants. From a set-theoretic perspective the Heaviside step function is selecting a subset $\alpha \subseteq [n]$ for which the units are "active". We can hence use $\alpha$ as an index to construct the binary codes

$$p^i_\alpha \equiv \begin{cases} 1 & \text{for } i \in \alpha \\ 0 & \text{for } i \notin \alpha. \end{cases} \qquad (2.3)$$

The parts of the partition themselves can then be defined in terms of these codes

$$P_\alpha \equiv \{\, x^i \;|\; \theta(x^i) = p^i_\alpha \,\}. \qquad (2.4)$$
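Inside each part $P_\alpha$ the code $\theta(x) = p_\alpha$ is constant, so the flow is linear, $\dot{x}^i = -x^i + W^i{}_j\,p^j_\alpha$, and trajectories relax exponentially toward the candidate equilibrium $W p_\alpha$ until, possibly, they cross a switching boundary. The following minimal Python sketch (our illustration, not code accompanying the paper; the function names are hypothetical) integrates (2.1) with a forward Euler step and reads off the code of the orthant the state occupies, using the weight matrix of Example 1 below.

```python
import numpy as np

def theta(x):
    """Heaviside step function of eq. (2.2): 1 for x > 0, else 0."""
    return (x > 0).astype(float)

def simulate(W, x0, dt=0.01, steps=5000):
    """Forward-Euler integration of eq. (2.1): dx/dt = -x + W @ theta(x)."""
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x += dt * (-x + W @ theta(x))
    return x

# Weight matrix of Example 1 below. The trajectory starts in the part P_{1},
# switches into P_{1,2} and settles at the stable state W @ p_{1,2} = (5, 5).
W = np.array([[1.0, 4.0],
              [2.0, 3.0]])
x = simulate(W, x0=[0.5, -0.5])
print(theta(x), x)  # code p_{1,2} = [1. 1.], state close to [5. 5.]
```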
The following two conditions will be imposed on the network:
Embedding. It is always possible to include an external input $\mu$ to the system by having it embedded as a feedforward unit in the network

$$W = \begin{pmatrix} w^i{}_j & \mu^i \\ 0 & 1 \end{pmatrix}. \qquad (2.5)$$

The dynamics then takes place on the hyperplane $x^n = 1$. One can therefore restrict the analysis to this subset of the phase space.

Constraint. We will assume that at any moment in time the network is producing some output, be it by synaptic or external activation. This requires that for

Vanishing external input:

$$W^i{}_j\,\theta(x^j) \neq 0 \quad \forall\, x \in \mathbb{R}^n_+ \setminus \{0\}; \qquad (2.6)$$

Nonvanishing external input:

$$W^i{}_j\,\theta(x^j) \neq 0 \quad \forall\, x \in \{\, \mathbb{R}^n_+ \;|\; x^n = 1 \,\}. \qquad (2.7)$$

The nonzero output of the units will drive the system away from the boundaries between the parts, which is where some of the units are silent. As a consequence stable states will lie within the interior of the parts. We still allow for the completely "inactive" state, i.e. $p = 0$, to settle in the origin in the case of vanishing external input. In the case of an embedded external current the constraint is restricted to the hyperplane $x^n = 1$.

¹In Appendix A the equivalence of nonlinear inputs $\theta(x^i)$ and nonlinear outputs $\theta(W^i{}_j x^j)$ in the context of our work is explained.
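Because the constraint depends on $x$ only through the finitely many codes $p_\alpha$, it can be verified by enumeration. The following minimal sketch (again our illustration, with a hypothetical helper name) performs this check: it exempts $p = 0$ in the vanishing-input case (2.6) and keeps only codes with $p^n = 1$ in the embedded case (2.7).

```python
import numpy as np
from itertools import product

def violates_constraint(W, embedded=False):
    """List all codes p_alpha with W @ p_alpha = 0, i.e. constraint violations.

    embedded=False checks eq. (2.6): all nonzero codes.
    embedded=True  checks eq. (2.7): only codes with p^n = 1.
    """
    n = W.shape[0]
    bad = []
    for bits in product([0, 1], repeat=n):
        p = np.array(bits, dtype=float)
        if not embedded and not any(bits):
            continue                    # the all-silent state p = 0 is exempt
        if embedded and bits[-1] != 1:
            continue                    # dynamics is restricted to x^n = 1
        if np.all(W @ p == 0):
            bad.append(bits)
    return bad                          # an empty list means the constraint holds
```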
Example 1. Suppose we have a 2-dim network with vanishing external input $\mu = 0$ and the following values for the weight matrix

$$W = \begin{pmatrix} 1 & 4 \\ 2 & 3 \end{pmatrix}. \qquad (2.8)$$

The quadrants of the 2-dim phase space are identified by the four index sets

$$\emptyset, \ \{1\}, \ \{2\}, \ \{1,2\}. \qquad (2.9)$$

They are all accompanied by their respective codes

$$p_\emptyset = \begin{pmatrix} 0 \\ 0 \end{pmatrix}, \quad p_{\{1\}} = \begin{pmatrix} 1 \\ 0 \end{pmatrix}, \quad p_{\{2\}} = \begin{pmatrix} 0 \\ 1 \end{pmatrix}, \quad p_{\{1,2\}} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}. \qquad (2.10)$$

Since

$$W p_{\{1\}} = \begin{pmatrix} 1 \\ 2 \end{pmatrix}, \quad W p_{\{2\}} = \begin{pmatrix} 4 \\ 3 \end{pmatrix}, \quad W p_{\{1,2\}} = \begin{pmatrix} 5 \\ 5 \end{pmatrix} \qquad (2.11)$$

are all nonzero, the constraint in (2.6) is satisfied.
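For illustration, the violates_constraint sketch above reproduces this check:

```python
W = np.array([[1.0, 4.0],
              [2.0, 3.0]])        # eq. (2.8)
print(violates_constraint(W))    # [] -- no admissible code maps to 0, so (2.6) holds
```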
Example 2. Suppose we have a 3-dim network with one of the units functioning as an external input

$$W = \begin{pmatrix} -2 & 0 & 1 \\ 0 & -2 & 1 \\ 0 & 0 & 1 \end{pmatrix}. \qquad (2.12)$$

Because the third unit is active but fixed by the external source, the dynamics is constrained to the orthants where $x^3 > 0$. These parts correspond to the index sets

$$\{3\}, \ \{1,3\}, \ \{2,3\}, \ \{1,2,3\} \qquad (2.13)$$

with codes

$$p_{\{3\}} = \begin{pmatrix} 0 \\ 0 \\ 1 \end{pmatrix}, \quad p_{\{1,3\}} = \begin{pmatrix} 1 \\ 0 \\ 1 \end{pmatrix}, \quad p_{\{2,3\}} = \begin{pmatrix} 0 \\ 1 \\ 1 \end{pmatrix}, \quad p_{\{1,2,3\}} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}. \qquad (2.14)$$

Again one can easily check that the constraint in (2.7) is satisfied:

$$W p_{\{3\}} = \begin{pmatrix} 1 \\ 1 \\ 1 \end{pmatrix}, \quad W p_{\{1,3\}} = \begin{pmatrix} -1 \\ 1 \\ 1 \end{pmatrix}, \quad W p_{\{2,3\}} = \begin{pmatrix} 1 \\ -1 \\ 1 \end{pmatrix}, \quad W p_{\{1,2,3\}} = \begin{pmatrix} -1 \\ -1 \\ 1 \end{pmatrix}. \qquad (2.15)$$
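The embedded variant of the same sketch confirms this as well:

```python
W = np.array([[-2.0,  0.0, 1.0],
              [ 0.0, -2.0, 1.0],
              [ 0.0,  0.0, 1.0]])             # eq. (2.12)
print(violates_constraint(W, embedded=True))  # [] -- (2.7) holds on x^3 = 1
```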