TANGENT BUNDLE FILTERS AND NEURAL NETWORKS:
FROM MANIFOLDS TO CELLULAR SHEAVES AND BACK
C. Battiloro1,2, Z. Wang1, H. Riess3, P. Di Lorenzo2, A. Ribeiro1
1ESE Department, University of Pennsylvania, Philadelphia, USA
2DIET Department, Sapienza University of Rome, Rome, Italy
3ECE Department, Duke University, Durham, USA
E-mail: claudio.battiloro@uniroma1.it, zhiyangw@seas.upenn.edu
ABSTRACT
In this work we introduce a convolution operation over the tangent
bundle of Riemannian manifolds exploiting the Connection Lapla-
cian operator. We use the convolution to define tangent bundle fil-
ters and tangent bundle neural networks (TNNs), novel continuous
architectures operating on tangent bundle signals, i.e. vector fields
over manifolds. We discretize TNNs both in space and time do-
mains, showing that their discrete counterpart is a principled vari-
ant of the recently introduced Sheaf Neural Networks. We formally
prove that this discrete architecture converges to the underlying con-
tinuous TNN. We numerically evaluate the effectiveness of the pro-
posed architecture on a denoising task of a tangent vector field over
the unit 2-sphere.
Index Terms— Geometric Deep Learning, Tangent Bundle Signal Processing, Tangent Bundle Neural Networks, Cellular Sheaves
1. INTRODUCTION
The success of deep learning is largely the success of Convolutional Neural Networks (CNNs) [1]. CNNs have achieved impressive performance in a wide range of applications while showing good generalization ability. One key attribute (though not the only one) is that their convolutional filters, built on shift operators in the space domain, satisfy the property of shift equivariance. Nowadays, data defined on
irregular (non-Euclidean) domains are pervasive, with applications
ranging from detection and recommendation in social networks pro-
cessing [2], to resource allocations over wireless networks [3], or
point clouds for shape segmentation [4], just to name a few. For
this reason, the notions of shifts in CNNs have been adapted to
convolutional architectures on graphs (GNNs) [5, 6] as well as a
plethora of other structures, e.g. simplicial complexes [7–10], cell
complexes [11, 12], and manifolds [13]. In [14], a framework for al-
gebraic neural networks has been proposed exploiting commutative
algebras. In this work we focus on tangent bundles, a formal tool for
describing and processing vector fields on manifolds, which are key
elements in tasks such as robot navigation or flocking modeling.
Related Works. The renowned manifold assumption states that high
dimensional data examples are sampled from a low-dimensional
Riemannian manifold. This assumption is the fundamental building block of manifold learning, a class of methods for non-linear dimensionality reduction. Some of these methods approximate manifolds with k-NN or geometric graphs built on sampled points: for a fine enough sampling resolution, the graph Laplacian of the approximating graph "converges" to the Laplace-Beltrami operator of the manifold [15]. These techniques rely on the eigenvalues and eigenvectors of the graph Laplacian [16], and they give rise to a novel
perspective on manifold learning. In particular, the above approx-
imation leads to important transferability results of graph neural
networks (GNNs) [17,18], as well as to the introduction of Graphon
and Manifold Neural Networks, continuous architectures shown to
be limit objects of GNNs [19, 20]. However, most of the previ-
ous works focus on scalar signals, e.g. one or more scalar values
attached to each node of graphs or point of manifolds; recent devel-
opments [21] show that processing vector data defined on tangent
bundles of manifolds or discrete vector bundles [22, 23] comes with
a series of benefits. Moreover, the work in [24] proves that it is
possible to approximate both manifolds and their tangent bundles
with certain cellular sheaves obtained from a point cloud via k-NN
and Local PCA, such that, for a fine enough sampling resolution,
the Sheaf Laplacian of the approximating sheaf “converges” to the
Connection Laplacian operator. Finally, the work in [25] generalizes the result of [24] by proving the spectral convergence of a large class of Laplacian operators via the principal bundle setup.
Contributions. In this work we define a convolution operation over
the tangent bundles of Riemannian manifolds with the Connection
Laplacian operator. Our definition is consistent, i.e. it reduces to
manifold convolution [19] in the one-dimensional bundle case, and
to the standard convolution if the manifold is the real line. We intro-
duce tangent bundle convolutional filters to process tangent bundle
signals (i.e. vector fields over manifolds), we define a frequency
representation for them and, by cascading layers consisting of tan-
gent bundle filters banks and nonlinearities, we introduce Tangent
Bundle Neural Networks (TNNs). We then discretize the TNNs in
the space domain by sampling points on the manifold and building a cellular sheaf [26] representing a legitimate approximation of both
the manifold and its tangent bundle [24]. We formally prove that
the discretized architecture over the cellular sheaf converges to the
underlying TNN as the number of sampled points increases. More-
over, we further discretize the architecture in the time domain by
sampling the filter impulse function in discrete and finite time steps,
showing that space-time discretized TNNs are a principled variant
of the very recently introduced Sheaf Neural Networks [23, 27, 28],
discrete architectures operating on cellular sheaves and generalizing
graph neural networks. Finally, we numerically evaluate the perfor-
mance of TNNs on a denoising task of a tangent vector field of the
unit 2-sphere.
Paper Outline. The paper is organized as follows. We start with
some preliminary concepts in Section 2. We define the tangent bun-
dle convolution and filters in Section 3, and Tangent Bundle Neural
Networks (TNNs) in Section 4. In Section 5, we discretize TNNs in
space and time domains, showing that discretized TNNs are Sheaf
Neural Networks and proving the convergence result. Numerical re-
sults are in Section 6 and conclusions are in Section 7.
arXiv:2210.15058v3 [eess.SP] 18 Nov 2022
2. PRELIMINARY DEFINITIONS
Manifolds and Tangent Bundles. We consider a compact and smooth $d$-dimensional manifold $\mathcal{M}$ isometrically embedded in $\mathbb{R}^p$. Each point $x \in \mathcal{M}$ is endowed with a $d$-dimensional tangent (vector) space $T_x\mathcal{M} \cong \mathbb{R}^d$; $v \in T_x\mathcal{M}$ is said to be a tangent vector at $x$ and can be seen as the velocity vector of a curve over $\mathcal{M}$ passing through the point $x$ (formal definitions can be found in [29]). The disjoint union of the tangent spaces is called the tangent bundle $T\mathcal{M} = \bigsqcup_{x \in \mathcal{M}} T_x\mathcal{M}$. The embedding induces a Riemannian structure on $\mathcal{M}$; in particular, it equips each tangent space $T_x\mathcal{M}$ with an inner product, called the Riemannian metric, given, for each $v, w \in T_x\mathcal{M}$, by

$$\langle v, w \rangle_{T_x\mathcal{M}} = i_* v \cdot i_* w, \quad (1)$$

where $i_* v \in T_x\mathbb{R}^p$ is the embedding of $v \in T_x\mathcal{M}$ in $T_x\mathbb{R}^p \cong \mathbb{R}^p$ (whose image $i_*(T_x\mathcal{M})$ is the $d$-dimensional subspace of $\mathbb{R}^p$ corresponding to $T_x\mathcal{M}$), with $i_*$ being an injective linear mapping referred to as the differential [29], and $\cdot$ is the dot product. The Riemannian metric also induces a probability measure $\mu$ over the manifold.
Tangent Bundle Signals. A tangent bundle signal is a vector field over the manifold, i.e. a mapping $F : \mathcal{M} \to T\mathcal{M}$ that associates to each point of the manifold a vector in the corresponding tangent space. An inner product for tangent bundle signals $F$ and $G$ is

$$\langle F, G \rangle_{T\mathcal{M}} = \int_{\mathcal{M}} \langle F(x), G(x) \rangle_{T_x\mathcal{M}} \, d\mu(x), \quad (2)$$

and the induced norm is $\|F\|^2_{T\mathcal{M}} = \langle F, F \rangle_{T\mathcal{M}}$. We denote with $L^2(T\mathcal{M})$ the Hilbert space of finite-energy (w.r.t. $\|\cdot\|_{T\mathcal{M}}$) tangent bundle signals. In the following we denote $\langle \cdot, \cdot \rangle_{T\mathcal{M}}$ with $\langle \cdot, \cdot \rangle$ when there is no risk of confusion.
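Although not part of the paper, a quick numerical reading of (2) helps: if the manifold is sampled at $n$ points from $\mu$ and tangent vectors are expressed in $d$-dimensional local coordinates, the integral reduces to a Monte Carlo average of pointwise dot products. A minimal sketch (array layout and function names are our assumptions):

```python
import numpy as np

def bundle_inner_product(F, G):
    """Monte Carlo approximation of <F, G>_TM in (2).

    F, G: (n, d) arrays of tangent vectors at n sampled points, each row
    expressed in d-dimensional coordinates of its tangent space.
    Assumes the points are sampled from the uniform measure mu.
    """
    n = F.shape[0]
    return np.sum(F * G) / n  # (1/n) * sum_i <F(x_i), G(x_i)>_{T_x M}

def bundle_norm(F):
    """Induced norm ||F||_TM = sqrt(<F, F>_TM)."""
    return np.sqrt(bundle_inner_product(F, F))

# Example: a random tangent bundle signal on n = 100 points with d = 2
rng = np.random.default_rng(0)
F = rng.standard_normal((100, 2))
```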
Connection Laplacian. The Connection Laplacian is a (second-order) operator $\Delta : L^2(T\mathcal{M}) \to L^2(T\mathcal{M})$, given by the trace of the second covariant derivative defined (for this work) via the Levi-Civita connection [24]. The Connection Laplacian has some desirable properties: it is negative semidefinite, self-adjoint and elliptic. It characterizes the heat diffusion equation

$$\frac{\partial U(x,t)}{\partial t} - \Delta U(x,t) = 0, \quad (3)$$

where $U : \mathcal{M} \times \mathbb{R}^+_0 \to T\mathcal{M}$ and $U(\cdot, t) \in L^2(T\mathcal{M})$ for all $t \in \mathbb{R}^+_0$ (see [21] for a simple interpretation of (3)). With the initial condition set as $U(x, 0) = F(x)$, the solution of (3) is given by

$$U(x,t) = e^{t\Delta} F(x), \quad (4)$$

which provides a way to construct the tangent bundle convolution, as explained in the following section. The Connection Laplacian has a negative spectrum $\{(-\lambda_i, \phi_i)\}_{i=1}^{\infty}$ with eigenvalues $-\lambda_i$ and corresponding eigenvector fields $\phi_i$ satisfying

$$\Delta \phi_i = -\lambda_i \phi_i, \quad (5)$$

with $0 < \lambda_1 \le \lambda_2 \le \dots$. The $\lambda_i$s and the $\phi_i$s can be interpreted as the canonical frequencies and oscillation modes of $T\mathcal{M}$.
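To make the diffusion concrete, consider a discrete analogue (our illustrative sketch, not the paper's construction): replacing $\Delta$ with any symmetric negative-semidefinite matrix $L$, e.g. a toy stand-in for a sheaf Laplacian, turns the solution (4) into a matrix exponential, and the diffusion never increases the signal's energy.

```python
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)

# Toy symmetric negative-semidefinite "Laplacian": L = -B B^T
B = rng.standard_normal((6, 8))
L = -B @ B.T

F = rng.standard_normal(6)           # initial condition U(x, 0) = F(x)
U = lambda t: expm(t * L) @ F        # closed-form solution (4)

# Heat diffusion is a contraction: the energy decays as t grows
energies = [np.linalg.norm(U(t)) for t in (0.0, 0.5, 1.0, 2.0)]
```

The negative-semidefinite sign convention matches (3): all eigenvalues of $e^{tL}$ lie in $(0, 1]$, so high-frequency components are damped fastest.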
3. TANGENT BUNDLE CONVOLUTIONAL FILTERS
In this section we define the tangent bundle convolution of a filter impulse response $\tilde{h}$ and a tangent bundle signal $F$.

Definition 1. (Tangent Bundle Filter) Let $\tilde{h} : \mathbb{R}^+ \to \mathbb{R}$ and let $F \in L^2(T\mathcal{M})$ be a tangent bundle signal. The tangent bundle filter with impulse response $\tilde{h}$, denoted with $h$, is given by

$$G(x) = hF(x) := \tilde{h} \star_{T\mathcal{M}} F = \int_0^{\infty} \tilde{h}(t) U(x,t) \, dt, \quad (6)$$

where $U(x,t)$ is the solution of the heat equation in (3) with $U(x,0) = F(x)$. Injecting (4) into (6), we obtain

$$G(x) = hF(x) = \int_0^{\infty} \tilde{h}(t) e^{t\Delta} F(x) \, dt = h(\Delta) F(x). \quad (7)$$
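Anticipating the time discretization of Section 5, the integral in (7) can be approximated by a quadrature over finitely many time samples. The sketch below is our illustration, with a toy symmetric negative-semidefinite matrix standing in for $\Delta$; for $\tilde{h}(t) = e^{-t}$ the filter has the closed form $(I - L)^{-1}$, which the quadrature should match.

```python
import numpy as np
from scipy.linalg import expm

def bundle_filter(h_tilde, L, F, T=128, t_max=8.0):
    """Midpoint-rule approximation of G = int_0^oo h~(t) e^{tL} F dt, cf. (7).

    h_tilde: impulse response, a function of t >= 0
    L:       symmetric negative-semidefinite matrix standing in for Delta
    F:       discretized tangent bundle signal (coordinate vector)
    """
    dt = t_max / T
    ts = dt * (np.arange(T) + 0.5)           # midpoint quadrature nodes
    return sum(h_tilde(t) * (expm(t * L) @ F) * dt for t in ts)

rng = np.random.default_rng(2)
B = rng.standard_normal((5, 7))
L = -B @ B.T
L /= np.abs(np.linalg.eigvalsh(L)).max()     # rescale spectrum into [-1, 0]

F = rng.standard_normal(5)
G = bundle_filter(lambda t: np.exp(-t), L, F)   # h~(t) = e^{-t}
```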
The convolution in Definition 1 is consistent, i.e. it generalizes the manifold convolution [19] and the standard convolution in Euclidean domains (see Appendix A.4). The frequency representation $\hat{F}$ of $F$ can be obtained by projecting $F$ onto the basis $\{\phi_i\}_i$:

$$\hat{F}_i = \langle F, \phi_i \rangle = \int_{\mathcal{M}} \langle F(x), \phi_i(x) \rangle_{T_x\mathcal{M}} \, d\mu(x). \quad (8)$$
Definition 2. (Bandlimited Tangent Bundle Signals) A tangent bundle signal is said to be $\lambda_M$-bandlimited with $\lambda_M > 0$ if $\hat{F}_i = 0$ for all $i$ such that $\lambda_i > \lambda_M$.

Proposition 1. Given a tangent bundle signal $F$ and a tangent bundle filter $h(\Delta)$ as in Definition 1, the frequency representation of the filtered signal $G = h(\Delta)F$ is given by

$$\hat{G}_i = \left( \int_0^{\infty} \tilde{h}(t) e^{-t\lambda_i} \, dt \right) \hat{F}_i. \quad (9)$$

Proof. See Appendix A.1.
Definition 3. (Frequency Response) The frequency response $\hat{h}(\lambda)$ of the filter $h(\Delta)$ is defined as

$$\hat{h}(\lambda) = \int_0^{\infty} \tilde{h}(t) e^{-t\lambda} \, dt. \quad (10)$$

This leads to $\hat{G}_i = \hat{h}(\lambda_i) \hat{F}_i$, meaning that the tangent bundle filter is point-wise in the frequency domain. Therefore, we can write the frequency representation of the tangent bundle filter as

$$G = h(\Delta)F = \sum_{i=1}^{\infty} \hat{h}(\lambda_i) \langle F, \phi_i \rangle \phi_i. \quad (11)$$

We note that the frequency response of the tangent bundle filter generalizes the frequency response of a standard time filter as well as of a graph filter [30].
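Since the filter is point-wise in the frequency domain, (11) yields a direct spectral implementation. In the sketch below (ours; eigenpairs of a toy symmetric negative-semidefinite matrix stand in for $\{(-\lambda_i, \phi_i)\}_i$), we use $\hat{h}(\lambda) = 1/(1+\lambda)$, the frequency response (10) of $\tilde{h}(t) = e^{-t}$:

```python
import numpy as np

def spectral_filter(h_hat, L, F):
    """Apply G = sum_i h_hat(lam_i) <F, phi_i> phi_i as in (11).

    L is symmetric negative semidefinite; its eigenpairs (-lam_i, phi_i)
    stand in for the Connection Laplacian spectrum in (5).
    """
    w, Phi = np.linalg.eigh(L)          # w = -lam_i <= 0, columns = phi_i
    lam = -w                            # canonical frequencies lam_i >= 0
    F_hat = Phi.T @ F                   # projections <F, phi_i>, cf. (8)
    return Phi @ (h_hat(lam) * F_hat)   # pointwise in frequency, cf. (9)

rng = np.random.default_rng(3)
B = rng.standard_normal((5, 7))
L = -B @ B.T
F = rng.standard_normal(5)

# h_hat(lam) = 1/(1 + lam): the response of the filter (I - L)^{-1}
G = spectral_filter(lambda lam: 1.0 / (1.0 + lam), L, F)
```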
4. TANGENT BUNDLE NEURAL NETWORKS
We define a layer of a Tangent Bundle Neural Network (TNN) as a
bank of tangent bundle filters followed by a pointwise non-linearity.
In this setting, pointwise informally means “pointwise in the ambi-
ent space”. We introduce the notion of differential-preserving non-
linearity to formalize this concept.
Definition 4. (Differential-preserving Non-Linearity) Denote with $U_x \subset T_x\mathbb{R}^p$ the image of the injective differential $i_*$ at $x$. A mapping $\sigma : L^2(T\mathcal{M}) \to L^2(T\mathcal{M})$ is a differential-preserving non-linearity if it can be written as $\sigma(F(x)) = i_*^{-1} \tilde{\sigma}_x i_* F(x)$, where $\tilde{\sigma}_x : U_x \to U_x$ is a point-wise non-linearity in the usual (Euclidean) sense.

Furthermore, we assume that $\tilde{\sigma}_x = \tilde{\sigma}$ for all $x \in \mathcal{M}$. Thus, the $l$-th layer of a TNN with $F_l$ input signals $\{F^q_l\}_{q=1}^{F_l}$, $F_{l+1}$ output signals $\{F^u_{l+1}\}_{u=1}^{F_{l+1}}$, and point-wise non-linearity $\sigma(\cdot)$ is written as

$$F^u_{l+1}(x) = \sigma\left( \sum_{q=1}^{F_l} h^{u,q}_l(\Delta) F^q_l(x) \right), \quad u = 1, \dots, F_{l+1}. \quad (12)$$
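A layer of the form (12) can be sketched numerically as follows (our illustration, not the paper's implementation: filters are applied spectrally as in (11) on a toy symmetric negative-semidefinite surrogate of $\Delta$, with $\tanh$ as the shared Euclidean non-linearity $\tilde{\sigma}$):

```python
import numpy as np

def low_pass(gain):
    """Frequency response h_hat(lam) = gain / (1 + lam), a simple low-pass."""
    return lambda lam: gain / (1.0 + lam)

def tnn_layer(h_hats, L, F_in):
    """One TNN layer as in (12), filters applied spectrally as in (11).

    h_hats: nested list; h_hats[u][q] is the response of filter h^{u,q}(Delta)
    L:      (n, n) symmetric negative-semidefinite surrogate of Delta
    F_in:   (F_l, n) array of discretized input signals
    """
    w, Phi = np.linalg.eigh(L)       # eigenpairs (-lam_i, phi_i), cf. (5)
    lam = -w
    outputs = []
    for row in h_hats:               # one output signal per filter-bank row
        acc = sum(Phi @ (h(lam) * (Phi.T @ Fq)) for h, Fq in zip(row, F_in))
        outputs.append(np.tanh(acc)) # shared point-wise non-linearity
    return np.array(outputs)

rng = np.random.default_rng(4)
B = rng.standard_normal((6, 9))
L = -B @ B.T
X = rng.standard_normal((2, 6))              # F_l = 2 inputs on 6 points

gains = rng.uniform(0.5, 1.5, size=(3, 2))   # F_{l+1} = 3 outputs
h_hats = [[low_pass(g) for g in row] for row in gains]
Y = tnn_layer(h_hats, L, X)
```

Stacking such layers, with the output of one feeding the next, gives the depth-$L$ architecture described next.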
A TNN of depth $L$ with input signals $\{F^q\}_{q=1}^{F_0}$ is built as the stack of $L$ layers defined in (12), where $F^q_0 = F^q$. To globally represent the TNN, we collect all the filter impulse responses in a function set $\mathcal{H} = \{\tilde{h}^{u,q}_l\}_{l,u,q}$ and we describe the $u$-th output of the TNN as a mapping $F^u_L = \Psi^u\big(\mathcal{H}, \Delta, \{F^q\}_{q=1}^{F_0}\big)$ to emphasize that it is parameterized by the filters $\mathcal{H}$ and by the Connection Laplacian $\Delta$.