Convex and Nonconvex Sublinear Regression with Application to
Data-driven Learning of Reach Sets
Shadi Haddad and Abhishek Halder
Abstract— We consider estimating a compact set from finite data by approximating the support function of that set via sublinear regression. Support functions uniquely characterize a compact set up to closure of convexification, and are sublinear (convex as well as positive homogeneous of degree one). Conversely, any sublinear function is the support function of a compact set. We leverage this property to transcribe the task of learning a compact set to that of learning its support function. We propose two algorithms to perform the sublinear regression, one via convex and another via nonconvex programming. The convex programming approach involves solving a quadratic program (QP). The nonconvex programming approach involves training an input sublinear neural network. We illustrate the proposed methods via numerical examples on learning the reach sets of controlled dynamics subject to set-valued input uncertainties from trajectory data.
I. INTRODUCTION
Motivated by the correspondence between compact sets and their support functions, in this work, we consider computationally learning compact sets by performing regression on their support functions from finite data. Our main idea is to algorithmically leverage the isomorphism between the support functions and the space of sublinear functions – a subclass of convex functions.
Several works in the optimization, learning and statistics literature [1]–[6] have investigated the problem of estimating compact sets up to convexification from experimentally measured or numerically simulated (possibly noisy) data. While the problem is of interest across a broad range of applications (e.g., obstacle detection from range measurements, non-intrusive fault detection in materials and manufacturing applications, tomographic imaging in medical applications), we are primarily motivated by data-driven learning of reach sets for safety-critical systems-control applications.
Formally, the (forward in time) reach set $\mathcal{X}_t$ is defined as the set of states a controlled dynamical system may reach at a given time $t > 0$ subject to controlled deterministic dynamics $\dot{x} = f(t, x, u)$, where the state vector $x \in \mathbb{R}^d$, the feasible input $u(t) \in \mathcal{U} \subset \mathbb{R}^m$ with $\mathcal{U}$ compact, and the initial condition $x(t=0) \in \mathcal{X}_0 \subset \mathbb{R}^d$ with $\mathcal{X}_0$ compact. Specifically,
$$\mathcal{X}_t := \bigcup_{\text{measurable } u(\cdot) \in \mathcal{U}} \left\{ x(t) \in \mathbb{R}^d \mid \dot{x} = f(t, x, u), \; x(t=0) \in \mathcal{X}_0, \; u(\tau) \in \mathcal{U} \text{ for all } 0 \le \tau \le t \right\}. \tag{1}$$
With the aforesaid assumptions on $\mathcal{X}_0$, $\mathcal{U}$ in place, we suppose that the vector field $f$ is sufficiently smooth to guarantee compactness of $\mathcal{X}_t$ for all $t \ge 0$.

Shadi Haddad and Abhishek Halder are with the Department of Applied Mathematics, University of California, Santa Cruz, CA 95064, USA, {shhaddad,halder}@ucsc.edu.
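Although the paper's case studies appear in Sec. IV, the following minimal sketch shows how trajectory data of this form may be generated: integrate assumed dynamics under randomly drawn admissible inputs and initial conditions, and record the endpoints as samples of $\mathcal{X}_t$. The dynamics $f$, the sets $\mathcal{X}_0$, $\mathcal{U}$, and the constant-in-time input parameterization are all illustrative assumptions, not the systems studied here.

```python
# Illustrative sketch: sample the reach set X_t in (1) by integrating assumed
# dynamics under randomly drawn admissible inputs. The dynamics f, the sets
# X_0 and U, and the constant-in-time inputs are hypothetical choices.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
t_final, n_traj = 1.0, 500

def f(t, x, u):
    # hypothetical controlled dynamics xdot = f(t, x, u) with d = 2, m = 1
    return np.array([x[1], -x[0] + u])

endpoints = []
for _ in range(n_traj):
    x0 = rng.uniform(-0.1, 0.1, size=2)   # x(0) in X_0, a small box (assumed)
    u = rng.uniform(-1.0, 1.0)            # u in U = [-1, 1], held constant;
                                          # a subclass of measurable inputs
    sol = solve_ivp(f, (0.0, t_final), x0, args=(u,))
    endpoints.append(sol.y[:, -1])        # x(t_final) is a sample of X_t

data = np.stack(endpoints)                # finite data: point cloud in X_t
```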
In the systems-control literature, there exists a vast body
of works (see e.g., [7]–[11] as representative references) on
reach set computation. Interest in approximating these sets stems from the fact that their separation or intersection often implies safety or the lack thereof.
While numerous algorithms and toolboxes exist for approximating the reach sets, the computational approaches differ depending on the representation of the set approximants. In other words, different approaches have different interpretations of what it means to approximate a set. For example, parametric approximants seek a simple geometric primitive (e.g., ellipsoid [12]–[16], zonotope [17]–[19], etc.) to serve as a proxy for the set. On the other hand, level set approximants [20], [21] seek to approximate a (value) function whose zero sub-level set is the reach set.
More recent works have specifically advocated the data-
driven learning of reach sets by foregoing models and only
assuming access to (numerically or experimentally available)
data. These works have also proposed geometric [22], [23]
and functional primitives [24], [25] as representations. To the
best of the authors' knowledge, using the support function as a data-driven learning representation for reach sets, as proposed here, is new.
Contributions: Our specific contributions are twofold.
(i) We propose learning a compact set by learning its support
function from the (possibly noisy) elements of that set
available as finite data. We argue that the support function, as a learning representation for compact sets, is computationally beneficial since several set operations of practical interest have exact functional analogues in terms of operations on the corresponding support functions.
(ii) We present two algorithms (Sec. III) to learn the
support function via sublinear regression: convex quadratic
programming, and training an input sublinear neural network
that involves nonconvex programming. We demonstrate the
comparative performance of these algorithms via numerical
examples (Sec. IV) on learning reach sets of controlled
dynamics subject to set-valued uncertainties from trajectory
data that may be available from simulation or experiments.
This work is organized as follows. In Sec. II, we provide
the necessary background on support functions of compact
sets, and their correspondence with sublinear functions. Sec.
III details the proposed sublinear regression framework using
two approaches, viz., solving a QP, and training an input
sublinear neural network. Sec. IV exemplifies the proposed
framework using two numerical case studies on data-driven
learning of reach sets. Sec. V concludes the paper.
Notations: We denote the $d$-dimensional unit sphere $\{x \in \mathbb{R}^d \mid \|x\|_2 = 1\}$ as $\mathbb{S}^{d-1}$, and the standard Euclidean inner product as $\langle \cdot, \cdot \rangle$. The convex hull and the closure of a set $\mathcal{X}$ are denoted as $\mathrm{conv}(\mathcal{X})$ and $\overline{\mathcal{X}}$, respectively. The Legendre-Fenchel conjugate of a function $f: \mathcal{X} \mapsto \mathbb{R}$ is $f^{*}(\cdot) := \sup_{x \in \mathcal{X}} \{\langle \cdot, x \rangle - f(x)\}$. The conjugate $f^{*}$ is convex whether or not $f$ is. The rectified linear unit (ReLU) function $\mathrm{ReLU}(\cdot) := \max\{\cdot, 0\}$ is defined element-wise. For any natural number $n$, the finite set $[\![n]\!] := \{1, 2, \ldots, n\}$.
II. BACKGROUND
A. Support Functions
For a non-empty set $\mathcal{X} \subseteq \mathbb{R}^d$, its support function $h_{\mathcal{X}}: \mathbb{S}^{d-1} \mapsto \mathbb{R} \cup \{+\infty\}$ is defined as
$$h_{\mathcal{X}}(y) := \sup_{x \in \mathcal{X}} \langle y, x \rangle, \quad y \in \mathbb{S}^{d-1}. \tag{2}$$
Being a pointwise supremum of linear functions, $h_{\mathcal{X}}(\cdot)$ is a convex function in its argument, and is finite if and only if [26, Chap. C.2, Prop. 2.1.3] $\mathcal{X}$ is bounded. From (2), it also follows that the support function remains invariant under closure of convexification, i.e.,
$$h_{\mathcal{X}}(\cdot) = h_{\overline{\mathrm{conv}}(\mathcal{X})}(\cdot).$$
The Legendre-Fenchel conjugate of the support function $h_{\mathcal{X}}(\cdot)$ is the indicator function
$$\mathbf{1}_{\mathcal{X}}(x) := \begin{cases} 0 & \text{if } x \in \mathcal{X},\\ +\infty & \text{otherwise}, \end{cases} \tag{3}$$
see e.g., [27, Thm. 13.2]. This allows thinking of $h_{\mathcal{X}}(\cdot)$ as a functional proxy for the set $\mathcal{X}$, i.e., the support function uniquely determines a compact set up to closure of convexification.
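To make the functional-proxy viewpoint concrete, below is a minimal numerical sketch (our illustration, not code from the paper): the support function of a finite point cloud is a maximum of inner products, and it coincides with the support function of the cloud's convex hull vertices, reflecting the invariance under convexification noted above.

```python
# Minimal sketch: empirical support function of a finite point cloud, and a
# check that it matches the support function of the convex hull vertices,
# i.e., h_X = h_conv(X) per the invariance discussed above.
import numpy as np
from scipy.spatial import ConvexHull

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 2))        # finite data from a compact set

def h(points, y):
    """Support function value sup_{x in points} <y, x> for a direction y."""
    return np.max(points @ y)

V = X[ConvexHull(X).vertices]            # extreme points of conv(X)
for th in np.linspace(0.0, 2 * np.pi, 100, endpoint=False):
    y = np.array([np.cos(th), np.sin(th)])           # y on S^1
    assert np.isclose(h(X, y), h(V, y))              # h_X(y) = h_conv(X)(y)
```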
Furthermore, (2) has a clear geometric interpretation: $h_{\mathcal{X}}(y)$ quantifies the signed distance of the supporting hyperplane of compact $\mathcal{X}$ with outer normal vector $y$, measured from the origin. This distance is negative if and only if $y \in \mathbb{S}^{d-1}$ points into the open halfspace containing the origin.
The convergence of compact sets w.r.t. the topology induced by the (two-sided) Hausdorff metric
$$\delta_H(\mathcal{P}, \mathcal{Q}) := \max\left\{ \sup_{p \in \mathcal{P}} \inf_{q \in \mathcal{Q}} \|p - q\|_2, \; \sup_{q \in \mathcal{Q}} \inf_{p \in \mathcal{P}} \|p - q\|_2 \right\} \tag{4}$$
for compact $\mathcal{P}, \mathcal{Q}$, is equivalent to the convergence of the corresponding support functions. Specifically, a sequence of compact convex sets $\{\mathcal{X}_i\}_{i \in \mathbb{N}}$ converges to a compact convex set $\mathcal{X}$ (denoted as $\mathcal{X}_i \to \mathcal{X}$) in the Hausdorff topology if and only if $h_{\mathcal{X}_i}(\cdot) \to h_{\mathcal{X}}(\cdot)$ pointwise. For compact convex $\mathcal{P}, \mathcal{Q}$, the Hausdorff metric (4) can be expressed in terms of the respective support functions:
$$\delta_H(\mathcal{P}, \mathcal{Q}) = \sup_{y \in \mathbb{S}^{d-1}} |h_{\mathcal{P}}(y) - h_{\mathcal{Q}}(y)|. \tag{5}$$
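As an illustration of (5) (our sketch; the random direction sampling is an assumption), the Hausdorff distance between two compact convex sets can be approximated by discretizing the supremum over $\mathbb{S}^{d-1}$; the sampled maximum lower-bounds the true supremum and improves with more directions.

```python
# Sketch: approximate delta_H(P, Q) in (5) for compact convex sets represented
# by finite point clouds (the clouds stand in for their convex hulls), by
# sampling directions on S^{d-1}. The sampled max lower-bounds the supremum.
import numpy as np

def support_values(points, Y):
    # h(y) = max_i <y, x_i> for each direction y (row of Y)
    return (Y @ points.T).max(axis=1)

def hausdorff_via_support(P, Q, n_dirs=2000, seed=2):
    rng = np.random.default_rng(seed)
    Y = rng.standard_normal((n_dirs, P.shape[1]))
    Y /= np.linalg.norm(Y, axis=1, keepdims=True)   # directions on S^{d-1}
    return np.abs(support_values(P, Y) - support_values(Q, Y)).max()
```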
Conveniently, operations on sets can be seen as operations on the corresponding support functions. For instance (see also the numerical sketch following this list),
(i) $x \notin \overline{\mathrm{conv}}(\mathcal{X})$ if and only if $\exists\, y \in \mathbb{S}^{d-1}$ such that $\langle y, x \rangle > h_{\mathcal{X}}(y)$,
(ii) $\mathcal{X}_1 \subseteq \mathcal{X}_2$ if and only if $h_{\mathcal{X}_1}(y) \le h_{\mathcal{X}_2}(y)$,
(iii) $h_{\mathcal{X}_1 + \ldots + \mathcal{X}_r}(y) = h_{\mathcal{X}_1}(y) + \ldots + h_{\mathcal{X}_r}(y)$,
(iv) $h_{\cup_{i=1}^{r} \mathcal{X}_i}(y) = \max\{h_{\mathcal{X}_1}(y), \ldots, h_{\mathcal{X}_r}(y)\}$,
(v) $h_{\cap_{i=1}^{r} \mathcal{X}_i}(y) = \inf_{y_1 + \ldots + y_r = y} \{h_{\mathcal{X}_1}(y_1) + \ldots + h_{\mathcal{X}_r}(y_r)\}$,
(vi) $h_{A\mathcal{X} + b}(y) = \langle y, b \rangle + h_{\mathcal{X}}(A^{\top} y)$ for all $A \in \mathbb{R}^{d \times d}$, $b \in \mathbb{R}^d$.

Fig. 1: Euler diagram showing the relationships among the classes of convex, positive homogeneous, subadditive and sublinear functions.
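The sketch below (our illustration) numerically verifies properties (iii) and (vi) on finite point clouds, where the Minkowski sum of two point clouds is formed by all pairwise sums.

```python
# Sketch verifying properties (iii) and (vi) above on finite point clouds:
# support functions add under Minkowski sum, and an affine map A X + b acts
# as h_{AX+b}(y) = <y, b> + h_X(A^T y).
import numpy as np

rng = np.random.default_rng(3)
X1, X2 = rng.standard_normal((50, 2)), rng.standard_normal((60, 2))
A, b = rng.standard_normal((2, 2)), np.array([1.0, -2.0])

def h(points, y):
    return np.max(points @ y)

mink = (X1[:, None, :] + X2[None, :, :]).reshape(-1, 2)  # all pairwise sums
AXb = X1 @ A.T + b                                       # image of X1 under Ax + b

for th in np.linspace(0.0, 2 * np.pi, 64, endpoint=False):
    y = np.array([np.cos(th), np.sin(th)])
    assert np.isclose(h(mink, y), h(X1, y) + h(X2, y))   # property (iii)
    assert np.isclose(h(AXb, y), y @ b + h(X1, A.T @ y)) # property (vi)
```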
These correspondences motivate the possibility of using the support functions as computational learning representations for estimating compact sets from data.
We next point out that the support functions have addi-
tional structural properties beyond convexity which will be
important in our algorithmic development.
B. Sublinear Functions
A function $\psi: \mathbb{R}^d \mapsto \mathbb{R} \cup \{+\infty\}$ is called sublinear if it is convex and positive homogeneous of degree one. The latter condition means that
$$\psi(a x) = a\,\psi(x) \quad \forall\, a > 0.$$
Alternatively, $\psi: \mathbb{R}^d \mapsto \mathbb{R} \cup \{+\infty\}$ is sublinear if and only if its epigraph is a nonempty convex cone in $\mathbb{R}^d \times \mathbb{R}$, see e.g., [26, Chap. C.1, Prop. 1.1.3].
From (2), the support function $h_{\mathcal{X}}(\cdot)$ is both convex and positive homogeneous of degree one, and hence a sublinear function. Conversely, any sublinear function can be viewed as the support function of a compact set. The converse follows from the fact [28, Thm. 8.13] that a positive homogeneous convex function can be expressed as a pointwise supremum of linear functions. Thus, to learn a compact set is to learn its support function, and learning a support function from data leads to sublinear regression, as opposed to the well-known convex regression [29], [30].
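While Sec. III details the algorithms we actually propose, the following is a hedged sketch of what a sublinear regression QP can look like, adapting standard least-squares convex regression (the exact program of Sec. III may differ). For sublinear $\psi$, Euler's identity for degree-one homogeneity gives $\psi(y_i) = \langle g_i, y_i \rangle$ for any subgradient $g_i \in \partial\psi(y_i)$, so the usual convex-regression constraints $z_j \ge z_i + \langle g_i, y_j - y_i \rangle$ collapse to $z_j \ge \langle g_i, y_j \rangle$.

```python
# Hedged sketch of sublinear regression as a QP (adapted from standard convex
# regression; not necessarily the exact program of Sec. III). Given unit
# directions y_i and noisy support values h_i, fit subgradients g_i with
# z_i := <g_i, y_i> (homogeneity) and z_j >= <g_i, y_j> (convexity).
import cvxpy as cp
import numpy as np

def sublinear_regression(Y, h_vals):
    n, d = Y.shape
    G = cp.Variable((n, d))                          # subgradient at each y_i
    z = cp.sum(cp.multiply(G, Y), axis=1)            # z_i = <g_i, y_i>
    constraints = [Y @ G[i] <= z for i in range(n)]  # <g_i, y_j> <= z_j, all i, j
    cp.Problem(cp.Minimize(cp.sum_squares(z - h_vals)), constraints).solve()
    return G.value

# The fitted sublinear function is psi(y) = max_i <g_i, y>, which can then be
# evaluated at any direction as an estimate of the learned support function.
```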
Any sublinear function (and hence the support function) must also be subadditive, i.e.,
$$\psi(y + z) \le \psi(y) + \psi(z) \quad \forall\, y, z \in \mathbb{R}^d. \tag{6}$$
This can be seen as a joint consequence of convexity and positive homogeneity because specializing convexity to midpoint convexity implies
$$\psi\left(\tfrac{1}{2} y + \tfrac{1}{2} z\right) \le \tfrac{1}{2}\psi(y) + \tfrac{1}{2}\psi(z),$$
and since positive homogeneity gives $\psi\left(\tfrac{1}{2}(y+z)\right) = \tfrac{1}{2}\psi(y+z)$, multiplying both sides by 2 yields (6).
Positive homogeneity and subadditivity together are equivalent to sublinearity, which in turn implies convexity. Following [26, Chap. C.1], an Euler diagram relating these classes of functions is shown in Fig. 1.
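As a preview of the nonconvex route, here is one plausible way (our assumption; Sec. III specifies the architecture actually used) to make a feedforward network sublinear in its input: drop all bias terms so each ReLU layer is positively homogeneous of degree one, and keep the weights beyond the first layer entrywise nonnegative so the composition remains convex.

```python
# Hedged sketch of an input sublinear network (one plausible construction,
# not necessarily the Sec. III architecture): no biases => positive
# homogeneity of degree one; nonnegative weights after the first layer =>
# convexity of the composition. Together these give sublinearity.
import numpy as np

rng = np.random.default_rng(4)
W1 = rng.standard_normal((16, 2))     # first layer: weights of any sign
W2 = rng.uniform(0.0, 1.0, (8, 16))   # entrywise nonnegative
w3 = rng.uniform(0.0, 1.0, 8)         # entrywise nonnegative output layer

def psi(y):
    relu = lambda v: np.maximum(v, 0.0)
    return w3 @ relu(W2 @ relu(W1 @ y))

# numerical sanity check of sublinearity
y, z, a = rng.standard_normal(2), rng.standard_normal(2), 3.7
assert np.isclose(psi(a * y), a * psi(y))          # positive homogeneity
assert psi(y + z) <= psi(y) + psi(z) + 1e-9        # subadditivity (6)
```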