Probability computation for high-dimensional semilinear SDEs driven by
isotropic $\alpha$-stable processes via mild Kolmogorov equations
Alessandro Bondi
Abstract
Semilinear, $N$-dimensional stochastic differential equations (SDEs) driven by additive Lévy noise are investigated. Specifically, given $\alpha \in \left(\frac{1}{2}, 1\right)$, the interest is on SDEs driven by $2\alpha$-stable, rotation-invariant processes obtained by subordination of a Brownian motion. An original connection between the time-dependent Markov transition semigroup associated with their solutions and Kolmogorov backward equations in mild integral form is established via regularization-by-noise techniques. Such a link is the starting point for an iterative method which allows one to approximate probabilities related to the SDEs with a single batch of Monte Carlo simulations as several parameters change, bringing a compelling computational advantage over the standard Monte Carlo approach. This method also pertains to the numerical computation of solutions to high-dimensional integro-differential Kolmogorov backward equations. The scheme, and in particular the first-order approximation it provides, is then applied to two nonlinear vector fields and shown to offer satisfactory results in dimension $N = 100$.
Keywords: Kolmogorov equations, Semilinear SDEs, Iterative scheme, Isotropic $\alpha$-stable Lévy processes.
MSC2020: 60G52, 60H50, 65C20, 45K05, 47D07.
1. Introduction
In this paper, we are concerned with the study of quantities related to the $N$-dimensional, semilinear stochastic differential equation (SDE)
$$
\begin{cases}
dX_t = \left(AX_t + B_0(t, X_t)\right) dt + \sqrt{Q}\, dW_{L_t}, & t \in [s, T],\\
X_s = x \in \mathbb{R}^N,
\end{cases}
\qquad (1)
$$
with a specific interest in the case where $N$ is high. Here, given $\alpha \in \left(\frac{1}{2}, 1\right)$, $L$ is an $\alpha$-stable subordinator (i.e., an increasing Lévy process) independent from $(\beta^n)_{n=1,\dots,N}$, which in turn are independent Brownian motions; we write $W = \left(\beta^1, \dots, \beta^N\right)^\top$. All these processes are defined on a common complete probability space $(\Omega, \mathcal{F}, \mathbb{P})$, which we endow with the minimal augmented filtration generated by the subordinated Brownian motion $W_L$. Moreover, $T > 0$ is a finite time horizon and $s \in [0, T]$ is the initial time. As for $A, Q \in \mathbb{R}^{N \times N}$, they are diagonal matrices with $A$ negative-definite and $Q$ positive-definite. For our numerical experiments we will consider $Q = \sigma^2\, \mathrm{Id}$, with $\mathrm{Id} \in \mathbb{R}^{N \times N}$ the identity matrix, so that $\sigma > 0$ is a parameter describing the strength of the noise. Finally, the nonlinear bounded vector field $B_0 \colon [0, T] \times \mathbb{R}^N \to \mathbb{R}^N$ is subject to suitable regularity conditions which will be specified in the sequel and which guarantee, among other things, the existence of a pathwise unique solution of (1): it will be denoted by $X^{s,x} = (X^{s,x}_t)_{t \in [s,T]}$.
Connected to the SDE in (1), we have the following Kolmogorov backward equation:
$$
\begin{cases}
-\partial_s u(s, x) = \left\langle Ax + B_0(s, x), \nabla^\top u(s, x)\right\rangle
+ \displaystyle\int_{\mathbb{R}^N} \left( u\!\left(s, x + \sqrt{Q}z\right) - u(s, x) - \mathbf{1}_D(z)\, \nabla u(s, x) \sqrt{Q}z \right) \nu(dz), & s \in [0, t),\\[4pt]
u(t, x) = \varphi(x), & x \in \mathbb{R}^N,
\end{cases}
\qquad (2)
$$
Classe di Scienze, Scuola Normale Superiore di Pisa, 56126 Pisa, Italy. Email: alessandro.bondi@sns.it
arXiv:2210.03004v1 [math.PR] 6 Oct 2022
where $\varphi \colon \mathbb{R}^N \to \mathbb{R}$, $D = \left\{ z \in \mathbb{R}^N : |z| \le 1 \right\}$ is the closed unit ball and we fix $t \in [0, T]$. Here $\nu(dz)$ is the Lévy measure of $W_L$, and, up to a positive multiplicative constant, it is of the form $\nu(dz) = |z|^{-(N+2\alpha)}\, dz$ (see, e.g., [20, Theorem 30.1]). The link between the equations in (1) and (2) is provided by Theorem 7 (ii) below (see also the book [15] for related results), where it is shown that the time-dependent Markov transition semigroup $\mathbb{E}\left[\varphi\left(X^{s,x}_t\right)\right]$ associated with (1) satisfies (2) in the closed interval $[0, t]$ for every $\varphi \in C^3_b\left(\mathbb{R}^N\right)$. Moreover, we are able to extend the validity of this connection in $[0, t)$ to every function $\varphi \in \mathcal{B}_b\left(\mathbb{R}^N\right)$ through an original procedure based on regularization by noise and a mild, integral formulation of (2) (see Remark 1).
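As a side observation (a standard identification, not a statement taken from this paper): since $W_L$ is a rotation-invariant $2\alpha$-stable process, in the case $Q = \sigma^2\, \mathrm{Id}$ the nonlocal term in (2) can be read, up to a positive dimensional constant $c_{N,\alpha} > 0$, as a fractional Laplacian of order $\alpha$:
$$
\int_{\mathbb{R}^N} \left( u(s, x + \sigma z) - u(s, x) - \mathbf{1}_D(z)\, \nabla u(s, x)\, \sigma z \right) \frac{dz}{|z|^{N+2\alpha}} = -\, c_{N,\alpha}\, \sigma^{2\alpha} (-\Delta)^{\alpha} u(s, x),
$$
where the compensating term inside the unit ball does not affect the principal value by symmetry of the kernel, and the factor $\sigma^{2\alpha}$ comes from the change of variables $y = \sigma z$.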
In the present work, we are precisely interested in these expected values, with particular attention to the case $\varphi(x) = \mathbf{1}_{\{|x| > R\}}$ (for some threshold $R > 0$), where one has $\mathbb{E}\left[\varphi\left(X^{s,x}_t\right)\right] = \mathbb{P}\left(\left|X^{s,x}_t\right| > R\right)$. Hence we want to describe a method which allows one to compute probabilities related to the solution of the SDE (1). Trying to estimate them by numerically solving the integro-differential equation (2) is a typical example of the curse of dimensionality (CoD), and since we intend to deal with a high dimension (in the simulations we take $N = 100$), this is an unfeasible way to proceed. The canonical approach to tackle our problem is the Monte Carlo method: several paths of $X^{s,x}$ are simulated by the Euler-Maruyama scheme with a fine time step, and then the final points of these trajectories are averaged to get an approximation of the desired expected values by virtue of the strong law of large numbers. However, if we were to follow this scheme (which is known to be free of the CoD), then we would have to start over the procedure every time we change the starting point $x$ and the starting time $s$, the noise strength $\sigma$ and even the nonlinearity $B_0$, a practice that is very common in a wide range of applications including weather forecasts and calibration of financial models (see [1] and references therein). In order to overcome this setback, we aim to extend to our framework the ideas developed in the papers [9, 10] for the Gaussian case, namely we search for an iterative scheme which relies on a single bulk of Monte Carlo simulations independent from the aforementioned parameters. Specifically, to approximate the value of the iterates $v^n_s(t, x)$, $n \in \mathbb{N} \cup \{0\}$, we just need to simulate once and for all, using the Euler-Maruyama scheme, a large number of sample paths of the subordinator $L$ and of the stochastic convolution $\widetilde{Z}^0_t = \int_0^t e^{(t-r)A}\, dW_{L_r}$, $t \in [0, T]$, which is the unique (up to indistinguishability) solution of the linear SDE
$$
d\widetilde{Z}^0_t = A \widetilde{Z}^0_t\, dt + dW_{L_t}, \qquad \widetilde{Z}^0_0 = 0.
$$
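To fix ideas, the batch of simulations just described can be sketched in a few lines of Python. This is an illustrative sketch, not the author's code: subordinator increments are sampled via Kanter's representation of positive stable laws (so the normalization of $L$ differs from the one in (3) by a constant), and, conditionally on $L$, the increments of $W_L$ are centred Gaussians with covariance $\Delta L \cdot \mathrm{Id}$. All function names, step sizes and parameter values are illustrative.

```python
import numpy as np

def stable_subordinator_increments(alpha, dt, n_steps, n_paths, rng):
    """Sample increments of a strictly alpha-stable subordinator, 0 < alpha < 1.

    Uses Kanter's representation: if U ~ Unif(0, pi) and E ~ Exp(1) are
    independent, then
        S = sin(alpha*U) / sin(U)**(1/alpha)
            * (sin((1 - alpha)*U) / E)**((1 - alpha)/alpha)
    is positive alpha-stable with Laplace transform exp(-lam**alpha); by
    self-similarity an increment over a step dt is dt**(1/alpha) * S.
    """
    U = rng.uniform(0.0, np.pi, size=(n_steps, n_paths))
    E = rng.exponential(1.0, size=(n_steps, n_paths))
    S = (np.sin(alpha * U) / np.sin(U) ** (1.0 / alpha)
         * (np.sin((1.0 - alpha) * U) / E) ** ((1.0 - alpha) / alpha))
    return dt ** (1.0 / alpha) * S

def simulate_stochastic_convolution(alpha, lam, T, n_steps, n_paths, rng):
    """Explicit Euler-Maruyama for dZ = A Z dt + dW_{L_t}, Z_0 = 0,
    with A = -diag(lam), lam > 0 componentwise.

    Key point: conditionally on L, the increment W_{L_{t+dt}} - W_{L_t}
    is Gaussian with variance dL = L_{t+dt} - L_t, i.e. sqrt(dL) * xi.
    Returns the paths' endpoints, an array of shape (n_paths, N).
    """
    N = lam.shape[0]
    dt = T / n_steps
    dL = stable_subordinator_increments(alpha, dt, n_steps, n_paths, rng)
    Z = np.zeros((n_paths, N))
    for k in range(n_steps):
        xi = rng.standard_normal((n_paths, N))
        dW = np.sqrt(dL[k])[:, None] * xi           # subordinated BM increment
        Z = Z + (-lam[None, :]) * Z * dt + dW       # Euler step for the OU drift
    return Z
```

These simulated paths are exactly the reusable ingredients of the scheme: they do not depend on $x$, $s$, $\sigma$ or $B_0$, so a single batch serves every choice of those parameters.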
The main novelty of the approach that we propose consists in the structure of the noise $W_L$, which is a $2\alpha$-stable, rotation-invariant Lévy process (cf. [20, Example 30.6]). In particular, the introduction of $L$ considerably complicates the framework compared to the Brownian one treated in [9, 10]. This fact leads us to develop an original procedure, essentially based on conditioning with respect to the $\sigma$-algebra generated by the subordinator, to get an expression for the iterates which is suitable for applications. Moreover, the theoretical foundation of the iterative method analyzed in this work, Theorem 3, is of remarkable interest in its own right. Indeed, it establishes a connection between the time-dependent Markov transition semigroup associated with (1) and a mild, integral formulation of (2) (see Equation (11)) that, to the best of our knowledge, is new when it comes to isotropic Lévy processes.
The paper is structured as follows. Section 2 describes the setting and recalls the main concepts that will be widely used in the rest of the paper. In addition, it introduces the integral formulation of the Kolmogorov equation (2) and shows its well-posedness. Next, in Section 3 (see Theorem 3) we provide the probabilistic interpretation of (2) in mild form, along with other interesting regularization-by-noise results for SDEs driven by subordinated Wiener processes. In Section 4 we define the iterative scheme and prove its convergence to the expected values that we are trying to approximate. Next, Section 5 is concerned with the computation of the first iterate $v^1_s(t, x)$; it is divided into two subsections referring to the deterministic and random time-shifts, respectively. Its results are used in Section 6 as the base case for the induction argument that allows us to calculate $v^n_s(t, x)$ (see Theorem 17). The last part (Section 7) is devoted to numerical experiments in dimension $N = 100$ for two choices of the nonlinear vector field $B_0$, with particular attention to the improvements provided by the first iteration over the linear approximation corresponding to the Ornstein-Uhlenbeck (hereafter OU) processes. Finally, Appendix A contains the proof of Lemma 4.
Notation: Let $d, m, n \in \mathbb{N}$. In this paper, elements of $\mathbb{R}^d$ are column vectors. For any $u, v \in \mathbb{R}^d$, we denote by $|u|$ the Euclidean norm and by $\langle u, v \rangle = u^\top v$ the standard scalar product. For a matrix $A \in \mathbb{R}^{d \times m}$, $|A| = \sup_{x \in \mathbb{R}^m : |x| = 1} |Ax|$ is the operator norm. Given a vector field $B \colon \mathbb{R}^d \to \mathbb{R}^{m \times n}$, the uniform norm is $\|B\|_\infty = \sup_{x \in \mathbb{R}^d} |B(x)|$. In particular, if $n = 1$ then the Jacobian matrix is denoted by $DB \in \mathbb{R}^{m \times d}$, and $D_h B = DB\, h$, $h \in \mathbb{R}^d$; if also $m = 1$ (so that $B$ is a scalar function) then the gradient $\nabla B$ is a row vector and $D^2 B \in \mathbb{R}^{d \times d}$ represents the Hessian matrix. For an integer $k \in \mathbb{N} \cup \{0\}$, the space $C^k_b\left(\mathbb{R}^d; \mathbb{R}^{m \times n}\right)$ consists of the continuous vector fields $B$ which are bounded and continuously differentiable up to order $k$ with bounded derivatives. Taking $h = 1, \dots, k$ and $B \in C^k_b\left(\mathbb{R}^d; \mathbb{R}^{m \times n}\right)$, we write $\left\|\partial^h B\right\|_\infty = \sup_{i,j,\mathbf{h}} \left\|\partial^{\mathbf{h}} B_{i,j}\right\|_\infty$, where $B = (B_{i,j})$, $i = 1, \dots, m$, $j = 1, \dots, n$, and $\mathbf{h} \in (\mathbb{N} \cup \{0\})^d$ is a multi-index with length $\|\mathbf{h}\|_1 = h$.
2. Preliminaries and Kolmogorov backward equation in mild form
Fix $N \in \mathbb{N}$ and a complete probability space $(\Omega, \mathcal{F}, \mathbb{P})$. Consider $N$ independent Brownian motions $(\beta^n)_{n=1,\dots,N}$: we write $W = \left(\beta^1, \dots, \beta^N\right)^\top$. Moreover, for $\alpha \in (0, 1)$ we take a strictly $\alpha$-stable subordinator $L = (L_t)_{t \ge 0}$ independent from $(\beta^n)_n$, and denote by $\mathcal{F}^L$ the augmented $\sigma$-algebra it generates, i.e., $\mathcal{F}^L = \sigma\left(\mathcal{F}^L_0 \cup \mathcal{N}\right)$, where $\mathcal{F}^L_0$ is the natural $\sigma$-algebra generated by $L$ and $\mathcal{N}$ is the family of $\mathcal{F}$-negligible sets. In other words, $L$ is an increasing Lévy process with (cf. [20, Example 24.12])
$$
\mathbb{E}\left[e^{iuL_1}\right] = \exp\left\{ -\bar{\gamma}^\alpha |u|^\alpha \left( 1 - i \tan\frac{\pi\alpha}{2}\, \mathrm{sign}\, u \right) \right\}, \quad u \in \mathbb{R}, \quad \text{for some } \bar{\gamma} > 0. \qquad (3)
$$
Let us introduce the diagonal matrices $A = -\mathrm{diag}\left[\lambda_1, \dots, \lambda_N\right]$ and $Q = \mathrm{diag}\left[\sigma^2_1, \dots, \sigma^2_N\right]$, with $0 < \lambda_1 \le \dots \le \lambda_N$ and $\sigma^2_n > 0$, $n = 1, \dots, N$. We endow $(\Omega, \mathcal{F}, \mathbb{P})$ with the minimal augmented filtration $\mathbb{F} = (\mathcal{F}_t)_{t \ge 0}$ generated by $W_L$, which means $\mathcal{F}_t = \sigma\left(\mathcal{F}^{W_L}_{0,t} \cup \mathcal{N}\right)$ for $t \ge 0$, with $\left(\mathcal{F}^{W_L}_{0,t}\right)_{t \ge 0}$ being the natural filtration of $W_L$.
Given $T > 0$ and a continuous function $f \colon [0, T] \to \mathbb{R}^N$, if $x \in \mathbb{R}^N$ and $0 \le s < T$ then $Z^{s,x} = (Z^{s,x}_t)_{t \in [s,T]}$ is the OU process starting from $x$ at time $s$, i.e., it is the unique solution of the following linear SDE:
$$
dZ^{s,x}_t = \left(AZ^{s,x}_t + f(t)\right) dt + \sqrt{Q}\, dW_{L_t}, \qquad Z^{s,x}_s = x. \qquad (4)
$$
We denote by $R = (R_{s,t})$, $0 \le s \le t \le T$, the time-dependent Markov transition semigroup associated with this family of processes:
$$
R_{s,t}\varphi = \mathbb{E}\left[\varphi\left(Z^{s,\cdot}_t\right)\right], \qquad 0 \le s \le t < T, \quad \varphi \in \mathcal{B}_b\left(\mathbb{R}^N\right), \qquad (5)
$$
where $\mathcal{B}_b\left(\mathbb{R}^N\right)$ denotes the space of real-valued, Borel measurable and bounded functions defined on $\mathbb{R}^N$. The Chapman-Kolmogorov equations ensure that
$$
R_{s,t}\left(R_{t,u}\varphi\right) = R_{s,u}\varphi, \qquad 0 \le s < t < u \le T, \quad \varphi \in \mathcal{B}_b\left(\mathbb{R}^N\right). \qquad (6)
$$
For every $0 \le s < t \le T$ we define $F_{s,t} = \int_s^t e^{(t-r)A} f(r)\, dr \in \mathbb{R}^N$ and $I^L_{s,t} = \int_s^t e^{2(t-r)A} Q\, dL_r \colon \Omega \to \mathbb{R}^{N \times N}$. An adaptation of [5, Theorem 6] guarantees that, for every $\varphi \in \mathcal{B}_b\left(\mathbb{R}^N\right)$, the function $R_{s,t}\varphi$ is differentiable at any point $x \in \mathbb{R}^N$ in every direction $h \in \mathbb{R}^N$, with
$$
\left\langle \nabla^\top R_{s,t}\varphi(x), h \right\rangle = \mathbb{E}\left[ \varphi\left(Z^{s,x}_t\right) \left\langle \left(I^L_{s,t}\right)^{-1} e^{(t-s)A} h,\; Z^{s,x}_t - e^{(t-s)A}x - F_{s,t} \right\rangle \right]. \qquad (7)
$$
Moreover, $R_{s,t}\varphi \in C^1_b\left(\mathbb{R}^N\right)$ and the following gradient estimate holds true for some constant $c_\alpha > 0$:
$$
\left\| \nabla^\top R_{s,t}\varphi \right\|_\infty \le c_\alpha \|\varphi\|_\infty \sup_{n=1,\dots,N} \left( \frac{1}{\sigma_n} \left( \frac{2\alpha\lambda_n}{1 - e^{-2\alpha\lambda_n(t-s)}} \right)^{\frac{1}{2\alpha}} e^{-\lambda_n(t-s)} \right), \qquad 0 \le s < t \le T. \qquad (8)
$$
In the sequel, for every $x \in \mathbb{R}^N$ and $t \in (0, T]$ we are going to need the continuity of $R_{\cdot,t}\varphi(x)$ in the interval $[0, t)$ [resp., in the closed interval $[0, t]$] when $\varphi \in \mathcal{B}_b\left(\mathbb{R}^N\right)$ [resp., $\varphi \in C_b\left(\mathbb{R}^N\right)$]. In order to prove this property, we first note that a variation of constants formula lets us deduce (from (4))
$$
Z^{s,x}_t = e^{(t-s)A}x + \int_s^t e^{(t-r)A} f(r)\, dr + \int_s^t e^{(t-r)A} \sqrt{Q}\, dW_{L_r}, \qquad 0 \le s \le t \le T, \; x \in \mathbb{R}^N. \qquad (9)
$$
This expression shows that the process $(Z^{s,x}_t)_{s \in [0,t]}$ is stochastically continuous (in the variable $s$). As a consequence, if $\varphi \in C_b\left(\mathbb{R}^N\right)$, then we can easily deduce the continuity of $R_{\cdot,t}\varphi(x)$ in $[0, t]$ by applying the continuous mapping and Vitali convergence theorems to (5). In the general case $\varphi \in \mathcal{B}_b\left(\mathbb{R}^N\right)$, one can use the same argument combined with the regularizing property of $R$ and (6) to obtain the continuity of $R_{\cdot,t}\varphi(x)$ in $[0, t)$, as desired. Finally, observe that there exists a constant $C = C(\alpha, A, Q) > 0$ such that
$$
c_\alpha \sup_{n=1,\dots,N} \left( \frac{1}{\sigma_n} \left( \frac{2\alpha\lambda_n}{1 - e^{-2\alpha\lambda_n(t-s)}} \right)^{\frac{1}{2\alpha}} e^{-\lambda_n(t-s)} \right) \le C \frac{1}{(t-s)^{1/(2\alpha)}}, \qquad 0 \le s < t \le T.
$$
We refer to [5, Remark 5] for a similar computation. Let us assume $\alpha \in \left(\frac{1}{2}, 1\right)$: in this way, denoting $\gamma = 1/(2\alpha)$, we have $\gamma \in (0, 1)$ and the bound in (8) entails
$$
\left\| \nabla^\top R_{s,t}\varphi \right\|_\infty \le C \|\varphi\|_\infty \frac{1}{(t-s)^\gamma}, \qquad 0 \le s < t \le T, \quad \varphi \in \mathcal{B}_b\left(\mathbb{R}^N\right). \qquad (10)
$$
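The displayed constant bound boils down to the elementary inequality $1 - e^{-u} \ge u\, e^{-u}$, $u \ge 0$, which yields $\left(\frac{2\alpha\lambda}{1 - e^{-2\alpha\lambda \tau}}\right)^{1/(2\alpha)} e^{-\lambda \tau} \le \tau^{-1/(2\alpha)}$ with $\tau = t - s$. The following short Python check of this step is illustrative only (parameter values chosen arbitrarily, with $\sigma_n = 1$ and $c_\alpha = 1$ absorbed into $C$):

```python
import math

def lhs(alpha, lam, tau):
    """Gradient-estimate constant from (8), with sigma_n = 1 and c_alpha = 1."""
    return ((2 * alpha * lam / (1 - math.exp(-2 * alpha * lam * tau)))
            ** (1 / (2 * alpha)) * math.exp(-lam * tau))

def rhs(alpha, tau):
    """Claimed singular bound tau**(-1/(2*alpha))."""
    return tau ** (-1 / (2 * alpha))

# Verify lhs <= rhs on a grid of stability indices, rates and time gaps.
ok = all(
    lhs(a, l, t) <= rhs(a, t) + 1e-12
    for a in (0.55, 0.7, 0.9)
    for l in (0.1, 1.0, 10.0)
    for t in (0.01, 0.1, 1.0, 5.0)
)
```

The grid is of course no substitute for the proof, but it makes the singularity rate $(t-s)^{-1/(2\alpha)}$ tangible.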
For a given measurable and bounded vector field $B \colon [0, T] \times \mathbb{R}^N \to \mathbb{R}^N$, we are concerned with the analysis of the following Kolmogorov backward equation in mild, integral form:
$$
u^\varphi_s(t, x) = R_{s,t}\varphi(x) + \int_s^t R_{s,r}\left\langle B(r, \cdot), \nabla^\top u^\varphi_r(t, \cdot) \right\rangle(x)\, dr, \qquad s \in [0, t], \; x \in \mathbb{R}^N, \qquad (11)
$$
where $t \in (0, T]$ and $\varphi \in \mathcal{B}_b\left(\mathbb{R}^N\right)$. We denote by $\|B\|_{0,T} = \sup_{0 \le t \le T} \|B(t, \cdot)\|_\infty$. In order to study (11), for every $0 < t_1 < t_2 \le T$, we consider the Banach space $\left(\Lambda^\gamma_1[t_1, t_2], \|\cdot\|_{\Lambda^\gamma_1[t_1,t_2]}\right)$ defined by
$$
\Lambda^\gamma_1[t_1, t_2] = \Big\{ V \colon [t_1, t_2] \times \mathbb{R}^N \to \mathbb{R} \text{ measurable} : V(\cdot, x) \in C([t_1, t_2]),\; x \in \mathbb{R}^N;\;
V(s, \cdot) \in C^1_b\left(\mathbb{R}^N\right),\; s \in [t_1, t_2];\; \sup_{s \in [t_1, t_2]} s^\gamma \|V(s, \cdot)\|_1 < \infty \Big\},
$$
$$
\|V\|_{\Lambda^\gamma_1[t_1,t_2]} = \sup_{s \in [t_1, t_2]} s^\gamma \|V(s, \cdot)\|_1, \qquad \text{where } \|V(s, \cdot)\|_1 = \|V(s, \cdot)\|_\infty + \left\|\partial^1 V(s, \cdot)\right\|_\infty.
$$
When $t_1 = 0$, we are careful to remove the left endpoint of the interval $[t_1, t_2]$ in the previous definitions, so that we will be working with the space $\left(\Lambda^\gamma_1(0, t_2], \|\cdot\|_{\Lambda^\gamma_1(0,t_2]}\right)$. The following theorem proves the well-posedness of (11). We refer to [8, Theorem 9.24] for an analogous result concerning the Kolmogorov forward equation in mild form associated with OU processes in infinite dimension corresponding to Brownian motions.
Theorem 1. Let $\alpha \in \left(\frac{1}{2}, 1\right)$ and let $B \colon [0, T] \times \mathbb{R}^N \to \mathbb{R}^N$ be a measurable and bounded vector field. Then for every $\varphi \in \mathcal{B}_b\left(\mathbb{R}^N\right)$ and $0 < t \le T$, there exists a unique solution $u^\varphi_s(t, x)$, $s \in [0, t]$, $x \in \mathbb{R}^N$, of (11) such that $u^\varphi_{t-\cdot}(t, \cdot) \in \Lambda^\gamma_1(0, t]$, where $\gamma = 1/(2\alpha)$.

Proof. Let us fix $\varphi \in \mathcal{B}_b\left(\mathbb{R}^N\right)$, $t \in (0, T]$, $\bar{s} \in (0, t]$ and introduce the map $\Gamma_1 \colon \Lambda^\gamma_1(0, \bar{s}] \to \Lambda^\gamma_1(0, \bar{s}]$ given by
$$
\Gamma_1 V(s, x) = R_{t-s,t}\varphi(x) + \int_{t-s}^t R_{t-s,r}\left\langle B(r, \cdot), \nabla^\top V(t-r, \cdot) \right\rangle(x)\, dr, \qquad 0 < s \le \bar{s}, \; x \in \mathbb{R}^N, \qquad (12)
$$
for every $V \in \Lambda^\gamma_1(0, \bar{s}]$. Notice that such a map is well defined and takes values in $\Lambda^\gamma_1(0, \bar{s}]$, thanks to the properties of $R$ discussed above, the dominated convergence theorem and the following computations based on (10):
$$
\sup_{x \in \mathbb{R}^N} \int_{t-s}^t \left| \partial_{x_j} R_{t-s,r}\left\langle B(r, \cdot), \nabla^\top V(t-r, \cdot) \right\rangle(x) \right| dr \le N C \|B\|_{0,T} \|V\|_{\Lambda^\gamma_1(0,\bar{s}]} \int_{t-s}^t \frac{dr}{(r - (t-s))^\gamma (t-r)^\gamma}
$$
$$
\le \frac{4^\gamma}{1-\gamma}\, N C \|B\|_{0,T} \|V\|_{\Lambda^\gamma_1(0,\bar{s}]}\, s^{1-2\gamma}, \qquad 0 < s \le \bar{s}, \; j = 1, \dots, N. \qquad (13)
$$
Here $C = C(\alpha, A, Q) > 0$ is the same constant as in (10), and the last inequality is obtained using the bound
$$
\int_{t-s}^t \frac{dr}{(r - (t-s))^\gamma (t-r)^\gamma} = \left\{ \int_{t-s}^{t-\frac{s}{2}} + \int_{t-\frac{s}{2}}^t \right\} \frac{dr}{(r - (t-s))^\gamma (t-r)^\gamma} = 2 \int_{t-s}^{t-\frac{s}{2}} \frac{dr}{(r - (t-s))^\gamma (t-r)^\gamma}
$$
$$
\le \frac{2}{1-\gamma} \frac{2^\gamma}{s^\gamma} \left(\frac{s}{2}\right)^{1-\gamma} = \frac{4^\gamma}{1-\gamma}\, s^{1-2\gamma}, \qquad (14)
$$
where for the second equality we perform the substitution $u = 2t - s - r$. Estimates similar to those in (13) allow us to write, for every $V_1, V_2 \in \Lambda^\gamma_1(0, \bar{s}]$,
$$
\sup_{x \in \mathbb{R}^N} \left| (\Gamma_1 V_1 - \Gamma_1 V_2)(s, x) \right| + \sup_{x \in \mathbb{R}^N} \left| \partial_{x_j} (\Gamma_1 V_1 - \Gamma_1 V_2)(s, x) \right|
\le \frac{4^\gamma}{1-\gamma}\, N \|B\|_{0,T} \left( s^{1-\gamma} + C s^{1-2\gamma} \right) \|V_1 - V_2\|_{\Lambda^\gamma_1(0,\bar{s}]}, \qquad 0 < s \le \bar{s}, \; j = 1, \dots, N.
$$
Hence we obtain
$$
\|\Gamma_1 V_1 - \Gamma_1 V_2\|_{\Lambda^\gamma_1(0,\bar{s}]} \le \frac{4^\gamma}{1-\gamma}\, N \|B\|_{0,T} \left( \bar{s} + C \bar{s}^{1-\gamma} \right) \|V_1 - V_2\|_{\Lambda^\gamma_1(0,\bar{s}]}. \qquad (15)
$$
This shows that, for $\bar{s}$ sufficiently small, the map $\Gamma_1$ is a contraction in $\Lambda^\gamma_1(0, \bar{s}]$: we denote by $V_1$ its unique fixed point. Now define
$$
u^\varphi_s(t, x) = R_{s,t}\varphi(x) + \int_s^t R_{s,r}\left\langle B(r, \cdot), \nabla^\top V_1(t-r, \cdot) \right\rangle(x)\, dr, \qquad t - \bar{s} \le s \le t, \; x \in \mathbb{R}^N, \qquad (16)
$$
and notice that $u^\varphi_{t-s}(t, x) = V_1(s, x)$, $0 < s \le \bar{s}$, $x \in \mathbb{R}^N$. Therefore $u^\varphi_\cdot(t, \cdot)$ is the unique, local solution of (11) (in the strip $[t - \bar{s}, t] \times \mathbb{R}^N$) such that $u^\varphi_{t-\cdot}(t, \cdot) \in \Lambda^\gamma_1(0, \bar{s}]$.

At this point, we can repeat the same procedure to construct the solution of (11) in the interval $[t - 2\bar{s}, t - \bar{s}]$, because the relation among constants in (15), which is necessary to get a contraction, does not depend on the initial condition. Specifically, we take $\varphi_1 = u^\varphi_{t-\bar{s}}(t, \cdot) \in C^1_b\left(\mathbb{R}^N\right)$ and define the map
$$
\Gamma_2 V(s, x) = R_{t-s,t-\bar{s}}\varphi_1(x) + \int_{t-s}^{t-\bar{s}} R_{t-s,r}\left\langle B(r, \cdot), \nabla^\top V(t-r, \cdot) \right\rangle(x)\, dr, \qquad \bar{s} \le s \le 2\bar{s}, \; x \in \mathbb{R}^N,
$$
for every $V \in \Lambda^\gamma_1[\bar{s}, 2\bar{s}]$. Computations analogous to the ones in the previous step show that $\Gamma_2 \colon \Lambda^\gamma_1[\bar{s}, 2\bar{s}] \to \Lambda^\gamma_1[\bar{s}, 2\bar{s}]$ is a contraction: its unique fixed point is denoted by $V_2$. Then we set
$$
u^{\varphi_1}_s(t - \bar{s}, x) = R_{s,t-\bar{s}}\varphi_1(x) + \int_s^{t-\bar{s}} R_{s,r}\left\langle B(r, \cdot), \nabla^\top V_2(t-r, \cdot) \right\rangle(x)\, dr, \qquad t - 2\bar{s} \le s \le t - \bar{s}, \; x \in \mathbb{R}^N;
$$
notice that $u^{\varphi_1}_{t-s}(t - \bar{s}, x) = V_2(s, x)$, $\bar{s} \le s \le 2\bar{s}$, $x \in \mathbb{R}^N$, and that, by the definition of $\varphi_1$, one has $u^{\varphi_1}_{t-\bar{s}}(t - \bar{s}, \cdot) = u^\varphi_{t-\bar{s}}(t, \cdot)$. Now we extend the function $u^\varphi_s(t, x)$ in (16) by assigning
$$
u^\varphi_s(t, x) =
\begin{cases}
u^\varphi_s(t, x), & t - \bar{s} \le s \le t,\\
u^{\varphi_1}_s(t - \bar{s}, x), & t - 2\bar{s} \le s \le t - \bar{s},
\end{cases}
\qquad x \in \mathbb{R}^N.
$$
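As an aside (a numerical sanity check, not part of the paper): substituting $r = (t-s) + s v$ in the singular integral of (14) turns it into $s^{1-2\gamma} B(1-\gamma, 1-\gamma)$, with $B$ the Euler Beta function, so (14) amounts to the scalar inequality $B(1-\gamma, 1-\gamma) \le 4^\gamma/(1-\gamma)$. This can be verified numerically for the relevant range $\gamma = 1/(2\alpha) \in \left(\frac{1}{2}, 1\right)$:

```python
import math

def beta(a, b):
    """Euler Beta function via the Gamma function."""
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

def bound_14(gamma):
    """Right-hand side constant 4**gamma / (1 - gamma) from (14)."""
    return 4.0 ** gamma / (1.0 - gamma)

# For gamma = 1/(2*alpha) with alpha in (1/2, 1), gamma ranges over (1/2, 1);
# compare the exact value of the integral's constant with the bound of (14).
checks = {g: (beta(1.0 - g, 1.0 - g), bound_14(g)) for g in (0.55, 0.6, 0.75, 0.9, 0.99)}
ok = all(exact <= bound for exact, bound in checks.values())
```

The comparison confirms that the elementary splitting argument in (14) loses only a moderate constant factor against the exact Beta value.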