
conditions, then to a 2D example on the flat torus T², but with more complex initial conditions.
While these are toy examples, they demonstrate the value of our method in comparison to existing
approaches—namely that we can exactly satisfy the continuity equation and preserve exact mass.
The Euler equations of incompressible flow
The incompressible Euler equations [Feynman et al., 1989] form an idealized model of inviscid fluid flow, governed by the system of partial differential equations¹

∂ρ/∂t + div(ρu) = 0,    ∂u/∂t + ∇_u u = −∇p/ρ,    div(u) = 0        (10)

in three unknowns: the fluid velocity u(t, x) ∈ R³, the pressure p(t, x), and the fluid density ρ(t, x).
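As a concrete sanity check on (10) — our illustration, not part of the paper — the steady shear flow u = (sin y, 0, 0) with constant density and constant pressure satisfies all three equations: div(u) = 0 and the convective term (u·∇)u vanishes. A minimal numpy sketch verifies both residuals with central finite differences:

```python
import numpy as np

# Toy steady shear flow (not from the paper): u = (sin y, 0, 0),
# constant density rho = 1 and constant pressure p. We check the
# residual terms of equation (10) at a sample point.

h = 1e-5

def u(x):
    return np.array([np.sin(x[1]), 0.0, 0.0])

def div_u(x):
    # div(u) by central differences in each coordinate direction.
    return sum(
        (u(x + h * e)[i] - u(x - h * e)[i]) / (2 * h)
        for i, e in enumerate(np.eye(3))
    )

def conv(x):
    # Convective term (u . grad)u = [Du](u); J[i, j] = du_i/dx_j.
    J = np.stack([(u(x + h * e) - u(x - h * e)) / (2 * h) for e in np.eye(3)], axis=1)
    return J @ u(x)

x = np.array([0.3, 1.1, -0.4])
# Continuity holds (rho constant, div(u) = 0) and the momentum
# equation holds ((u.grad)u = 0, grad p = 0).
print(abs(div_u(x)) < 1e-6, np.linalg.norm(conv(x)) < 1e-6)
```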
While the fluid velocity and density are usually given at t = 0, the initial pressure is not required.
Typically, on a bounded domain Ω ⊆ Rⁿ, these are supplemented by the free-slip boundary condition and initial conditions

u · n = 0 on ∂Ω,    u(0, x) = u₀ and ρ(0, x) = ρ₀ on Ω        (11)
The density ρ plays a critical role since, in addition to being a conserved quantity, it influences the
dynamics of the fluid evolution over time. In numerical simulations, satisfying the continuity equation
as closely as possible is desirable since the equations in (10) are coupled. Error in the density feeds
into error in the velocity and then back into the density over time. In the finite element literature,
a great deal of effort has been put towards developing conservative schemes that preserve mass
(or energy in the more general compressible case)—see Guermond and Quartapelle [2000] and the
introduction of Almgren et al. [1998] for an overview. But since the application of physics informed
neural networks (PINNs) to fluid problems is much newer, conservative constraints have only been
incorporated as penalty terms into the loss [Mao et al., 2020, Jin et al., 2021].
5.1 Physics informed neural networks
Physics-informed neural networks (PINNs; Raissi et al. [2019, 2017]) have recently received renewed attention as an application of deep neural networks. While using neural networks as approximate solutions to PDEs had been explored previously (e.g., in Lagaris et al. [1998]), modern advances in automatic differentiation algorithms have made their application to much more complex problems feasible [Raissi et al., 2019]. The "physics" in the name derives from the incorporation of physical terms into the loss function, which consists of adding the squared residual norms of the PDE. For example,
to train a neural network φ = [ρ, p, u] to satisfy the Euler equations, the standard choice of loss to fit to is

L_F = ‖∂u/∂t + ∇_u u + ∇p/ρ‖²_Ω        L_div = ‖div(u)‖_Ω        L_I = ‖u(0, ·) − u₀(·)‖²_Ω + ‖ρ(0, ·) − ρ₀(·)‖_Ω

L_Cont = ‖∂ρ/∂t + div(ρu)‖²_Ω        L_G = ‖u · n‖²_∂Ω        L_total = γ · [L_F, L_I, L_div, L_Cont, L_G]
where γ = (γ_F, γ_I, γ_div, γ_Cont, γ_G) denotes suitable coefficients (hyperparameters). The loss term L_G
ensures fluid does not pass through boundaries, when they are present. Similar approaches were
taken in [Mao et al.,2020] and [Jagtap et al.,2020] for related equations. While schemes of this
nature are very easy to implement, they have the drawback that since PDE terms are only penalized
and not strictly enforced, one cannot make guarantees as to the properties of the solution.
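To illustrate how such penalty terms behave (a hypothetical sketch; the name mc_div_penalty and the toy fields are ours, not the paper's), one can Monte Carlo estimate L_div over the unit ball and observe that the penalty is zero only when the field is actually divergence-free — otherwise it merely contributes a nonzero term to be traded off against the other losses:

```python
import numpy as np

# Monte Carlo estimate of a divergence penalty L_div = ||div(u)||^2
# over the unit ball, for a divergence-free and a non-divergence-free
# toy velocity field. Illustrative only; not the paper's code.

rng = np.random.default_rng(0)
h = 1e-5

def mc_div_penalty(u, n=2000):
    # Uniform collocation points in the unit ball B(0, 1):
    # random direction times radius ~ U[0,1]^(1/3).
    pts = rng.normal(size=(n, 3))
    pts *= rng.uniform(size=(n, 1)) ** (1 / 3) / np.linalg.norm(pts, axis=1, keepdims=True)
    eye = np.eye(3)
    divs = [
        sum((u(x + h * eye[i])[i] - u(x - h * eye[i])[i]) / (2 * h) for i in range(3))
        for x in pts
    ]
    return np.mean(np.square(divs))

u_free = lambda x: np.array([x[1], -x[0], 0.0])   # rigid rotation, div = 0
u_bad  = lambda x: x                              # div = 3 everywhere

print(mc_div_penalty(u_free) < 1e-8, abs(mc_div_penalty(u_bad) - 9.0) < 1e-6)
```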
To showcase the ability of our method to model conservation laws, we parameterize the density and vector field jointly as v = [ρ, ρu], as detailed in Section 3. This means we can omit the term L_Cont described in Section 5.1 from the training loss. The divergence penalty L_div remains when modeling incompressible fluids, since u is not necessarily itself divergence-free; it is v = [ρ, ρu] which is divergence-free. To stabilize training, we can modify the loss terms L_F, L_G, and L_I to avoid division by ρ. This is detailed in Appendix B.2.
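The reason L_Cont can be dropped is structural: when v = [ρ, ρu] comes from a construction that is divergence-free in spacetime, the continuity equation holds identically, not just approximately. A minimal sketch of this idea in 1 space + 1 time dimension (our simplification using a stream-function potential ψ, not the paper's Section 3 construction): any smooth ψ yields ρ = ∂ψ/∂x and ρu = −∂ψ/∂t, so ∂ρ/∂t + ∂(ρu)/∂x = 0 by equality of mixed partials.

```python
import numpy as np

# 1+1-dimensional illustration (not the paper's construction): build
# (rho, rho*u) from a scalar potential psi so that the continuity
# equation holds identically, for any choice of psi.

h = 1e-4
psi = lambda t, x: np.sin(3 * t) * np.cos(2 * x) + 0.5 * t * x**2

rho   = lambda t, x:  (psi(t, x + h) - psi(t, x - h)) / (2 * h)   # d(psi)/dx
rho_u = lambda t, x: -(psi(t + h, x) - psi(t - h, x)) / (2 * h)   # -d(psi)/dt

def continuity_residual(t, x):
    # d(rho)/dt + d(rho*u)/dx; cancels by symmetry of mixed partials.
    drho_dt  = (rho(t + h, x) - rho(t - h, x)) / (2 * h)
    dflux_dx = (rho_u(t, x + h) - rho_u(t, x - h)) / (2 * h)
    return drho_dt + dflux_dx

print(abs(continuity_residual(0.7, -0.2)) < 1e-6)
```

Note that no property of ψ was used: the residual cancels exactly, which is the discrete analogue of satisfying the continuity equation by construction rather than by penalty.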
5.2 Incompressible variable density inside the 3D unit ball
We first construct a simple example within a bounded domain. Specifically, we consider the Euler equations inside B(0, 1) ⊆ R³, with the initial conditions

ρ(0, x) = 3/2 − ‖x‖²,    v(0, x) = (−2, x₀ − 1, 1/2)        (12)
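As a quick numerical observation on (12) — our check, not from the paper — the initial density stays within [1/2, 3/2] on the unit ball, so it is bounded away from zero and divisions by ρ are well behaved at t = 0:

```python
import numpy as np

# Sample points uniformly in B(0, 1) and evaluate the initial density
# rho0(x) = 3/2 - ||x||^2 from equation (12). Illustrative check only.

rng = np.random.default_rng(1)
pts = rng.normal(size=(5000, 3))
pts *= rng.uniform(size=(5000, 1)) ** (1 / 3) / np.linalg.norm(pts, axis=1, keepdims=True)

rho0 = 1.5 - np.sum(pts**2, axis=1)
# Inside the ball ||x|| < 1, so rho0 lies strictly between 1/2 and 3/2.
print(rho0.min() > 0.5, rho0.max() <= 1.5)
```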
¹The convective derivative appearing in equation (10),

∇_u u(x) = lim_{h→0} (u(x + h·u(x)) − u(x)) / h = [Du](u),

is also often written as (u · ∇)u.
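The two forms in the footnote agree: the directional limit along u(x) equals the Jacobian of u applied to u. A small numpy check (with an arbitrary toy field, not from the paper) compares the limit definition against [Du](u) with a hand-computed Jacobian:

```python
import numpy as np

# Compare the limit definition of the convective derivative with
# [Du](u) for the toy field u(x) = (x1*x2, x0, x2**2).

def u(x):
    return np.array([x[1] * x[2], x[0], x[2] ** 2])

def conv_limit(x, h=1e-6):
    # (u(x + h*u(x)) - u(x)) / h, the footnote's limit for small h.
    return (u(x + h * u(x)) - u(x)) / h

def conv_analytic(x):
    # [Du](u) with the Jacobian J[i, j] = du_i/dx_j worked out by hand.
    J = np.array([[0.0, x[2], x[1]],
                  [1.0, 0.0, 0.0],
                  [0.0, 0.0, 2 * x[2]]])
    return J @ u(x)

x = np.array([0.5, -1.0, 2.0])
print(np.allclose(conv_limit(x), conv_analytic(x), atol=1e-4))
```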