embedded as a normally-stable elliptic slow manifold in a nearly-periodic Hamiltonian system8. Highly-oscillatory Hamiltonian
systems exhibit two basic structural properties whose interactions play a crucial role in their long-term dynamics. First is
preservation of the symplectic form, as for all Hamiltonian systems. Second is timescale separation, corresponding to the
relatively short timescale of oscillations compared with slower secular drifts. Coexistence of these two structural properties
implies the existence of an adiabatic invariant8–11. Adiabatic invariants differ from true constants of motion, such as the energy, which do not change at all over arbitrary time intervals. Instead, adiabatic invariants are conserved with
limited precision over very large time intervals. There are no learning frameworks available today that exactly preserve the
two structural properties whose interplay gives rise to adiabatic invariants. This work addresses this challenge by exploiting
a recently-developed theory of nearly-periodic symplectic maps11, which can be thought of as discrete-time analogues of
highly-oscillatory Hamiltonian systems9.
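To make adiabatic invariance concrete, the sketch below integrates a harmonic oscillator whose frequency drifts slowly, with Hamiltonian H(q, p, t) = p²/2 + ω(εt)²q²/2; the specific frequency profile, step size, and integration window are illustrative choices, not taken from this paper. The energy changes by tens of percent over the slow timescale, while the adiabatic invariant J = E/ω stays nearly constant.

```python
import numpy as np

eps = 0.01                                 # timescale-separation parameter
omega = lambda t: 1.0 + 0.5 * eps * t      # slowly drifting frequency (illustrative)

def verlet(q, p, t, dt):
    """One velocity-Verlet step for the time-dependent force f = -omega(t)^2 q."""
    p_half = p - 0.5 * dt * omega(t) ** 2 * q
    q_new = q + dt * p_half
    p_new = p_half - 0.5 * dt * omega(t + dt) ** 2 * q_new
    return q_new, p_new

def energy(q, p, t):
    return 0.5 * p ** 2 + 0.5 * omega(t) ** 2 * q ** 2

q, p, t, dt = 1.0, 0.0, 0.0, 1e-3
E0 = energy(q, p, t)
J0 = E0 / omega(t)                         # adiabatic invariant J = E / omega
while t < 1.0 / eps:                       # integrate across the slow timescale ~ 1/eps
    q, p = verlet(q, p, t, dt)
    t += dt
E1 = energy(q, p, t)
J1 = E1 / omega(t)
```

Here E drifts with ω (roughly 50% over the run), while J returns to within a few percent of its initial value, illustrating conservation "with limited precision over very large time intervals".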
As a result of being symplectic, a mapping assumes a number of special properties. In particular, symplectic mappings are
closely related to Hamiltonian systems: the flow of any Hamiltonian system is symplectic12, and any symplectic flow corresponds locally to an appropriate Hamiltonian system13. It is well-known that preserving the symplecticity of a Hamiltonian system when constructing a discrete approximation of its flow map ensures the preservation of many qualitative features of the dynamics, such as near-conservation of energy, and leads to physically well-behaved discrete solutions over exponentially-long time intervals13–17. It is thus important to have structure-preserving neural network architectures which can learn symplectic maps
and ensure that the learned surrogate map preserves symplecticity. Many physics-informed and structure-preserving machine
learning approaches have recently been proposed to learn Hamiltonian dynamics and symplectic maps2,3,18–35. In particular, Hénon Neural Networks (HénonNets)2 can approximate any symplectic map arbitrarily well via compositions of simple yet
expressive elementary symplectic mappings called Hénon-like mappings. In the numerical experiments conducted in this
paper, HénonNets2 will be our preferred choice of symplectic map approximator to use as a building block in our framework
for approximation of nearly-periodic symplectic maps, although some of the other approaches listed above for approximating
symplectic mappings can be used within our framework as well.
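As a concrete illustration of this building block, the sketch below implements a Hénon-like layer of the form (x, y) ↦ (y + η, −x + ∇V(y)), with fixed polynomial potential gradients standing in for learnable networks; the layer form follows the HénonNet construction, but the specific potentials, shifts, and evaluation point are illustrative assumptions. Each layer is exactly symplectic for any smooth V, since its Jacobian has unit determinant, and compositions of symplectic maps remain symplectic.

```python
# Hénon-like layer: (x, y) -> (y + eta, -x + dV(y)), where dV is the gradient of
# a scalar potential V. Its Jacobian [[0, 1], [-1, dV'(y)]] has determinant 1,
# so the layer is exactly symplectic (area-preserving) regardless of dV.
def henon_layer(x, y, dV, eta):
    return y + eta, -x + dV(y)

def henon_net(x, y, layers):
    """Compose Hénon-like layers; a composition of symplectic maps is symplectic."""
    for dV, eta in layers:
        x, y = henon_layer(x, y, dV, eta)
    return x, y

# Stand-ins for "learned" potential gradients: dV(y) = a*y^2 (i.e. V = a*y^3/3).
layers = [(lambda y, a=a: a * y ** 2, eta)
          for a, eta in [(0.3, 0.1), (-0.2, 0.0), (0.1, -0.05), (0.25, 0.2)]]

def jacobian_det(f, x, y, h=1e-6):
    """Numerically verify symplecticity via the Jacobian determinant (central differences)."""
    fx1, fy1 = f(x + h, y); fx0, fy0 = f(x - h, y)
    gx1, gy1 = f(x, y + h); gx0, gy0 = f(x, y - h)
    dXdx, dYdx = (fx1 - fx0) / (2 * h), (fy1 - fy0) / (2 * h)
    dXdy, dYdy = (gx1 - gx0) / (2 * h), (gy1 - gy0) / (2 * h)
    return dXdx * dYdy - dXdy * dYdx

det = jacobian_det(lambda x, y: henon_net(x, y, layers), 0.7, -0.4)
```

The determinant evaluates to 1 up to finite-difference error, confirming that the composed network is symplectic by construction rather than by training.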
As shown by Kruskal9, every nearly-periodic system, Hamiltonian or not, admits an approximate U(1)-symmetry, determined to leading order by the unperturbed periodic dynamics. It is well-known that a Hamiltonian system which admits a
continuous family of symmetries also admits a corresponding conserved quantity. It is thus not surprising that a nearly-periodic
Hamiltonian system, which admits an approximate symmetry, must also have an approximate conservation law11, and the
approximately conserved quantity is referred to as an adiabatic invariant.
Nearly-periodic maps, first introduced by Burby et al.11, are natural discrete-time analogues of nearly-periodic systems,
and have important applications to numerical integration of nearly-periodic systems. Nearly-periodic maps may also be used
as tools for structure-preserving simulation of non-canonical Hamiltonian systems on exact symplectic manifolds11, which
have numerous applications across the physical sciences. Non-canonical Hamiltonian systems play an especially important role in modeling weakly-dissipative plasma systems36–42. Similarly to the continuous-time case, nearly-periodic maps with
a Hamiltonian structure (that is, symplecticity) admit an approximate symmetry and, as a result, also possess an adiabatic invariant11. The adiabatic invariants that our networks target arise only in purely Hamiltonian systems. Just as dissipation breaks the link between symmetries and conservation laws in Hamiltonian systems, it also breaks the link between approximate symmetries and approximate conservation laws. We are not considering systems with symmetries that are broken by dissipation or some other mechanism, but rather systems which possess approximate
symmetries. This should be contrasted with other frameworks43–45 which develop machine learning techniques for systems that
explicitly include dissipation.
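The mechanism just described can be seen in a toy nearly-periodic map of our own construction (not the architecture proposed in this paper): a rigid rotation of the phase plane, whose ε = 0 limit is exactly the U(1) action of rotations, composed with an O(ε) area-preserving kick. The action I = (x² + y²)/2 then plays the role of the adiabatic invariant, wandering only by O(ε) over ~1/ε iterations; the rotation angle, kick, and iteration count below are illustrative choices.

```python
import numpy as np

eps = 0.01
theta = 2 * np.pi * (np.sqrt(5) - 1) / 2   # non-resonant rotation angle (golden mean)

def F(x, y):
    """Toy nearly-periodic symplectic map: a rigid rotation (the periodic, U(1)-symmetric
    epsilon = 0 part) followed by an O(eps) shear-like kick; both factors preserve area."""
    xr = np.cos(theta) * x - np.sin(theta) * y
    yr = np.sin(theta) * x + np.cos(theta) * y
    return xr, yr + eps * np.sin(xr)       # kick Jacobian [[1, 0], [eps*cos(xr), 1]], det 1

x, y = 1.0, 0.0
I0 = 0.5 * (x ** 2 + y ** 2)               # candidate adiabatic invariant (the action)
devs = []
for _ in range(int(2 / eps)):              # iterate over the slow timescale ~ 1/eps
    x, y = F(x, y)
    devs.append(abs(0.5 * (x ** 2 + y ** 2) - I0))
max_dev = max(devs)
```

Because the kick's effect on the action averages out along the non-resonant rotation, the deviation of I from its initial value stays small over the whole run, while no quantity of this map is exactly conserved.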
We note that neural network architectures designed for multi-scale dynamics and long-time dependencies are available46,
and that many authors have introduced numerical algorithms specifically designed to efficiently step over high-frequency
oscillations47–49. However, the problem of developing surrogate models for dynamical systems that avoid resolving short
oscillations remains open. Such surrogates would accelerate optimization algorithms that require querying the dynamics of
an oscillatory system during the optimizer’s “inner loop”. The network architecture presented in this article represents a first
important step toward a general solution of this problem. Some of its advantages are that it aims to learn a fast surrogate
model that can resolve long-time dynamics using very short time data, and that it is guaranteed to enjoy symplectic universal
approximation within the class of nearly periodic maps. As developed in this paper, our method applies to dynamical systems
that exhibit a single fast mode of oscillation. In particular, when initial conditions for the surrogate model are selected on the
zero level set of the learned adiabatic invariant, the network automatically integrates along the slow manifold50–54. While our
network architecture generalizes in a straightforward manner to handle multiple non-resonant modes, it cannot be applied to
dynamical systems that exhibit resonant surfaces.