computational chemistry with fewer, noisier qubits, but
the number of measurements needed to estimate energies
is often significant, making scaling of the algorithm to
large molecules a challenge [22–25].
The challenges for both QPE and VQE worsen even
further when they are considered in the context of solving
practical problems in quantum chemistry. The electronic
structure problem assumes a fixed molecular configuration,
and therefore a fixed molecular Hamiltonian, of which
we compute the ground state. In order to determine
many dynamic or structural properties of molecules, such
as reaction barriers and optimal geometries, a molecule
must be studied in many different configurations. In general, characterizing this behaviour requires knowledge of a family of ground states for a set of Hamiltonians parameterized by classical nuclear coordinates H(R). This is an arduous task, which requires computing many different ground states with corresponding energies lying on a high-dimensional potential energy surface. Running quantum algorithms such as QPE or VQE for each configuration independently would represent a significant computational cost even for molecules with a modest number of atoms.
We propose in this paper an alternative method for
computing ground states corresponding to a wide range
of molecular configurations, i.e., for reconstructing potential energy surfaces of molecules. A key motivation
behind this work is that while the fixed nuclei electronic
structure problem is already difficult, the ultimate goal of
using quantum computers to compute accurate electronic
energies requires sampling over many different nuclear
configurations in a proposed chemical reaction coordinate.
Therefore, the generation of many ground states, and
subsequently energies and other properties, is of central
interest in taking advantage of quantum computers for
designing new materials and technologies. Instead of
computing the ground states for many discrete molecular
configurations independently, our algorithm uses a limited
collection of data and, employing techniques from machine learning, builds a model that prepares the ground
state over some region in parameter space.
Our work connects with recent progress for the task
of learning from quantum mechanical data with both
classical and quantum methods, an increasingly active
area of research. Notable results include demonstration
that classical machine learning techniques are provably
efficient for predicting and modelling certain properties
of quantum many-body systems [26], and that quantum learning procedures can be more efficient than classical learning procedures for determining specific properties of certain unknown quantum states and processes [26–28].
Our proposed algorithm shares some similarities with the
above works, with the key difference being that the output
of our model is a quantum state, rather than an estimate
of some observable quantity or a classical approximation
of a quantum state [29, 30]. Therefore, algorithms of the form we propose can be used to extract arbitrary ground state observables, or to output states that can be used in other quantum computational procedures requiring access to ground states. From the perspective of
Ref. [31], this model falls into the quantum-quantum or “QQ” category of machine learning techniques: a quantum model trained with quantum data. It complements the growing literature on using quantum data and quantum machine learning to understand quantum systems (in particular, quantum chemical systems). Existing examples of QQ machine learning include learning excited states from ground states [32], compression of quantum data [33], and learning of parametrized Hamiltonians [34], which has been applied to spin and molecular Hamiltonians under the name quantum meta-learning [35].
The algorithm proposed in this work is a hybrid
classical-quantum generative model, in which we train a
classical neural network to yield parameters which, when fed into a low-depth variational quantum circuit, approximate the corresponding ground state of H(R) for a range of values of R. To train our model, we assume access to quantum data: ground states of H(R) for a collection of coordinates {R_i}_{i=1}^N, which can be loaded into a quantum computer. Since ground state preparation is a resource-intensive task, the amount of quantum data needed to
learn a model is a key metric for quantifying the efficiency
and the feasibility of the algorithm. Ideally, a model of
this form should generalize to new values of R not contained in the training data. This allows us to generate
approximations of previously unseen molecular ground
states, at the more modest price of executing a shallow,
variational quantum circuit for some set of parameters
determined by a classical neural network.
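To make this pipeline concrete, the following is a deliberately minimal sketch, not the model of this paper: it replaces the molecular Hamiltonian with a hypothetical one-qubit family H(R) = -Z - R·X, the classical neural network with a single affine neuron θ(R) = wR + b, and the variational circuit with one RY rotation; the finite-difference training loop and all names are illustrative.

```python
import numpy as np

# Hypothetical one-qubit stand-in for a molecular Hamiltonian H(R),
# parameterized by a single "nuclear coordinate" R.
Z = np.array([[1.0, 0.0], [0.0, -1.0]])
X = np.array([[0.0, 1.0], [1.0, 0.0]])

def hamiltonian(R):
    return -Z - R * X

def circuit_state(theta):
    # One-gate "variational circuit": RY(theta) applied to |0>.
    return np.array([np.cos(theta / 2), np.sin(theta / 2)])

def model_theta(R, w, b):
    # Minimal "neural network": one affine neuron mapping R to a circuit parameter.
    return w * R + b

def energy(R, w, b):
    psi = circuit_state(model_theta(R, w, b))
    return psi @ hamiltonian(R) @ psi

# A handful of training coordinates R_i (in the actual algorithm, ground states
# at these geometries would be supplied as quantum data).
R_train = np.array([0.2, 0.5, 1.0, 1.5])

# Train by minimizing the mean energy over the training set with
# finite-difference gradient descent (the paper's cost function is more refined).
w, b = 0.1, 0.1
eps, lr = 1e-5, 0.1
cost = lambda w_, b_: np.mean([energy(R, w_, b_) for R in R_train])
for _ in range(2000):
    gw = (cost(w + eps, b) - cost(w - eps, b)) / (2 * eps)
    gb = (cost(w, b + eps) - cost(w, b - eps)) / (2 * eps)
    w, b = w - lr * gw, b - lr * gb

# Generalization: compare the model energy against exact diagonalization
# at a coordinate not contained in the training data.
R_test = 0.8
exact = np.linalg.eigvalsh(hamiltonian(R_test))[0]
print(abs(energy(R_test, w, b) - exact))  # small error at an unseen geometry
```

For this toy family the optimal circuit parameter is θ*(R) = arctan(R), which the affine neuron fits well over the training window, so the model interpolates to unseen R at the cost of one shallow circuit evaluation.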
To test these capabilities, we perform extensive numerical experiments for a collection of different molecules, and find that even with few data points, there is good generalization to unseen geometries in the potential energy surface. Ultimately, the aim of our proposal is to provide a concrete first step towards the development of practical techniques based on quantum machine learning for alleviating the cost of computing ground states of a parameterized molecular Hamiltonian. Under this framework, hard-to-obtain quantum data, originating from quantum algorithms or physical experiments [36], is the resource that we attempt to leverage.
It is important to note that the particular generative model proposed is only one member of a large family of quantum-classical machine learning architectures for preparing ground states. More advanced models may utilize more sophisticated choices of cost function, classical neural network architecture, initialization, and circuit construction than those considered in this paper. The goal of this work is both to discuss the general concept of generative quantum machine learning applied to quantum chemistry and to provide a concrete example of what such a model would look like. As a result, we explore both practicalities associated with the particular model, including gradient sample complexity and numerical experiments, and more general considerations about abstract quantum state-learning procedures.
We begin in Sec. II by outlining a general architecture