optimization problems [23, 22, 71]. The initial interest from the operations research community was significant. However, careful comparisons with both complete search solvers [59, 67, 18] and specialized heuristics [73, 8, 72, 56, 55, 70, 36, 2] showed that the available QA hardware fell well short of state-of-the-art optimization methods. These results tempered the excitement around the QA computing model and reduced interest from the operations research community. Since that initial excitement waned, QA hardware has steadily improved, now featuring better noise characteristics [78, 79, 47] and supporting optimization problems more than fifty times larger than what was possible in 2013 [58].
Since 2017, we have been using the benchmarking practices of operations research to track the performance of QA hardware platforms and compare the results with established optimization algorithms [11, 66].
In previous studies of this type, this benchmarking approach ruled out any potential performance benefit
for using available QA hardware platforms in hybrid optimization algorithms and practical applications, as
established algorithms outperformed or were competitive with the QA hardware in both solution quality
and computation time. However, in this work, we report that with the release of D-Wave Systems’ most
recent Advantage Performance Update computer in 2021, our benchmarking approach can no longer rule out
a potential run time performance benefit for this hardware. In particular, we show that there exist classes
of combinatorial optimization problems where this QA hardware finds high-quality solutions around fifty
times faster than a wide range of heuristic algorithms under best-case QA communication overheads and
around fifteen times faster under real-world QA communication overheads. This work thus provides compelling evidence that quantum computing technology has the potential for accelerating certain combinatorial
optimization tasks. This represents an important and necessary first condition for demonstrating that QA
hardware can have an impact on solving practical optimization problems.
Although this work demonstrates encouraging results for the QA computing model, we also emphasize
that it does not provide evidence of a fundamental or irrefutable performance benefit for this technology.
Indeed, it is quite possible that dramatically different heuristic algorithms [21, 63] or alternative hardware
technologies [60, 32, 57, 53] can reduce the run time performance benefits observed in this work. We
look forward to and encourage ongoing research into benchmarking the QA computing model, as closing
the performance gap presented in this work would provide significant algorithmic insights into heuristic
optimization methods, benefiting a variety of practical optimization tasks.
This work begins with a brief introduction to the types of combinatorial optimization problems that can
be solved with QA hardware and the established benchmarking methodology in Section 2. It then presents a
summary of the key outcomes from a large-scale benchmarking study in Section 3, which required hundreds
of hours of compute time. In Section 4, the paper concludes with some discussion of the limitations of our
results and future opportunities for QA hardware in combinatorial optimization. Additional details regarding
the experimental design, as well as further analyses of computational results, are provided in the appendices.
2 Quantum Annealing for Combinatorial Optimization
Available QA hardware is designed to perform optimization of a class of problems known as Ising models,
which have historically been used as fundamental modeling tools in statistical mechanics [28]. Ising models
are characterized by the following quadratic energy (or objective) function of $\mathcal{N} = \{1, 2, \ldots, n\}$ discrete spin variables, $\sigma_i \in \{-1, 1\}, \forall i \in \mathcal{N}$:
\begin{equation}
E(\sigma) = \sum_{(i,j) \in \mathcal{E}} J_{ij} \sigma_i \sigma_j + \sum_{i \in \mathcal{N}} h_i \sigma_i, \tag{1}
\end{equation}
where the parameters $J_{ij}$ and $h_i$ define the quadratic and linear coefficients of this function, respectively. The edge set $\mathcal{E} \subseteq \mathcal{N} \times \mathcal{N}$ is used to encode a specific sparsity pattern in the Ising model, which is
determined by the physical system being considered. The optimization task of interest is to find the lowest
energy configuration(s) of the Ising model, i.e.,
\begin{equation}
\begin{aligned}
\underset{\sigma}{\text{minimize}} \quad & E(\sigma) \\
\text{subject to} \quad & \sigma_i \in \{-1, 1\}, \ \forall i \in \mathcal{N}.
\end{aligned}
\tag{2}
\end{equation}
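To make the objective in (1) and the task in (2) concrete, the following minimal Python sketch evaluates the Ising energy of a spin configuration and solves the minimization by brute-force enumeration on a toy instance; the couplings $J_{ij}$, fields $h_i$, and variable indices shown are illustrative assumptions, not instances from the benchmarking study.
\begin{verbatim}
import itertools

# Toy Ising instance (hypothetical values, not from the benchmarking study):
# J holds quadratic couplings J_ij on the edge set E; h holds linear fields h_i.
J = {(1, 2): -1.0, (2, 3): 0.5, (1, 3): 0.25}
h = {1: 0.1, 2: -0.2, 3: 0.0}
spins = sorted(h)  # the variable index set N = {1, 2, 3}

def energy(sigma):
    """Evaluate the Ising objective E(sigma) of Eq. (1) for a {-1,+1} assignment."""
    quadratic = sum(J_ij * sigma[i] * sigma[j] for (i, j), J_ij in J.items())
    linear = sum(h_i * sigma[i] for i, h_i in h.items())
    return quadratic + linear

# Solve Eq. (2) by brute force: enumerate all 2^n spin configurations.
best = min(
    (dict(zip(spins, values))
     for values in itertools.product((-1, 1), repeat=len(spins))),
    key=energy,
)
print("minimum energy:", energy(best), "configuration:", best)
\end{verbatim}
In practice, exhaustive enumeration is viable only for a handful of spins; the QA hardware and heuristic algorithms benchmarked in this work operate on instances that are orders of magnitude larger.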
At first glance, the lack of constraints and limited types of variables make this optimization task appear
distant from real-world applications. However, the optimization literature on quadratic unconstrained binary