DINGDING DONG∗ AND NITYA MANI†
of length L has no gap of size 2 − o(1/L) but does have gaps of size 2 − 1/L^{1−ε} for all ε > 0. More precisely,
we prove the following result.
Theorem 1.3. The following holds in the saturated configuration of a one-dimensional hard-core process on
an interval of length L packed by rods of length 2, for L sufficiently large:
• with high probability, there are no gaps of size 2 − o(1/L);
• for all a > 0, with positive probability, there exists a gap of size at least 2 − a/L;
• for all ε > 0, with high probability, there exists a gap of size at least 2 − 1/L^{1−ε}.
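As an illustration of the saturated configurations that Theorem 1.3 concerns, the classical process can be simulated by Rényi's standard recursion: conditioned on the first accepted rod, its left endpoint is uniform on [0, L − 2], and the two remaining subintervals then fill independently. The following is a minimal Python sketch (function names are ours, not from the paper), reporting the largest gap at saturation:

```python
import random

def saturate(a, b, rng):
    """Pack rods of length 2 into [a, b] until saturation, via
    Renyi's recursion: place one rod uniformly at random, then
    fill the two remaining subintervals independently.
    Returns the left endpoints of the packed rods."""
    if b - a < 2:
        return []  # no room for another rod
    x = rng.uniform(a, b - 2)  # left endpoint of the new rod [x, x + 2]
    return saturate(a, x, rng) + [x] + saturate(x + 2, b, rng)

def max_gap(L, seed=None):
    """Largest uncovered stretch in one saturated sample on [0, L]."""
    rng = random.Random(seed)
    rods = sorted(saturate(0.0, L, rng))
    # alternate uncovered/covered: [0, x1], [x1, x1+2], [x1+2, x2], ...
    edges = [0.0] + [e for x in rods for e in (x, x + 2)] + [L]
    return max(edges[i + 1] - edges[i] for i in range(0, len(edges), 2))
```

At saturation every gap is smaller than 2, so `max_gap` always returns a value in [0, 2); Theorem 1.3 quantifies how close to 2 the largest such gap typically gets as L grows.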
In the classical one-dimensional hard-core model described above, cars that fail to park have no effect on
future parking attempts. We will be interested in a variant of the hard-core model, motivated by the ghost
RSA process introduced in [19] in the study of sphere packings. Unlike the classical random sequential
addition process, where much is unknown even in 2 dimensions, the authors of [19] are able to analytically
derive the n-point correlation functions and limiting densities, exactly solving the ghost sphere packing model
in arbitrary dimension.
We study the one-dimensional ghost hard-core model, akin to the hard-core model above, focusing on
properties of this process in the infinite time limit. We give a precise definition below:
Definition 1.4. We attempt to place rods of length 2 on an interval of length L, as follows:
• Initialize X = [0, L] and Y = ∅.
• For t = 1, 2, . . .:
– If X \ Y has no connected component of length 2, abort.
– Choose a uniformly random point x ∈ [0, L].
– If x ∈ [0, 1) ∪ (L − 1, L], reject x, replace Y by Y ∪ ((x − 1, x + 1) ∩ [0, L]), and continue.
– If (x − 1, x + 1) ∩ Y ≠ ∅, reject x, replace Y by Y ∪ (x − 1, x + 1), and continue.
– Otherwise, accept x, replace Y by Y ∪ (x − 1, x + 1), and continue.
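The steps of Definition 1.4 can be simulated directly. The Python sketch below (names are ours; it runs a fixed number of attempts as a stand-in for the infinite time limit, and omits the abort check for simplicity) records how rejected points also enlarge the blocked set Y, which is the defining "ghost" feature:

```python
import random

def ghost_hard_core(L, attempts, seed=None):
    """Simulate the 1D ghost hard-core process of Definition 1.4
    for a fixed number of placement attempts.
    Returns the centers of the accepted rods."""
    rng = random.Random(seed)
    blocked = []   # open intervals whose union is Y
    accepted = []  # centers x of accepted rods (x - 1, x + 1)
    for _ in range(attempts):
        x = rng.uniform(0, L)
        if x < 1 or x > L - 1:
            # boundary rejection: still add (x - 1, x + 1) ∩ [0, L] to Y
            blocked.append((max(x - 1, 0.0), min(x + 1, L)))
        elif any(a < x + 1 and x - 1 < b for a, b in blocked):
            # overlaps Y: reject, but the ghost interval still blocks
            blocked.append((x - 1, x + 1))
        else:
            accepted.append(x)
            blocked.append((x - 1, x + 1))
    return accepted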
Several further observations about the occupancy probabilities and the pair correlation function associated
to this process can be found in Appendix A.
In the infinite time limit of the one-dimensional ghost hard-core process (on an interval of length L), with
high probability there is no gap of size log L, but there are gaps of size at least (log L)^{1−ε} for arbitrarily
small ε > 0. More precisely, we have the following:
Theorem 1.5. The following holds in the infinite time limit of a one-dimensional ghost hard-core process
on an interval of length L packed by rods of length 2, for L sufficiently large:
• with high probability, all gaps are smaller than log L;
• for all ε > 0, with high probability, there exists a gap of size at least (log L)^{1−ε}.
Overview of article. We begin in Section 2 by reviewing the classical one-dimensional hard-core model
and introducing the ghost RSA process of [19]. In Section 3 we prove Theorem 1.3. We prove Theorem 1.5
in Section 4. The ghost hard-core process behaves very differently from the classical hard-core process; we
illustrate some of these differences in Appendix A to give context.
One of our primary motivations for studying large gaps in these hard-core processes is to provide a glimpse
into what gaps might look like in a random sequential addition process in higher dimensions. Theorem 1.5
hints that in higher dimensions, in the infinite time limit, a ghost packing may still have room for many
more spheres/cubes to be packed without overlap; we discuss this further in Section 5.
Acknowledgements. We would like to thank Henry Cohn and Salvatore Torquato for helpful discussions,
ideas for writing improvements, and several useful reference suggestions for background. NM was supported
by the Hertz Graduate Fellowship and by the NSF GRFP #2141064.
2. Preliminaries
2.1. Notation. Throughout this article, we consider packing rods of length 2 onto an interval of length L,
which we model by the closed interval [0, L]⊆R. Unless stated otherwise, we study configurations in the
infinite time limit of the two processes, the 1D classical hard-core model and the 1D ghost hard-core model.
We sometimes refer to the infinite time limit of the 1D classical hard-core model as saturation, since
in this limit, no more rods can be packed without violating the hard-core constraint. We sometimes omit