
Asymptotics for the number of bipartite graphs with fixed surplus

arXiv:2411.09419v1 [math.CO] 14 Nov 2024

David Clancy, Jr.

November 15, 2024

Abstract
In a recent work on the bipartite Erdős-Rényi graph, Do et al. (2023) established upper bounds on the number of connected labeled bipartite graphs with a fixed surplus. We use some recent encodings of bipartite random graphs to provide a probabilistic formula for the number of bipartite graphs with fixed surplus. Using this, we obtain asymptotics as the number of vertices in each class tends to infinity.

1 Introduction
Cayley's formula gives the number of trees on n labeled vertices as n^{n−2}. Equivalently, this counts the number of spanning trees of K_n, the complete graph on n vertices. Let G_n(k) denote the collection of connected spanning subgraphs of K_n with exactly n − 1 + k many edges. In [23], Wright established that for each fixed k

#G_n(k) ∼ ρ_k n^{n−2+3k/2}
for some constants ρ_k > 0. Here, and throughout the article, we write a_n ∼ b_n if a_n/b_n → 1 as n → ∞. In [17], Spencer gave a probabilistic representation of ρ_k as

ρ_k = (1/k!) E[ ( ∫_0^1 B_ex(s) ds )^k ]
where B_ex is a standard Brownian excursion. See [10] for a more thorough literature review of this connection. See also [14] for the case of k = k_n → ∞ sufficiently slowly.
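Spencer's representation can be checked numerically. The sketch below is our own illustration (not from the paper): it estimates ρ_1 = E[∫_0^1 B_ex(s) ds] = √(π/8) ≈ 0.627 by sampling a Brownian bridge on a grid and applying the Vervaat transform discussed in Section 3.3; the step count and sample size are arbitrary choices, and the grid minimum introduces a small downward bias.

```python
import math
import random

def excursion_area(n_steps, rng):
    """Approximate the area of a standard Brownian excursion: sample a
    Brownian bridge on a grid and apply the Vervaat transform (cyclic
    shift at the first grid argmin), as in (3.5)."""
    dt = 1.0 / n_steps
    w = [0.0]
    for _ in range(n_steps):
        w.append(w[-1] + rng.gauss(0.0, math.sqrt(dt)))
    # Brownian bridge on the grid: B_br(k dt) = W(k dt) - (k dt) W(1).
    bridge = [w[k] - (k * dt) * w[-1] for k in range(n_steps + 1)]
    tau = min(range(n_steps + 1), key=lambda k: bridge[k])  # first argmin
    exc = [bridge[(tau + k) % n_steps] - bridge[tau] for k in range(n_steps)]
    return sum(exc) * dt

rng = random.Random(1)
samples = [excursion_area(1000, rng) for _ in range(2000)]
estimate = sum(samples) / len(samples)
print(estimate, math.sqrt(math.pi / 8))  # both should be close to 0.627
```

The discretization makes the estimate slightly low, but it lands near √(π/8) for moderate grids.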
Let K_{n,m} be the complete bipartite graph on n + m labeled vertices, where one class has n vertices while the other has m vertices. We let G_{n,m}(k) be the collection of connected spanning subgraphs of K_{n,m} with exactly n + m − 1 + k many edges. Scoins [16] established that

#G_{n,m}(0) = n^{m−1} m^{n−1}. (1.1)
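Scoins' formula (1.1) is easy to verify by exhaustive enumeration for very small n, m. The following sketch (our own illustrative code; vertex labels and helper names are ours) counts connected spanning subgraphs of K_{n,m} directly.

```python
from itertools import combinations

def count_spanning_graphs(n, m, k):
    """Count connected spanning subgraphs of K_{n,m} with n+m-1+k edges
    by brute force (feasible only for very small n, m)."""
    whites = [("w", i) for i in range(n)]
    blacks = [("b", j) for j in range(m)]
    edges = [(w, b) for w in whites for b in blacks]
    total = 0
    for subset in combinations(edges, n + m - 1 + k):
        # Union-find connectivity check over the n + m vertices.
        parent = {v: v for v in whites + blacks}
        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]
                v = parent[v]
            return v
        comps = n + m
        for a, b in subset:
            ra, rb = find(a), find(b)
            if ra != rb:
                parent[ra] = rb
                comps -= 1
        if comps == 1:
            total += 1
    return total

# Compare with Scoins' formula #G_{n,m}(0) = n^{m-1} m^{n-1}.
for n, m in [(2, 2), (2, 3), (3, 3)]:
    print(n, m, count_spanning_graphs(n, m, 0), n ** (m - 1) * m ** (n - 1))
```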
Recently, in [8, 6], analogues of the result of Wright were established for k fixed and n, m → ∞. Using generating functions, the authors of [8] show that for each fixed k

Σ_{n,m : n+m=N} \binom{N}{n} #G_{n,m}(k) ∼ 2^{−(k−1)} #G_N(k) as N → ∞.

In [6], the authors consider local versions and show that as n, m → ∞ with n/m ∈ [1/2, 2]

#G_{n,m}(1) ∼ √(π/8) n^{m−1/2} m^{n−1/2} √(n + m)   and   #G_{n,m}(k) ≤ c_k (n + m)^{3k/2} n^{m−1} m^{n−1}

for some constants c_k → 0 as k → ∞. The authors of [6] can obtain an explicit representation for the asymptotics of #G_{n,m}(1) as the 2-core of any graph in G_{n,m}(1) is a cycle of even length.
In this article, we obtain the asymptotics so long as n/m → α ∈ R_+ and k is fixed. More precisely, we establish the following theorem.

Theorem 1.1. Suppose that n, m → ∞ and n/m → α ∈ R_+. Then

#G_{n,m}(k) ∼ (1 + α)^{k/2} ρ_k n^{m−1+k/2} m^{n−1+k}.

This yields the following corollary.


n
Corollary 1.2. Suppose that n+m
→ γ ∈ (0, 1) as n → ∞. Then

#Gn,m (k) ∼ (γ(1 − γ))k/2 ρk (n + m)3k/2 nm−1 mn−1 .

1.1 Overview
Our proof will be almost entirely probabilistic, in the spirit of Spencer [17].
In Section 2 we describe the (breadth-first) exploration of a graph G ∈ G_{n,m}(k) for some k ≥ 0. This exploration gives us two processes X^#, X encoding the number of vertices discovered by each vertex in the exploration. In Section 2.1, we describe the law of these processes when the graph G is a uniformly chosen tree in G_{n,m}(0). In Section 2.2, we relate #G_{n,m}(k)/#G_{n,m}(0) to the expectation of a particular random variable W_{n,m}.
In Section 3 we discuss weak convergence. In Section 3.1, we prove weak convergence involving some auxiliary processes Y^#, Y. These processes are connected to X^#, X in Section 3.3 and to the random variable W_{n,m} in Section 3.4. In Section 3.4 we also prove the convergence of the moments of W_{n,m} in order to obtain Theorem 1.1.

2 Exploration of graphs
Let us now explain the exploration of a graph G ∈ ⊔_{k≥0} G_{n,m}(k). For concreteness, we color the vertices of K_{n,m} as either white or black. We write the vertex set of K_{n,m} as V_n^# ⊔ V_m where V_n^# = {i^# : i ∈ [n]} are the white vertices and V_m = {i : i ∈ [m]} are the black vertices. The exploration is analogous to the explorations in [7, 20, 5].
We maintain a stack of active vertices that we denote by A_t for t = 0, 1, · · · , n + m. We always start with the stack A_0 containing the vertex v_1 := 1^#. We will define several sequences (χ^#_j ; j ∈ [n]), (χ_j ; j ∈ [m]), and (γ_j ; j ∈ [n + m]). We will use these sequences to define several processes

X^#(t) = Σ_{s=1}^t χ^#_s,   X(t) = Σ_{s=1}^t χ_s,   N^#(t) = Σ_{s=1}^t γ_s,   N(t) = t − N^#(t). (2.1)

Exploration 1 (Breadth-first exploration of G). For t = 1, 2, · · · , n + m the stack A_{t−1} = (x_1, · · · , x_s) is of length s ≥ 1 (by induction). By step t, we have explored a := N^#(t − 1) many white vertices and b := N(t − 1) many black vertices. We now explore vertex x_1.
if: x_1 = v_{a+1} ∈ V_n^#, then we find the neighbors of x_1 that are either in A_{t−1} or unexplored. Each neighbor in A_{t−1} corresponds to a cycle created. The unexplored ones will be elements of V_m and we will label these as w_{b+1}, · · · , w_{b+r} for some r where each w_{b+j} = i_j for some i_1 < i_2 < · · · < i_r. Set χ^#_{a+1} = r and update A_t = (x_2, · · · , x_s, w_{b+1}, · · · , w_{b+r}). Set γ_t = 1.
else: x_1 = w_{b+1} ∈ V_m, then we find the neighbors of x_1 that are either in A_{t−1} or unexplored. Each neighbor in A_{t−1} corresponds to a cycle created. The unexplored ones will be elements of V_n^# and we will label these as v_{a+1}, · · · , v_{a+r′} for some r′ where each v_{a+j} = i_j^# for some i_1 < i_2 < · · · < i_{r′}. Set χ_{b+1} = r′ and update A_t = (x_2, · · · , x_s, v_{a+1}, · · · , v_{a+r′}). Set γ_t = 0.

We call the pair (X^#, X) the child count processes defined via (2.1) associated with the graph G. Moreover, it is easy to see that X ◦ X^#(t) − X ◦ X^#(t − 1) is the number of white grand-children of the white vertex v_t. Hence, standard properties of random trees and Lukasiewicz paths (see e.g. [12]) imply that Z(t) = X ◦ X^#(t) − t for t = 0, 1, · · · , n has increments in {−1, 0, 1, · · · } and satisfies

Z(t) ≥ 0 for all t = 0, 1, · · · , n − 1   and   Z(n) = −1. (2.2)
Observe that the pair (χ^#, χ) constructed in Exploration 1 satisfies Σ_{j=1}^n χ^#_j = m and Σ_{j=1}^m χ_j = n − 1 by simply noting which vertices are added to the stack A_t. Given any two sequences (χ^#, χ) of non-negative integers, we can define X^#, X using (2.1); however, this need not correspond to a tree T ∈ G_{n,m}(0). Using the bijection between Lukasiewicz paths and planar trees (see e.g. [12]), we can see that there exists a unique planar tree T^plan built from (χ^#, χ) whenever the Lukasiewicz path Z(t) = X ◦ X^#(t) − t satisfies (2.2). We will call such sequences (χ^#, χ) admissible.
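As a sanity check on Exploration 1, here is a minimal sketch (our own illustration; the path-shaped tree below is a hypothetical example, not the tree of Figure 1) that reads off the child-count sequences of a small bipartite tree and verifies Σ_j χ^#_j = m, Σ_j χ_j = n − 1 and the Lukasiewicz condition (2.2).

```python
from collections import deque

def child_counts(tree_edges):
    """Breadth-first exploration (Exploration 1) of a spanning tree of
    K_{n,m}; whites are ('w', i), blacks are ('b', j), root is ('w', 0).
    Returns the child-count sequences (chi_white, chi_black)."""
    adj = {}
    for a, b in tree_edges:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    queue, seen = deque([("w", 0)]), {("w", 0)}
    chi_white, chi_black = [], []
    while queue:
        v = queue.popleft()
        kids = sorted(u for u in adj[v] if u not in seen)
        (chi_white if v[0] == "w" else chi_black).append(len(kids))
        for u in kids:
            seen.add(u)
            queue.append(u)
    return chi_white, chi_black

# A hypothetical tree with n = 3 whites and m = 2 blacks (a path).
n, m = 3, 2
tree = [(("w", 0), ("b", 0)), (("b", 0), ("w", 1)),
        (("w", 1), ("b", 1)), (("b", 1), ("w", 2))]
chi_w, chi_b = child_counts(tree)
assert sum(chi_w) == m and sum(chi_b) == n - 1

# The (white) Lukasiewicz path Z(t) = X o X^#(t) - t of (2.2).
X_w = [0]
for c in chi_w:
    X_w.append(X_w[-1] + c)
X_b = [0]
for c in chi_b:
    X_b.append(X_b[-1] + c)
Z = [X_b[X_w[t]] - t for t in range(n + 1)]
print(chi_w, chi_b, Z)  # Z stays nonnegative before n and ends at -1
```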

[Figure 1 appears here.]

Figure 1: A tree T on 7 labeled white vertices and 8 labeled black vertices. When exploring any graph G whose exploration produces the tree T above, the possible edges that need to be checked to find cycles are (listed in order of possible appearance) 5 4^#, 5 6^#, 8 4^#, 8 6^#, 6^# 7, 2^# 7, 2^# 1, 2^# 4, 5^# 7, 5^# 1, 5^# 4, 5^# 2, 5^# 6, 1 3^#, 4 3^#, 4 7^#, 2 3^#, 2 7^#, 6 3^#, 6 7^#.

[Figure 2 appears here: plots of X^#(t) and X(t) for t = 0, 1, · · · , 8, and of Z(t) for t = 0, 1, · · · , 7.]

Figure 2: The child count processes for the tree T depicted in Figure 1, along with its (white) Lukasiewicz path.

2.1 Properties of the exploration


The next lemma tells us the law of the child count sequences (X^#, X) for a uniform tree T ∈ G_{n,m}(0).

Lemma 2.1. Let (χ^#, χ) be admissible. Define X^#(t) = Σ_{s≤t} χ^#_s and similarly define X. The number of trees T ∈ G_{n,m}(0) whose child count processes are X^#, X is

(n − 1)! / (Π_{j=1}^m χ_j!) · m! / (Π_{j=1}^n χ^#_j!) = \binom{n − 1}{χ_1, χ_2, · · · , χ_m} \binom{m}{χ^#_1, χ^#_2, · · · , χ^#_n}.

Proof. Let T^plan be the rooted planar tree constructed from (χ^#, χ). This is uniquely defined by (χ^#, χ). We must now assign labels to the vertices in T^plan that are consistent with Exploration 1 above. The first multinomial coefficient counts the number of ways to assign the n − 1 labels to the children of the black vertices in increasing order, and the second counts the number of ways to assign labels to the children of the white vertices.
The next lemma gives a probabilistic way to construct X^# and X.

Lemma 2.2. Let (ξ_j ; j ≥ 1), (ξ^#_j ; j ≥ 1) be i.i.d. mean 1 Poisson random variables. Define Y^#(t) = Σ_{s≤t} ξ^#_s and similarly define Y. Let S(t) = Y ◦ Y^#(t) − t. Let X^#, X be the child count processes for a uniformly chosen random tree T ∈ G_{n,m}(0). Let

E_{n,m} = {inf{t : S(t) = −1} = n} ∩ {Y^#(n) = m}.

Then (Y^#, Y) | E_{n,m} =^d (X^#, X).

Proof. Let (x^#, x) be a fixed (deterministic) admissible sequence. The previous lemma gives

P( X^#(t) = Σ_{s≤t} x^#_s, X(t) = Σ_{s≤t} x_s ) = (1/#G_{n,m}(0)) · (n − 1)! m! / ( Π_{s=1}^n x^#_s! Π_{u=1}^m x_u! ).

Note that given E_{n,m}, Y(m) = n − 1 a.s. Hence, if (x^#, x) is admissible we have

P( Y^#(t) = Σ_{s≤t} x^#_s, Y(t) = Σ_{s≤t} x_s | E_{n,m} ) = (1/P(E_{n,m})) P( Y^#(t) = Σ_{s≤t} x^#_s, Y(t) = Σ_{s≤t} x_s, E_{n,m} )
= (1/P(E_{n,m})) P( ξ^#_s = x^#_s, ξ_u = x_u for all s, u ) = (1/P(E_{n,m})) Π_{s=1}^n (e^{−1}/x^#_s!) Π_{u=1}^m (e^{−1}/x_u!).

Both are inversely proportional to Π_{s=1}^n x^#_s! Π_{u=1}^m x_u!, proving the desired statement.

Since #G_{n,m}(0) = n^{m−1} m^{n−1} we can see that P(E_{n,m}) = e^{−(n+m)} n^{m−1} m^{n−1} / ((n − 1)! m!). We now give a probabilistic proof of this, and hence a probabilistic proof of (1.1).

Lemma 2.3. Let E_{n,m} be defined in Lemma 2.2. Then

P(E_{n,m}) = e^{−(n+m)} n^{m−1} m^{n−1} / ((n − 1)! m!).

Proof. As already noted, we have

E_{n,m} = {Y^#(n) = m} ∩ {Y(m) = n − 1} ∩ {inf{t : S(t) = −1} = n}.

Set A_{n,m} = {Y^#(n) = m} ∩ {Y(m) = n − 1}. Since Y^#(n) ∼ Poi(n) and Y(m) ∼ Poi(m) are independent Poisson random variables, P(A_{n,m}) = e^{−n} (n^m/m!) · e^{−m} (m^{n−1}/(n − 1)!). Hence

P(E_{n,m}) = P( inf{t : S(t) = −1} = n | A_{n,m} ) e^{−(n+m)} n^m m^{n−1} / (m!(n − 1)!).

Under P(− | A_{n,m}) the increments W_j = S(j) − S(j − 1) for j = 1, · · · , n are cyclically exchangeable and Σ_{j=1}^n W_j = −1. Hence, by the cyclic lemma (see, e.g. [15, Lemma 6.1]), P( inf{t : S(t) = −1} = n | A_{n,m} ) = n^{−1}. This gives the desired result.

Remark 2.4. We discuss this cyclic lemma in more detail in Section 3.3.
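Lemma 2.3 can be checked by direct simulation of the Poisson construction in Lemma 2.2. The sketch below is our own illustration (the sample size, seed, and the small values of n, m are arbitrary choices): it estimates P(E_{n,m}) by Monte Carlo and compares it with the closed form.

```python
import math
import random

def sample_poisson(rng):
    """Knuth's product-of-uniforms method for a mean-1 Poisson variate."""
    L, k, p = math.exp(-1.0), 0, 1.0
    while True:
        p *= rng.random()
        if p < L:
            return k
        k += 1

def hits_event(n, m, rng):
    """One draw of the i.i.d. Poisson sequences of Lemma 2.2; returns
    whether E_{n,m} = {first hit of -1 by S is at n} and {Y^#(n) = m}."""
    y_sharp = [0]
    for _ in range(n):
        y_sharp.append(y_sharp[-1] + sample_poisson(rng))
    if y_sharp[n] != m:
        return False
    y = [0]
    for _ in range(m):
        y.append(y[-1] + sample_poisson(rng))
    # S(t) = Y o Y^#(t) - t first hits -1 at t = n iff it is nonnegative
    # before n and equals -1 at n (its increments are at least -1).
    s = [y[y_sharp[t]] - t for t in range(n + 1)]
    return all(v >= 0 for v in s[:n]) and s[n] == -1

n, m = 3, 2
trials = 200_000
rng = random.Random(7)
hits = sum(hits_event(n, m, rng) for _ in range(trials))
exact = math.exp(-(n + m)) * n ** (m - 1) * m ** (n - 1) \
    / (math.factorial(n - 1) * math.factorial(m))
print(hits / trials, exact)  # the two should roughly agree
```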

2.2 Counting graphs
Let us now turn to the graph counting. In the tree in Figure 1, we see that there are 20 possible edges that can be added to T to form a graph G with the same breadth-first spanning tree. It turns out we can represent this as a functional of the child count processes X^#, X. As this only depends on the labeled tree T, we set

W = W(T) = −m(n − 1) + Σ_{s=0}^{n−1} X^#(s) + Σ_{u=0}^{m−1} X(u) (2.3)

where X^#, X is the child count process of T. Using Figure 2, it is easy to see that for the tree T in Figure 1

Σ_{s=0}^{7−1} X^#(s) = 37 and Σ_{u=0}^{8−1} X(u) = 31,

and, therefore, W(T) = 37 + 31 − 48 = 20.


Proposition 2.5. Suppose that T ∈ G_{n,m}(0) is fixed. Then the number of graphs G ∈ G_{n,m}(k) whose breadth-first spanning tree is T is \binom{W(T)}{k}. In particular,

#G_{n,m}(k) = E[ \binom{W(T)}{k} ] n^{m−1} m^{n−1} where T ∼ Unif(G_{n,m}(0)).
Proof. Observe that in the exploration, a surplus edge can be added exclusively when we are exploring a white vertex (resp. black vertex) at time t and pair it with a black vertex (resp. white vertex) in the stack A_{t−1}.
Let us now look at the stack A_{t−1}. The top of the stack is white if γ_t = 1 and black if γ_t = 0. The vertices that have been discovered up to and including step t − 1 are those discovered by the vertices (v_s ; s ≤ 1 + N^#(t − 1)) and (w_u ; u ≤ N(t − 1)). Moreover, those vertices have all been removed from the stack by time t. Hence, A_{t−1} consists of X^#(N^#(t − 1)) − N(t − 1) many black vertices and 1 + X(N(t − 1)) − N^#(t − 1) many white vertices. Therefore, the total number of possible cycles that can be added is

W′ = Σ_{t=1}^{n+m} [ (X^# ◦ N^#(t − 1) − N(t − 1)) γ_t + (1 + X ◦ N(t − 1) − N^#(t − 1))(1 − γ_t) ].

Hence there are \binom{W′}{k} many graphs G ∈ G_{n,m}(k) whose breadth-first spanning tree is T. We claim that W′ = W(T). To see this, note that for each s = 0, 1, · · · , n − 1 there is precisely one t = 1, · · · , n + m such that s = N^#(t − 1) and γ_t = 1. Hence, for any f

Σ_{t=1}^{n+m} f(N^#(t − 1)) γ_t = Σ_{s=0}^{n−1} f(s).

Similarly, Σ_{t=1}^{n+m} f(N(t − 1))(1 − γ_t) = Σ_{s=0}^{m−1} f(s). Since N(t) = t − N^#(t) and γ_t = N^#(t) − N^#(t − 1) we get

Σ_{t=1}^{n+m} (X^# ◦ N^#(t − 1) − N(t − 1)) γ_t = Σ_{s=0}^{n−1} (X^#(s) + s) − Σ_{t=1}^{n+m} (t − 1) γ_t.

A similar formula can be established for the other sum against (1 − γ_t):

Σ_{t=1}^{n+m} (1 + X ◦ N(t − 1) − N^#(t − 1))(1 − γ_t) = Σ_{s=0}^{m−1} (1 + X(s) + s) − Σ_{t=1}^{n+m} (t − 1)(1 − γ_t).

Hence

W′ = Σ_{s=0}^{n−1} X^#(s) + Σ_{s=0}^{m−1} X(s) + \binom{n}{2} + \binom{m}{2} + m − Σ_{t=1}^{n+m} (t − 1)(γ_t + 1 − γ_t)
= Σ_{s=0}^{n−1} X^#(s) + Σ_{s=0}^{m−1} X(s) + \binom{n}{2} + \binom{m}{2} + m − \binom{n+m}{2} = W(T).
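Since each graph in G_{n,m}(k) has a unique breadth-first spanning tree, Proposition 2.5 is equivalent to the identity Σ_{T ∈ G_{n,m}(0)} \binom{W(T)}{k} = #G_{n,m}(k). The sketch below (our own illustration, feasible only for tiny n, m) checks this by exhaustive enumeration, computing W(T) from the child counts via (2.3).

```python
from collections import deque
from itertools import combinations
from math import comb

def is_connected(vertices, edge_set):
    adj = {v: [] for v in vertices}
    for a, b in edge_set:
        adj[a].append(b)
        adj[b].append(a)
    seen, stack = {vertices[0]}, [vertices[0]]
    while stack:
        for u in adj[stack.pop()]:
            if u not in seen:
                seen.add(u)
                stack.append(u)
    return len(seen) == len(vertices)

def W_of_tree(n, m, tree_edges):
    """W(T) from (2.3): -m(n-1) + sum_{s<n} X^#(s) + sum_{u<m} X(u),
    with the child counts read off the breadth-first exploration."""
    adj = {}
    for a, b in tree_edges:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    queue, seen = deque([("w", 0)]), {("w", 0)}
    chi = {"w": [], "b": []}
    while queue:
        v = queue.popleft()
        kids = sorted(u for u in adj[v] if u not in seen)
        chi[v[0]].append(len(kids))
        seen.update(kids)
        queue.extend(kids)
    def sum_prefix(cs):
        total, run = 0, 0
        for c in cs:
            total += run  # adds X(0) + ... + X(len(cs) - 1)
            run += c
        return total
    return -m * (n - 1) + sum_prefix(chi["w"]) + sum_prefix(chi["b"])

n, m = 2, 3
whites = [("w", i) for i in range(n)]
blacks = [("b", j) for j in range(m)]
vertices = whites + blacks
edges = [(w, b) for w in whites for b in blacks]
trees = [t for t in combinations(edges, n + m - 1) if is_connected(vertices, t)]
weights = [W_of_tree(n, m, t) for t in trees]
results = {}
for k in range(3):
    direct = sum(1 for g in combinations(edges, n + m - 1 + k)
                 if is_connected(vertices, g))
    results[k] = (direct, sum(comb(w, k) for w in weights))
print(results)  # for each k, the two counts should agree
```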

3 Scaling Limits

We are now left to investigate the random variable W_{n,m} := W(T) where T ∼ Unif(G_{n,m}(0)). We will henceforth include the subscript n in all the processes and assume that m = m_n also depends on n. To do this, it is easier to start with the scaling limits for Y^#_n and Y_n from Lemma 2.2. We will also let A_{n,m} be the event

A_{n,m} = {Y^#_n(n) = m, Y_n(m) = n − 1}. (3.1)

3.1 Fluctuations of Poisson Bridges

We start with the following consequence of Donsker's theorem for empirical processes. In the sequel, given any function f : [n] → R we extend this to all of [0, n] ⊂ R by setting f(t) = f(⌊t⌋).

Proposition 3.1. Let Y^# = Y^#_n and Y = Y_n be as in Lemma 2.2 for some sequence m = m_n → ∞ as n → ∞. Conditionally given A_{n,m} from (3.1), it holds jointly in the J1 topology that

( √m (m^{−1} Y^#(nt) − t) ; t ∈ [0, 1] ) | A_{n,m} ⇒ B^#_br (3.2)
( √n (n^{−1} Y(mt) − t) ; t ∈ [0, 1] ) | A_{n,m} ⇒ B_br (3.3)

for two independent Brownian bridges B^#_br and B_br.
Proof. Let U_j be independent and uniformly distributed on [0, 1]. For each n ≥ 1, define F_n(t) = n^{−1} Σ_{j=1}^n 1[U_j ≤ t]. By Donsker's theorem [13]

( √n (F_n(t) − t) ; t ∈ [0, 1] ) ⇒ B_br

for a Brownian bridge B_br. Also, by standard properties of Poisson processes and uniform random variables, it is easy to see that

( m^{−1} Y^#(t) ; t = 0, 1, · · · , n ) | A_{n,m} =^d ( F_m(t/n) ; t = 0, 1, · · · , n ). (3.4)

An application of [4, pg 146] gives the convergence (3.2). The limit in (3.3) is similar.

In the sequel, we will need some more precise control on the growth of the processes m^{−1} Y^#_n(nt) − t and n^{−1} Y_n(mt) − t. As in the proof of Proposition 3.1, we can do this via the uniform empirical process. The following lemma follows from the proof of Lemma 13 in [1].

Lemma 3.2. Let (U_j ; j ≥ 1) be i.i.d. uniform [0, 1]. Then there are universal constants C, λ > 0 such that for all n ≥ 1

P( sup_{t∈[0,1]} |√n (F_n(t) − t)| ≥ x ) ≤ C exp(−λx²).

Looking at (3.4), the previous lemma gives the following.

Corollary 3.3. Suppose the assumptions of Proposition 3.1 hold. There exist constants C, λ > 0 such that

P( sup_{t∈[0,1]} √m |m^{−1} Y^#_n(nt) − t| + sup_{t∈[0,1]} √n |n^{−1} Y_n(mt) − t| > x | A_{n,m} ) ≤ C e^{−λx²}.

3.2 Fluctuations of S_n

The convergence of the process S_n(t) = Y_n ◦ Y^#_n(t) − t given A_{n,m} follows from Proposition 3.1 and a standard result on the fluctuations of compositions (see [18, 21] or [22, Section 13.3]). We recall this with the next lemma.

Lemma 3.4. Suppose that x_n, y_n : [0, 1] → [0, 1] are non-decreasing càdlàg functions, and ψ, ϕ : [0, 1] → R are continuous functions. Let c_n → ∞ and suppose that in the Skorohod J1 topology both (c_n(x_n(t) − t); t ∈ [0, 1]) → ψ and (c_n(y_n(t) − t); t ∈ [0, 1]) → ϕ. Then

( c_n (x_n ◦ y_n(t) − t) ; t ∈ [0, 1] ) → ψ + ϕ.

The preceding lemma and Proposition 3.1 imply the following.

Corollary 3.5. Let Y^# = Y^#_n, Y = Y_n and S = S_n be as in Lemma 2.2 for some sequence m = m_n → ∞ as n → ∞. Let A_{n,m} = {Y^#(n) = m} ∩ {Y(m) = n − 1}. Suppose that n/m_n → α ∈ R_+. Then, jointly with the convergence in Proposition 3.1,

( n^{−1/2} S_n(nt) ; t ∈ [0, 1] ) | A_{n,m} ⇒ √α B^#_br + B_br.

Proof. Let Ȳ^#_n(t) = m^{−1} Y^#_n(nt) and Ȳ_n(t) = n^{−1} Y_n(mt). Then n^{−1} S_n(nt) = Ȳ_n ◦ Ȳ^#_n(t) − t. By Proposition 3.1 we have

( √n (Ȳ_n(t) − t) ; t ∈ [0, 1] ) | A_{n,m} ⇒ B_br.

Since n/m_n → α, we see that √n ∼ √α √m and so

( √n (Ȳ^#_n(t) − t) ; t ∈ [0, 1] ) | A_{n,m} ⇒ √α B^#_br.

The result now follows by Lemma 3.4.

3.3 The Vervaat Transform

In this section we discuss in more detail the connection between S_n|A_{n,m} and S_n|E_{n,m}, where E_{n,m} is defined in Lemma 2.2, as well as the connection to the scaling limits.
We start with the operation in the continuum. Let f : [0, 1] → R be a càdlàg process without any negative jumps such that f(0) = f(1) = f(1−) = 0. Note that f need not attain its global minimum; however, t ↦ f(t−) will. We let τ be the first time that f(t−) attains the global infimum of f. That is

τ = inf{ t : f(t−) = inf_{s∈[0,1]} f(s) }.

We extend f : [0, 1] → R to f : R → R by setting f(t) = f({t}) where {t} is the fractional part of t. The Vervaat transform of f is

V(f)(t) = f(τ + t) − f(τ).

In words, the Vervaat transform exchanges the pre- and post-infimum parts of f. See [2, 3, 19] for more details. In [19], Vervaat proved that

( V(B_br)(t) ; t ∈ [0, 1] ) =^d ( B_ex(t) ; t ∈ [0, 1] ) (3.5)

where B_ex is a standard Brownian excursion.
The discrete Vervaat transform is defined slightly differently. First, consider the discrete bridge f_n of length n from 0 to −1 which is of the form

f_n(k) = Σ_{j=1}^k (x_j − 1), for k = 0, 1, · · · , n

where x_j ∈ {0, 1, · · · } and suppose that f_n(n) = −1. We call such a function f_n a downward skip-free bridge of length n. Similar to above, define

τ_n = min{ k : f_n(k) = min_{0≤j≤n} f_n(j) }.

We define the discrete Vervaat transform V_n as

V_n(f_n)(k) = Σ_{j=1}^k (x_{j+τ_n} − 1)

where the index j + τ_n is interpreted modulo n.
The following lemma is elementary. See [3, Lemma 3] or [11, Lemma 14].

Lemma 3.6. Suppose that (f_n ; n ≥ 1) is a sequence of downward skip-free bridges of length n and that f is a càdlàg bridge with no negative jumps. Suppose that 0 < δ_n → 0 is a sequence of constants such that in the J1 topology

( δ_n f_n(nt) ; t ∈ [0, 1] ) → ( f(t) ; t ∈ [0, 1] ).

Suppose that f attains its global minimum uniquely and continuously, i.e. f(τ−) = f(τ) < f(t) for all t ≠ τ. Then τ_n/n → τ and

( δ_n V_n(f_n)(nt) ; t ∈ [0, 1] ) → ( V(f)(t) ; t ∈ [0, 1] ).
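The discrete Vervaat transform is straightforward to implement. The following sketch (our own illustrative code; the random bridge construction is an arbitrary choice) applies V_n to randomly generated downward skip-free bridges and checks that the output is always an excursion in the sense of (2.2).

```python
import random

def vervaat(increments):
    """Discrete Vervaat transform V_n of the downward skip-free bridge
    f_n(k) = sum_{j <= k} (x_j - 1): cyclically shift the increment
    sequence so that it starts just after the first global minimum."""
    n = len(increments)
    f, run = [0], 0
    for x in increments:
        run += x - 1
        f.append(run)
    tau = min(range(n + 1), key=lambda k: f[k])  # first argmin of f
    out, run = [0], 0
    for j in range(n):
        run += increments[(tau + j) % n] - 1
        out.append(run)
    return out

rng = random.Random(3)
n = 50
for _ in range(200):
    # A random bridge from 0 to -1: nonnegative integers summing to n - 1.
    xs = [0] * n
    for _ in range(n - 1):
        xs[rng.randrange(n)] += 1
    path = vervaat(xs)
    # V_n turns every such bridge into an excursion: nonnegative before
    # n and equal to -1 at n.
    assert min(path[:n]) >= 0 and path[n] == -1
print("all", 200, "bridges mapped to excursions")
```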

An important property of the discrete Vervaat transform is how it interacts with exchangeable increment processes. For this, it will be better to define cyclic shifts more generally. Let g_n(k) = Σ_{j=1}^k y_j where y_j ∈ R for j ∈ [n]. We define

θ_n(g_n, i)(k) = Σ_{j=1}^k y_{j+i}

where, again, we interpret the index j + i as its equivalence class modulo n. Note

θ_n(θ_n(g_n, i_1), i_2) = θ_n(g_n, i_1 + i_2) (3.6)

and θ_n(f_n, τ_n) = V_n(f_n). Also, for i ∈ {0, 1, · · · , n − 1} we have

θ_n(g_n, i)(k) = g_n(k + i) − g_n(i) if i + k ≤ n, and
θ_n(g_n, i)(k) = g_n(k + i − n) − g_n(i) + g_n(n) if n < i + k ≤ i + n. (3.7)

The following lemma easily follows from the above observation.

Lemma 3.7. For each i, θ_n(−, i) is linear on the collection of functions g_n(k) = Σ_{j=1}^k y_j. Moreover, if g_n(k) = ck for some constant c ∈ R, then θ_n(g_n, i) = g_n for all i. In particular,

θ_n(g_n − c Id, i) = θ_n(g_n, i) − c Id

where Id(k) = k for all k.


We include the following lemma containing the main results in Section 6.1 of [15].

Lemma 3.8. Suppose that ξ_j are i.i.d. random variables such that P(ξ_j ≥ 0) = 1 and P( Σ_{j=1}^n (ξ_j − 1) = −1 ) > 0. Let S(k) = Σ_{j=1}^k (ξ_j − 1). Let

A_n = {S(n) = −1} and E_n = {S(n) = −1, S(k) ≥ 0 for k ≤ n − 1}.

Then the following hold:

1. P(E_n) = n^{−1} P(A_n).

2. Let τ_n = inf{ k : S(k) = min_{j≤n} S(j) }. Then τ_n | A_n ∼ Unif{1, 2, · · · , n}.

3. V_n(S) | A_n =^d S | E_n.

4. Let U_n ∼ Unif{1, 2, · · · , n} be independent of S. Then

(τ_n, θ_n(S, τ_n)) | A_n =^d (U_n, S) | E_n.
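Part (1) of Lemma 3.8, the cyclic lemma, can be verified exactly for a small example. The sketch below is our own illustration (the three-point law of ξ is an arbitrary choice); it enumerates all increment sequences with exact rational weights.

```python
from itertools import product
from fractions import Fraction

# Exact check of P(E_n) = P(A_n)/n for the i.i.d. nonnegative integer
# law P(xi = 0) = P(xi = 1) = P(xi = 2) = 1/3.
pmf = {0: Fraction(1, 3), 1: Fraction(1, 3), 2: Fraction(1, 3)}
n = 6
p_A = Fraction(0)
p_E = Fraction(0)
for xs in product(pmf, repeat=n):
    s, ok = 0, True
    for j, x in enumerate(xs):
        s += x - 1          # s = S(j + 1)
        if j < n - 1 and s < 0:
            ok = False      # violates S(k) >= 0 for some k <= n - 1
    if s == -1:             # the event A_n = {S(n) = -1}
        weight = Fraction(1)
        for x in xs:
            weight *= pmf[x]
        p_A += weight
        if ok:              # the event E_n
            p_E += weight
print(p_A, p_E, p_E == p_A / n)
```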

Let us fix n, m and let X^#_n, X_n be the child count processes associated with a uniformly chosen tree T ∈ G_{n,m}(0). Recalling Lemma 2.2, we see that

(X^#_n, X_n, Z_n) =^d (Y^#_n, Y_n, S_n) | E_{n,m}.

Recall that Y_n = (Y_n(k); k = 0, 1, · · · , m), Y^#_n = (Y^#_n(k); k = 0, 1, · · · , n) and

S_n(k) = Y_n(Y^#_n(k)) − k.

In particular, the (unconditioned) increments of S_n are i.i.d. From here, it is not hard to see using Lemma 3.8(4) that if U_n ∼ Unif{1, · · · , n} is independent of Y^#_n, Y_n then

(U_n, Y^#_n, Y_n, S_n) | E_{n,m} =^d ( τ_n, θ_n(Y^#_n, τ_n), θ_m(Y_n, Y^#_n(τ_n)), θ_n(S_n, τ_n) ) | A_{n,m}

where τ_n = inf{ k : S_n(k) = min_{j≤n} S_n(j) }. Consequently, if U_n is independent of X^#_n, X_n then

(U_n, X^#_n, X_n) =^d ( τ_n, θ_n(Y^#_n, τ_n), θ_m(Y_n, Y^#_n(τ_n)) ) | A_{n,m}.

An application of Lemma 3.7 gives the following lemma.

Lemma 3.9. Maintain the notation above and let Id(k) = k for all k. Then

( X^#_n − (m/n) Id, X_n − (n/m) Id ) =^d ( θ_n(Y^#_n, τ_n) − (m/n) Id, θ_m(Y_n, Y^#_n(τ_n)) − (n/m) Id ) | A_{n,m}.

3.4 Scaling limit of W_{n,m}

Recall the definition of W_{n,m} in (2.3) for a uniform tree T_n ∈ G_{n,m}(0) for some sequence m_n → ∞. The purpose of this section is to prove the following proposition. To state it, we let S(t) = √α B^#_br(t) + B_br(t) =^d √(1 + α) B_br(t) where B_br is a standard Brownian bridge. Let τ = inf{ u : S(u) = inf_{t∈[0,1]} S(t) }. By elementary properties of Brownian bridges, almost surely τ is the location of the unique global minimum of S on [0, 1]. By (3.5)

( V(S)(t) ; t ∈ [0, 1] ) =^d ( √(1 + α) B_ex(t) ; t ∈ [0, 1] ).

Proposition 3.10. Let T ∼ Unif(G_{n,m}(0)) where n, m → ∞ and n/m → α ∈ R_+. Then

(1/(m√n)) W_{n,m} ⇒ √(1 + α) ∫_0^1 B_ex(t) dt.

Moreover, for any fixed k,

(1/(m^k n^{k/2})) E[ W^k_{n,m} ] → (1 + α)^{k/2} E[ ( ∫_0^1 B_ex(t) dt )^k ].

We begin with some algebraic manipulations. Note from the definition of W_{n,m} in (2.3)

W_{n,m} = Σ_{j=0}^{n−1} ( X^#_n(j) − (m/n) j ) + Σ_{j=0}^{m−1} ( X_n(j) − (n/m) j ) + (m/n) \binom{n}{2} + (n/m) \binom{m}{2} − m(n − 1)
= Σ_{j=0}^{n−1} ( X^#_n(j) − (m/n) j ) + Σ_{j=0}^{m−1} ( X_n(j) − (n/m) j ) + (m − n)/2.

Since X^#_n(n) = m and X_n(m) = n − 1, we have

W_{n,m} = Σ_{j=0}^{n} ( X^#_n(j) − (m/n) j ) + Σ_{j=0}^{m} ( X_n(j) − (n/m) j ) + (m − n + 2)/2. (3.8)

We now relate these two summations to the processes Y^#_n, Y_n given A_{n,m}. To do this, we will use Lemma 3.9 and the following lemma, which is a direct consequence of (3.7).

Lemma 3.11. Suppose that g_n(k) = Σ_{j=1}^k y_j for k = 0, 1, · · · , n for some y_j ∈ R. Then for i ∈ {0, 1, · · · , n − 1}

Σ_{j=0}^n θ_n(g_n, i)(j) = i g_n(n) − n g_n(i) + Σ_{j=0}^n g_n(j).

We now prove the following lemma.

Lemma 3.12. Let Y^#_n, Y_n, S_n be as in Lemma 2.2 and let A_{n,m} be defined as in (3.1). Let τ_n = min{ k : S_n(k) = min_{j≤n} S_n(j) }. Then

( Σ_{j=0}^n ( Y^#_n(j) − (m/n) j ) − n ( Y^#_n(τ_n) − (m/n) τ_n )
+ Σ_{j=0}^m ( Y_n(j) − (n/m) j ) − m ( Y_n ◦ Y^#_n(τ_n) − (n/m) Y^#_n(τ_n) ) − Y^#_n(τ_n)
+ (m − n + 2)/2 ) | A_{n,m} =^d W_{n,m}. (3.9)

Proof. This is just a combination of Lemmas 3.9 and 3.11. Indeed,

Σ_{j=0}^n ( X^#_n(j) − (m/n) j ) =^d Σ_{j=0}^n θ_n(Y^#_n − (m/n) Id, τ_n)(j) | A_{n,m}.

Moreover, almost surely on the event A_{n,m} we have Y^#_n(n) = m and so an application of Lemma 3.11 gives (a.s. on A_{n,m})

Σ_{j=0}^n θ_n(Y^#_n − (m/n) Id, τ_n)(j) = τ_n ( Y^#_n(n) − m ) − n ( Y^#_n(τ_n) − (m/n) τ_n ) + Σ_{j=0}^n ( Y^#_n(j) − (m/n) j )
= Σ_{j=0}^n ( Y^#_n(j) − (m/n) j ) − n ( Y^#_n(τ_n) − (m/n) τ_n ).

This gives the first term in (3.9). The second term is obtained similarly. The result follows from (3.8).

Now, in order to establish scaling limits for W_{n,m} we just need to establish conditional scaling limits for each of the terms appearing in (3.9). The next lemma handles this.

Lemma 3.13. Maintain the notation of Lemma 3.12. Then jointly the following convergences hold conditionally given A_{n,m}:

(1/(m√n)) Σ_{j=0}^n ( Y^#_n(j) − (m/n) j ) ⇒ √α ∫_0^1 B^#_br(t) dt,   (1/(m√n)) Σ_{j=0}^m ( Y_n(j) − (n/m) j ) ⇒ ∫_0^1 B_br(t) dt,

(n/(m√n)) ( Y^#_n(τ_n) − (m/n) τ_n ) ⇒ √α B^#_br(τ),   (1/√n) ( Y_n ◦ Y^#_n(τ_n) − (n/m) Y^#_n(τ_n) ) ⇒ B_br(τ).

Moreover, all the prelimits above are uniformly bounded in L^p(Ω, P(− | A_{n,m})) for all p ≥ 1.
Proof. The uniform bound in L^p follows from Corollary 3.3.
Let us write Ȳ^#_n(t) = m^{−1} Y^#_n(nt) and Ȳ_n(t) = n^{−1} Y_n(mt) as we did in the proof of Corollary 3.5. Throughout this proof, we work conditionally given A_{n,m}. By applying Proposition 3.1 and the continuity of f ↦ ∫_0^1 f(s) ds we see

(1/(m√n)) Σ_{j=0}^n ( Y^#_n(j) − (m/n) j ) = (1/(m√n)) ∫_0^{n+1} ( Y^#_n(⌊s⌋) − (m/n) ⌊s⌋ ) ds
= √n ∫_0^{1+n^{−1}} ( Ȳ^#_n(t) − ⌊nt⌋/n ) dt = (1 + o(1)) √m ∫_0^{1+n^{−1}} √(n/m) ( Ȳ^#_n(t) − ⌊nt⌋/n ) dt
⇒ √α ∫_0^1 B^#_br(t) dt.

Similarly, (1/(m√n)) Σ_{j=1}^m ( Y_n(j) − (n/m) j ) ⇒ ∫_0^1 B_br(t) dt.
Also, by Corollary 3.5 and Lemma 3.6 we see that τ_n/n → τ = inf{ t : S(t) = inf_{u∈[0,1]} S(u) } jointly with the convergence in Proposition 3.1. By Proposition 2.1 in [9, Chapter VI], if t_n → t, f_n → f in the J1 topology, and f(t) = f(t−), then f_n(t_n) → f(t). Therefore,

(n/(m√n)) ( Y^#_n(τ_n) − (m/n) τ_n ) = √n ( Ȳ^#_n(τ_n/n) − τ_n/n ) ⇒ √α B^#_br(τ)

where we used the previous observation with t_n = τ_n/n and f_n = √n ( Ȳ^#_n(t) − t ) ⇒ √α B^#_br(t). The other term is analogous. Indeed,

(1/√n) ( Y_n ◦ Y^#_n(τ_n) − (n/m) Y^#_n(τ_n) ) = √n ( Ȳ_n ◦ Ȳ^#_n(τ_n/n) − Ȳ^#_n(τ_n/n) ).

By Proposition 3.1, Ȳ^#_n(t) ⇒ t locally uniformly in t and, in combination with [4, pg 146], we have in the J1 topology

√n ( Ȳ_n ◦ Ȳ^#_n − Ȳ^#_n ) ⇒ B_br.

The stated claim now easily follows.
Proof of Proposition 3.10. Using Lemma 3.12, we have

(1/(m√n)) W_{n,m} =^d ( (1/(m√n)) Σ_{j=0}^n ( Y^#_n(j) − (m/n) j ) − (n/(m√n)) ( Y^#_n(τ_n) − (m/n) τ_n )
+ (1/(m√n)) Σ_{j=0}^m ( Y_n(j) − (n/m) j ) − (1/√n) ( Y_n ◦ Y^#_n(τ_n) − (n/m) Y^#_n(τ_n) ) + o(1) ) | A_{n,m}.

By Lemma 3.13,

(1/(m√n)) W_{n,m} ⇒ ∫_0^1 ( √α B^#_br(t) + B_br(t) ) dt − √α B^#_br(τ) − B_br(τ)
= ∫_0^1 ( S(t) − S(τ) ) dt =^d √(1 + α) ∫_0^1 B_ex(t) dt.

The convergence of moments follows easily from the uniform bound in L^p in Lemma 3.13 and, for example, Theorem 3.5 and equation (3.18) in [4].
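Proposition 3.10 can be probed numerically. A uniform tree in G_{n,m}(0) is a uniform spanning tree of K_{n,m}, which the Aldous-Broder algorithm samples exactly (our choice of sampler; it is not used in the paper). The sketch below compares the empirical mean of W_{n,m}/(m√n) at n = m = 100 with the limiting mean √(1 + α) E[∫_0^1 B_ex(t) dt] = √2 √(π/8) ≈ 0.886; the agreement is only approximate at this finite size.

```python
import math
import random
from collections import deque

def uniform_bipartite_tree(n, m, rng):
    """Aldous-Broder: the first-entrance edges of a simple random walk
    on K_{n,m} form a uniform random spanning tree."""
    whites = [("w", i) for i in range(n)]
    blacks = [("b", j) for j in range(m)]
    current, seen, edges = ("w", 0), {("w", 0)}, []
    while len(seen) < n + m:
        nxt = rng.choice(blacks if current[0] == "w" else whites)
        if nxt not in seen:
            seen.add(nxt)
            edges.append((current, nxt))
        current = nxt
    return edges

def W_of_tree(n, m, tree_edges):
    # W(T) from (2.3) via the breadth-first child counts of Section 2.
    adj = {}
    for a, b in tree_edges:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    queue, seen = deque([("w", 0)]), {("w", 0)}
    chi = {"w": [], "b": []}
    while queue:
        v = queue.popleft()
        kids = sorted(u for u in adj[v] if u not in seen)
        chi[v[0]].append(len(kids))
        seen.update(kids)
        queue.extend(kids)
    def sum_prefix(cs):
        total, run = 0, 0
        for c in cs:
            total += run
            run += c
        return total
    return -m * (n - 1) + sum_prefix(chi["w"]) + sum_prefix(chi["b"])

rng = random.Random(11)
n = m = 100  # alpha = 1
samples = [W_of_tree(n, m, uniform_bipartite_tree(n, m, rng)) / (m * math.sqrt(n))
           for _ in range(300)]
mean = sum(samples) / len(samples)
limit = math.sqrt(2) * math.sqrt(math.pi / 8)  # sqrt(1 + alpha) E[int B_ex]
print(mean, limit)  # only rough agreement is expected at this size
```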

3.5 Proof of Theorem 1.1

The proof of Theorem 1.1 is now straightforward. Propositions 2.5 and 3.10 imply

#G_{n,m}(k)/#G_{n,m}(0) = E[ \binom{W_{n,m}}{k} ] ∼ (m^k n^{k/2}/k!) (1 + α)^{k/2} E[ ( ∫_0^1 B_ex(t) dt )^k ] = (1 + α)^{k/2} ρ_k m^k n^{k/2}.

Multiplying by #G_{n,m}(0) = n^{m−1} m^{n−1} gives the desired result.

References
[1] L. Addario-Berry, N. Broutin, and C. Goldschmidt, The continuum limit of critical ran-
dom graphs, Probab. Theory Related Fields 152 (2012), no. 3-4, 367–406. MR 2892951

[2] Osvaldo Angtuncio and Gerónimo Uribe Bravo, On the profile of trees with a given
degree sequence, arXiv e-prints (2020), arXiv:2008.12242.

[3] Jean Bertoin, Eternal additive coalescents and certain bridges with exchangeable increments, Annals of Probability (2001), 344–360.

[4] Patrick Billingsley, Convergence of probability measures, second ed., Wiley Series in
Probability and Statistics: Probability and Statistics, John Wiley & Sons, Inc., New
York, 1999, A Wiley-Interscience Publication. MR 1700749

[5] David Clancy, Jr., Near-critical bipartite configuration models and their associated in-
tersection graphs, arXiv e-prints (2024), arXiv:2410.11975.

[6] Tuan Anh Do, Joshua Erde, Mihyun Kang, and Michael Missethan, Component be-
haviour and excess of random bipartite graphs near the critical point, Electron. J. Com-
bin. 30 (2023), no. 3, Paper No. 3.7, 53. MR 4614541

[7] Lorenzo Federico, Critical scaling limits of the random intersection graph, arXiv preprint
arXiv:1910.13227 (2019).

[8] Taro Hasui, Tomoyuki Shirai, and Satoshi Yabuoku, Enumeration of connected bipartite graphs with given Betti number, arXiv preprint arXiv:2208.03996 (2022).

[9] Jean Jacod and Albert Shiryaev, Limit theorems for stochastic processes, vol. 288,
Springer Science & Business Media, 2013.
[10] Svante Janson, Brownian excursion area, Wright’s constants in graph enumeration, and
other Brownian areas, Probab. Surv. 4 (2007), 80–145. MR 2318402
[11] Götz Kersting, On the Height Profile of a Conditioned Galton-Watson Tree, arXiv e-
prints (2011), arXiv:1101.3656.
[12] Jean-François Le Gall, Random trees and applications, Probab. Surv. 2 (2005), 245–311.
MR 2203728
[13] Jean-François Marckert, One more approach to the convergence of the empirical process
to the Brownian bridge, Electron. J. Stat. 2 (2008), 118–126. MR 2386089
[14] Soumik Pal, Brownian approximation to counting graphs, SIAM J. Discrete Math. 26
(2012), no. 3, 1181–1188. MR 3022133
[15] J. Pitman, Combinatorial stochastic processes, Lecture Notes in Mathematics, vol. 1875,
Springer-Verlag, Berlin, 2006, Lectures from the 32nd Summer School on Probability
Theory held in Saint-Flour, July 7–24, 2002, With a foreword by Jean Picard. MR
2245368
[16] Hubert Ian Scoins, The number of trees with nodes of alternate parity, Mathematical
Proceedings of the Cambridge Philosophical Society, vol. 58, Cambridge University
Press, 1962, pp. 12–16.
[17] Joel Spencer, Enumerating graphs and Brownian motion, Communications on Pure and Applied Mathematics 50 (1997), no. 3, 291–294.
[18] Wim Vervaat, Functional central limit theorems for processes with positive drift and
their inverses, Z. Wahrscheinlichkeitstheorie und Verw. Gebiete 23 (1972), 245–253.
MR 321164
[19] , A relation between Brownian bridge and Brownian excursion, Ann. Probab. 7
(1979), no. 1, 143–149. MR 515820
[20] Minmin Wang, Large random intersection graphs inside the critical window and triangle
counts, arXiv preprint arXiv:2309.13694 (2023).
[21] Ward Whitt, Some useful functions for functional limit theorems, Mathematics of op-
erations research 5 (1980), no. 1, 67–85.
[22] , Stochastic-process limits, Springer Series in Operations Research, Springer-
Verlag, New York, 2002, An introduction to stochastic-process limits and their applica-
tion to queues. MR 1876437
[23] E. M. Wright, The number of connected sparsely edged graphs, J. Graph Theory 1 (1977),
no. 4, 317–330. MR 463026
