Matrix Tree Theorem
The problem that we’ll discuss is counting the number of spanning trees in a graph.
We’ll actually need the weighted version, so for a graph where edge $e$ has weight $w_e$, the weight of a tree is
$$w(T) \stackrel{\text{def}}{=} \prod_{e \in T} w_e.$$
For example, the triangle graph with edge weights 1, 2, and 3 has total spanning tree weight
$$1 \times 2 + 2 \times 3 + 3 \times 1 = 2 + 6 + 3 = 11,$$
since each spanning tree keeps two of the three edges.
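This total can be recomputed by brute force; below is a minimal sketch in Python (the vertex labels and edge-list encoding are mine, chosen to match the example):

```python
from itertools import combinations
from math import prod

# Hypothetical encoding of the example: triangle on vertices 0, 1, 2
# with edge weights 1, 2, 3, stored as (u, v, weight) triples.
edges = [(0, 1, 1), (0, 2, 2), (1, 2, 3)]
n = 3

def is_spanning_tree(subset, n):
    """n-1 edges span iff they join all n vertices without a cycle (union-find)."""
    parent = list(range(n))
    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x
    for u, v, _ in subset:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False  # cycle
        parent[ru] = rv
    return True

# total weight: sum over spanning trees of the product of edge weights
total = sum(prod(w for _, _, w in s)
            for s in combinations(edges, n - 1)
            if is_spanning_tree(s, n))
print(total)  # 1*2 + 2*3 + 3*1 = 11
```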
Counting spanning trees is related to generating a random spanning tree, which is in turn used in approximation algorithms. This connection between determinants and combinatorial objects also has a variety of uses in the study of other combinatorial structures.
We will show that the number of spanning trees can be computed using the determinant of a matrix, which also underlies many other things in spectral graph theory:
Definition 0.1. Given an undirected graph $G$ on $n$ vertices, define its graph Laplacian as an $n \times n$ matrix $L$ with entries given by:
$$L_{uv} \stackrel{\text{def}}{=} \begin{cases} \sum_{x \neq u} w_{ux} & \text{if } u = v, \\ -w_{uv} & \text{if } u \neq v \text{ and } uv \in E, \\ 0 & \text{otherwise.} \end{cases}$$
The matrix-tree theorem states that the determinant of the first $n-1$ rows/columns of $L$ gives the total weight of all spanning trees.

Theorem 0.2 (Matrix-Tree Theorem). Let $L_{1:n-1,1:n-1}$ be the minor containing the first $n - 1$ rows and columns of $L$, and let $T(G)$ denote the total weight of spanning trees of $G$. We have:
$$T(G) = \det(L_{1:n-1,1:n-1}).$$
For the triangle graph above with edge weights 1, 2, and 3, the graph Laplacian is:
$$L = \begin{pmatrix} 3 & -1 & -2 \\ -1 & 4 & -3 \\ -2 & -3 & 5 \end{pmatrix}.$$
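The theorem can be checked on this example; here is a small sketch (the edge-list encoding is mine):

```python
# Sketch: build the Laplacian of the weighted triangle and apply Theorem 0.2.
edges = [(0, 1, 1), (0, 2, 2), (1, 2, 3)]  # hypothetical (u, v, weight) encoding
n = 3
L = [[0] * n for _ in range(n)]
for u, v, w in edges:
    L[u][u] += w  # diagonal: weighted degree
    L[v][v] += w
    L[u][v] -= w  # off-diagonal: -w_uv
    L[v][u] -= w

def det(M):
    """Determinant by cofactor expansion along the first row (fine for tiny M)."""
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([r[:j] + r[j + 1:] for r in M[1:]])
               for j in range(len(M)))

minor = [row[:n - 1] for row in L[:n - 1]]  # first n-1 rows and columns
print(det(minor))  # det([[3, -1], [-1, 4]]) = 12 - 1 = 11
```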
We can compute the determinant of the graph Laplacian by Gaussian elimination: pick a vertex, and eliminate it by adding multiples of its row/column to cancel the non-zero off-diagonal entries in that row/column. For the matrix
$$\begin{pmatrix} 3 & -1 & -2 \\ -1 & 4 & -3 \\ -2 & -3 & 5 \end{pmatrix},$$
we can remove the $-1$ in row 2, column 1 by adding $1/3$ copies of the first row to the second, giving:
$$\begin{pmatrix} 3 & -1 & -2 \\ 0 & 11/3 & -11/3 \\ -2 & -3 & 5 \end{pmatrix},$$
and the $-2$ can be removed similarly by adding $2/3$ copies of the first row to the third, giving:
$$\begin{pmatrix} 3 & -1 & -2 \\ 0 & 11/3 & -11/3 \\ 0 & -11/3 & 11/3 \end{pmatrix}.$$
After we do this, we can remove the $-1$ and $-2$ in row 1 for free by adding $1/3$ and $2/3$ copies of the first column to the second and third columns respectively; since the first column is now zero below the diagonal, nothing else changes. This leads to the matrix
$$\begin{pmatrix} 3 & 0 & 0 \\ 0 & 11/3 & -11/3 \\ 0 & -11/3 & 11/3 \end{pmatrix}.$$
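This elimination step can be expressed as one matrix computation; a sketch with exact arithmetic, using the example Laplacian:

```python
from fractions import Fraction

# Example Laplacian of the weighted triangle, with exact arithmetic.
L = [[Fraction(x) for x in row]
     for row in [[3, -1, -2], [-1, 4, -3], [-2, -3, 5]]]
n = 3

# Eliminating vertex 0 leaves the block S[i][j] = L[i][j] - L[i][0]*L[0][j]/L[0][0].
S = [[L[i][j] - L[i][0] * L[0][j] / L[0][0] for j in range(1, n)]
     for i in range(1, n)]
print(S)  # [[11/3, -11/3], [-11/3, 11/3]]: the Laplacian of one edge of weight 11/3
```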
Note that the bottom-right $2 \times 2$ matrix is still a graph Laplacian (of a single edge of weight $11/3$). For a formal proof that graph Laplacians are closed under Schur complements, the following characterization is useful:

Lemma 0.4. A matrix is a graph Laplacian if and only if:
• It’s symmetric.
• Its off-diagonal entries are non-positive.
• All of its row sums are zero.
This way of removing a vertex generates what’s called a Schur complement. This has a slightly messy algebraic definition, $\text{Schur}(L, u) = L_{-u,-u} - L_{-u,u} L_{uu}^{-1} L_{u,-u}$ (where $-u$ denotes the remaining indices), but can also be defined as the matrix formed by eliminating vertices as above.
Note that in the above example, the determinant of the first $n - 1 = 2$ rows and columns is still
$$3 \times \frac{11}{3} = 11.$$
This, plus the fact that we have the matrix-tree theorem in the $n = 2$ case, suggests an inductive approach: we repeatedly remove vertices, and show that the Schur complement formed by removing a vertex has a tree count related to the tree count of the original graph.
Lemma 0.5. Let $\text{Schur}(G, u)$ be the graph obtained from $G$ by pivoting out a vertex $u$ with weighted degree $d(u)$. Then
$$T(G) = d(u) \cdot T(\text{Schur}(G, u)).$$
This Lemma implies the matrix tree theorem because we can now do induction on the
value of n, and apply the matrix tree theorem on the smaller Schur(G, u).
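The induction can be carried out directly in code; a sketch assuming Lemma 0.5 and a connected graph (so every pivot is non-zero; the function name is mine):

```python
from fractions import Fraction

def tree_weight(L):
    """Total spanning tree weight via repeated pivoting: T(G) = d(u) * T(Schur(G, u)).

    Base case n = 1: a single vertex has exactly one (empty) spanning tree.
    Assumes the graph is connected, so each pivot d is non-zero.
    """
    n = len(L)
    if n == 1:
        return Fraction(1)
    d = L[0][0]  # weighted degree of the pivoted vertex u
    S = [[L[i][j] - L[i][0] * L[0][j] / d for j in range(1, n)]
         for i in range(1, n)]  # Schur complement onto the remaining vertices
    return d * tree_weight(S)

L = [[Fraction(x) for x in r] for r in [[3, -1, -2], [-1, 4, -3], [-2, -3, 5]]]
print(tree_weight(L))  # 3 * (11/3) * 1 = 11
```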
To show this inductive step, it suffices to show a bijection between trees in $\text{Schur}(G, u)$ and trees in $G$. There are two kinds of edges in $\text{Schur}(G, u)$:
• Those added from the clique formed by pivoting out $u$; we’ll denote these with $K$.
• Those already present in $G$ and not incident to $u$; these form the graph $H = G \setminus u$.
As a sanity check, suppose the chosen forest of $H$-edges has only 1 connected component; then this tree is done if we’re working in $H$. But in $G$, we still need to connect $u$ to one of these vertices, and the total weight of ways of doing this is $d(u)$.
In the remaining time (if there is any), we will give a proof sketch of this Lemma:
• Any forest of $H$-edges partitions the vertices into connected components; call them $S_1, \ldots, S_t$.
• For clique edges on the other hand, the total weight of edges between $S_i$ and $S_j$ is
$$\frac{w(S_i) \cdot w(S_j)}{d(u)},$$
where $w(S_i) = \sum_{x \in S_i} w_{ux}$ is the total weight of edges from $u$ to $S_i$ (so $\sum_i w(S_i) = d(u)$).
• This is a product demand graph. So by proofs like Prüfer sequences, it can be shown that the total weight of spanning trees there is
$$d(u)^{t-1} \prod_{i=1}^{t} \frac{w(S_i)}{d(u)} = \frac{1}{d(u)} \prod_i w(S_i).$$
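This last formula can be sanity-checked by brute force; a sketch for hypothetical component weights $w(S_i) = 1, 2, 3$ (so $d(u) = 6$):

```python
from fractions import Fraction
from itertools import combinations
from math import prod

ws = [Fraction(x) for x in (1, 2, 3)]  # hypothetical w(S_i) values
d = sum(ws)                            # d(u) = sum_i w(S_i) = 6
t = len(ws)

# Clique edge between components i and j has weight w(S_i) * w(S_j) / d(u).
edges = [(i, j, ws[i] * ws[j] / d) for i in range(t) for j in range(i + 1, t)]

def spans(subset):
    """t-1 edges span iff they join all t components without a cycle (union-find)."""
    parent = list(range(t))
    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x
    for u, v, _ in subset:
        ru, rv = find(u), find(v)
        if ru == rv:
            return False
        parent[ru] = rv
    return True

brute = sum(prod(w for _, _, w in s)
            for s in combinations(edges, t - 1) if spans(s))
formula = prod(ws) / d  # (1/d(u)) * prod_i w(S_i)
print(brute, formula)  # both equal 1*2*3/6 = 1
```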