lecture_06
October 9, 2020
Warning: This material is not meant to be lecture notes. It only gathers the main concepts
and results from the lecture, without any additional explanation, motivation, examples, figures...
The set Eλ (A) = {x ∈ Rn : Ax = λx} is called the eigenspace of A associated to λ. The dimension of Eλ (A) is called the multiplicity of the eigenvalue λ.
Remark 1.1. Notice that Eλ (A) is a subspace of Rn : any (non-zero) linear combination of
eigenvectors associated with the eigenvalue λ is also an eigenvector of A associated with λ.
Remark 1.2. Definition 1.1 can be generalized to allow complex eigenvalues and eigenvectors:
λ ∈ C and v ∈ Cn . However in this course, we only consider real eigenvalues and eigenvec-
tors.
Example 1.1. For λ1 , . . . , λn ∈ R, we introduce the notation
$$\mathrm{Diag}(\lambda_1, \dots, \lambda_n) \stackrel{\text{def}}{=} \begin{pmatrix} \lambda_1 & 0 & \cdots & 0 \\ 0 & \lambda_2 & \ddots & \vdots \\ \vdots & \ddots & \ddots & 0 \\ 0 & \cdots & 0 & \lambda_n \end{pmatrix} \in \mathbb{R}^{n \times n}.$$
For the i-th vector ei of the canonical basis, we compute
$$\mathrm{Diag}(\lambda_1, \dots, \lambda_n)\, e_i = \lambda_i e_i.$$
Hence the vector ei is an eigenvector of the matrix Diag(λ1 , . . . , λn ) associated with the eigenvalue λi .
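As a quick numerical sanity check (not part of the lecture), the identity Diag(λ1 , . . . , λn ) ei = λi ei can be verified with NumPy; the eigenvalues below are arbitrary:

```python
import numpy as np

# Arbitrary eigenvalues for this illustration
lams = [2.0, 5.0, 7.0]
D = np.diag(lams)  # the matrix Diag(lam_1, ..., lam_n)

n = len(lams)
for i in range(n):
    e_i = np.zeros(n)
    e_i[i] = 1.0                                # i-th canonical basis vector
    assert np.allclose(D @ e_i, lams[i] * e_i)  # Diag(...) e_i = lam_i e_i
print("all eigenvector identities hold")
```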
Proposition 1.1
Let A ∈ Rn×n . Suppose that A has an eigenvalue λ ∈ R and let x ∈ Rn be an eigenvector
associated to λ. The following holds:
Definition 1.2
The set of all eigenvalues of A is called the spectrum of A and denoted by Sp(A).
Proposition 1.2
Let v1 , . . . , vk be eigenvectors of A corresponding (respectively) to the eigenvalues λ1 , . . . , λk .
If the λi are all distinct (λi ≠ λj for all i ≠ j) then the vectors v1 , . . . , vk are linearly
independent.
$$\mathbb{P}(X_{t+1} = y \mid X_0 = x_0, \dots, X_t = x_t) = P(x_t, y), \qquad y \in E.$$
In order to simplify the notation, we will assume that E = {1, 2, . . . , n} and write, for all
i, j ∈ E, Pi,j = P (j, i). Note that we switched here the order of i and j. This is not
what is usually done in the literature, but it will allow us to be more coherent
with our linear algebra framework. Such a matrix is said to be stochastic:
Definition 2.2 (Stochastic matrix)
A matrix P ∈ Rn×n is said to be stochastic if its entries are non-negative and each of its columns sums to 1: Pi,j ≥ 0 for all i, j ∈ E and P1,j + · · · + Pn,j = 1 for all j.
Let (X0 , X1 , . . . ) be a Markov chain on {1, . . . , n} with transition matrix P . For t ≥ 0 we will
encode the distribution of Xt in the vector
$$x^{(t)} = \big(x^{(t)}_1, \dots, x^{(t)}_n\big) = \big(\mathbb{P}(X_t = 1), \dots, \mathbb{P}(X_t = n)\big) \in \Delta_n .$$
Proposition 2.1
For all t ≥ 0
x(t+1) = P x(t) and consequently, x(t) = P t x(0) .
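To illustrate Proposition 2.1 numerically, one can iterate x(t+1) = P x(t) on a small example; the 2 × 2 matrix below is an arbitrary column-stochastic matrix (columns summing to 1, following the convention of these notes):

```python
import numpy as np

# An arbitrary stochastic matrix: each *column* sums to 1
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])

x = np.array([1.0, 0.0])  # x^(0): the chain starts in state 1
for t in range(50):
    x = P @ x             # Proposition 2.1: x^(t+1) = P x^(t)

# x^(t) remains a probability vector for all t
assert np.isclose(x.sum(), 1.0)
print(x)  # approaches the invariant measure (2/3, 1/3) of this P
```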
Corollary 2.1
Let P be a stochastic matrix. Then
• For all x ∈ ∆n , P x ∈ ∆n .
Theorem 2.1 (Perron-Frobenius, stochastic case)
Let P be a stochastic matrix such that there exists k ≥ 1 such that all the entries of P k are
strictly positive. Then the following holds:
(i) there exists a unique µ ∈ ∆n such that P µ = µ;
(ii) if x ∈ Rn satisfies P x = x, then x ∈ Span(µ);
(iii) for all x ∈ ∆n , P t x → µ as t → ∞.
Theorem 2.1 is proved in the next section. Theorem 2.1 tells us that there is a unique µ ∈ ∆n
such that P µ = µ. We call µ the Perron-Frobenius eigenvector of P .
Remark 2.1. There exists a stronger version of the Perron-Frobenius Theorem which does not
require the columns of P to sum to 1; see for instance Theorem 1.1 in [2]. The proof is however
more involved.
Corollary 2.2
Let P be a stochastic matrix such that there exists k ≥ 1 such that all the entries of P k are
strictly positive. Then there exists a unique invariant measure µ and, for every initial condition
x(0) ∈ ∆n ,
$$x^{(t)} = P^t x^{(0)} \xrightarrow[t \to \infty]{} \mu.$$
Corollary 2.2 tells us that the Markov chain “forgets” its initial condition to converge to its
invariant measure µ. We say that the chain is “mixing”.
Working a little more, one can prove the ergodic theorem, which states that µi corresponds
to the asymptotic fraction of time spent by the Markov chain in state i.
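This can be observed in a simulation (a sketch with an arbitrary two-state chain, not part of the lecture): the empirical fraction of time spent in each state approaches the invariant measure µ.

```python
import random

import numpy as np

# Arbitrary column-stochastic transition matrix; its invariant
# measure is mu = (2/3, 1/3) (check: P mu = mu)
P = np.array([[0.9, 0.2],
              [0.1, 0.8]])
mu = np.array([2 / 3, 1 / 3])

random.seed(0)
state, T = 0, 200_000
counts = np.zeros(2)
for _ in range(T):
    counts[state] += 1
    # column `state` of P gives the law of the next state
    state = 0 if random.random() < P[0, state] else 1

print(counts / T)  # close to mu = (2/3, 1/3)
```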
Proof. First notice that ϕ is well-defined by Corollary 2.1. Let us write α := mini,j Pi,j ∈ (0, 1).
Let x, y ∈ ∆n . We will show that ‖P x − P y‖1 ≤ (1 − α)‖x − y‖1 , i.e. ‖P z‖1 ≤ (1 − α)‖z‖1 where
z = x − y. Compute
$$\|Pz\|_1 = \sum_{i=1}^n |(Pz)_i| = \sum_{i=1}^n \Big|\sum_{j=1}^n P_{i,j}\, z_j\Big|.$$
Since x, y ∈ ∆n we have z1 + · · · + zn = 0, so subtracting the constant α/n from each Pi,j does not change the inner sum; moreover Pi,j − α/n ≥ 0 since Pi,j ≥ α. Hence
$$\|Pz\|_1 = \sum_{i=1}^n \Big|\sum_{j=1}^n (P_{i,j} - \alpha/n)\, z_j\Big| \le \sum_{i=1}^n \sum_{j=1}^n (P_{i,j} - \alpha/n)\, |z_j| = \sum_{j=1}^n (1 - \alpha)\, |z_j| = (1 - \alpha)\, \|z\|_1 .$$
Now let µ be a minimizer of x ↦ ‖P x − x‖1 on ∆n .
• Let us now prove (ii). Let x ∈ Rn be such that P x = x. Then for all t ≥ 1:
$$x = P^t x = x_1 P^t e_1 + \cdots + x_n P^t e_n \xrightarrow[t \to \infty]{} x_1 \mu + \cdots + x_n \mu = (x_1 + \cdots + x_n)\, \mu \in \mathrm{Span}(\mu),$$
and since the left-hand side does not depend on t, we conclude that x ∈ Span(µ).
In the case k > 1 we simply apply the result for k = 1 to P k . This gives that there exists
a unique µ ∈ ∆n such that P k µ = µ. Multiplying by P on both sides leads to P k (P µ) = P µ.
Since P µ ∈ ∆n we obtain that P µ = µ by uniqueness of µ. This proves (i). To prove (ii) we
consider x ∈ Rn such that P x = x. By iteration we get P k x = x which implies (using the result
on P k ) that x ∈ Span(µ). To prove (iii) we fix ℓ ∈ {0, . . . , k − 1}. Let x ∈ ∆n . By applying
point (iii) to P k (note that P ℓ x ∈ ∆n ), we have
$$P^{kt} P^{\ell} x \xrightarrow[t \to \infty]{} \mu.$$
Since this holds for all ℓ ≤ k − 1, we obtain that P r x → µ as r → ∞, using the Euclidean division of r by k.
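The key estimate of the proof, ‖P x − P y‖1 ≤ (1 − α)‖x − y‖1 , is easy to check numerically (a sketch on random strictly positive stochastic matrices; the size and the bounds are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5
P = rng.uniform(0.1, 1.0, size=(n, n))
P /= P.sum(axis=0)   # normalize each column: P is stochastic
alpha = P.min()      # alpha = min_{i,j} P_{i,j} > 0

for _ in range(100):
    x = rng.uniform(size=n); x /= x.sum()  # x in Delta_n
    y = rng.uniform(size=n); y /= y.sum()  # y in Delta_n
    lhs = np.abs(P @ x - P @ y).sum()      # ||Px - Py||_1
    rhs = (1 - alpha) * np.abs(x - y).sum()
    assert lhs <= rhs + 1e-12
print("contraction inequality verified on 100 random pairs")
```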
The assumption
« there exists k ≥ 1 such that all the entries of P k are strictly positive »
is needed in the Perron-Frobenius Theorem. Consider for instance the transition matrix
$$P = \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix}.$$
This matrix does not verify the condition above since we have, for all k ≥ 1:
$$P^k = \begin{cases} \mathrm{Id}_2 & \text{if } k \text{ is even,} \\ P & \text{otherwise.} \end{cases}$$
Even though P admits an invariant measure µ = (1/2, 1/2), we see that for x0 = (1, 0), P t x0
alternates between (1, 0) and (0, 1) and therefore does not converge to µ as t → ∞. Hence (iii) of Theorem 2.1 does not hold.
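This oscillation is immediate to see numerically (a small sketch):

```python
import numpy as np

P = np.array([[0.0, 1.0],
              [1.0, 0.0]])
x = np.array([1.0, 0.0])  # x0 = (1, 0)

for t in range(6):
    print(t, x)  # alternates between (1, 0) and (0, 1)
    x = P @ x

# P^t x0 never converges to mu = (1/2, 1/2)
```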
where deg(j) denotes the number of outgoing links on page j. The idea behind PageRank is to
quantify the importance of a page i by the fraction of time spent by the “drunk surfer” on it. By
Theorem 2.2 we know that this corresponds to the coefficient µi of the invariant measure µ of P .
The matrix P is however not guaranteed to satisfy the hypotheses of Theorem 2.2-Corollary 2.2.
Brin and Page proposed to use instead of P the matrix
$$G = \alpha P + \frac{1 - \alpha}{n}\, \mathbf{1},$$
where α ∈ (0, 1) is a parameter close to 1 (Google takes α ≈ 0.85) and 1 denotes the all-one
matrix. The PageRank algorithm computes the Perron-Frobenius eigenvector µ of the matrix G
and ranks the webpages according to their coordinates in the vector µ: the higher µi , the better
page i is ranked.
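A minimal sketch of this computation on a hypothetical 4-page web (the link structure below is invented for illustration; only the damping value α = 0.85 comes from the text):

```python
import numpy as np

# Hypothetical link structure: page j links to the pages in links[j]
links = {0: [1, 2], 1: [2], 2: [0], 3: [0, 2]}
n = 4

# Column convention of the notes: P[i, j] = 1/deg(j) if j links to i
P = np.zeros((n, n))
for j, outs in links.items():
    for i in outs:
        P[i, j] = 1.0 / len(outs)

alpha = 0.85
G = alpha * P + (1 - alpha) / n   # G = alpha P + (1-alpha)/n * (all-one matrix)

mu = np.full(n, 1.0 / n)          # start from the uniform distribution
for _ in range(200):
    mu = G @ mu                   # power iteration: mu <- G mu

assert np.isclose(mu.sum(), 1.0)
print(np.argsort(-mu))  # pages sorted by decreasing PageRank score
```

After enough iterations, µ is (numerically) the fixed point G µ = µ, i.e. the Perron-Frobenius eigenvector of G.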
Federer, Nadal, Djokovic, Murray, Del Potro, Roddick, Coria, Zverev, Ferrer, Soderling, Tsonga,
Nishikori, Raonic, Nalbandian, Wawrinka, Berdych, Hewitt, Tsitsipas, Monfils, Gonzalez,
Thiem, Ljubicic, Davydenko, Cilic, Pouille, Safin, Isner, Dimitrov, Medvedev, Ferrero, Goffin,
Bautista Agut, Sock, Gasquet, Simon, Blake, Monaco, Coric, Stepanek, Khachanov, Almagro,
Robredo, Verdasco, Anderson, Youzhny, Baghdatis, Dolgopolov, Kohlschreiber, Fognini, Melzer,
Paire, Querrey, Tomic, Basilashvili.
6
To do so, we have access to the “head to head” record between them (see Figure 2) in the
form of the matrix R ∈ Rn×n , where Ri,j denotes the number of games won by player i against player j.
We will use the approach of the previous section to rank the players. In our case, instead
of a “drunk surfer” we will consider a “drunk spectator”. At time t the value Xt ∈ {1, . . . , n}
indicates which tennis player the spectator believes to be the best. At time t + 1, the spectator
picks uniformly at random a game played by his favorite player Xt against one of the other players,
x. If the game was won by Xt , then the spectator still believes that Xt is the best: Xt+1 = Xt .
Otherwise the spectator changes his mind: Xt+1 = x.
This can be modeled by a Markov chain with transition matrix:
$$P_{i,j} = \begin{cases} V_j / G_j & \text{if } i = j, \\ R_{i,j} / G_j & \text{otherwise,} \end{cases}$$
where Vj denotes the total number of victories of player j and where Gj denotes the total number
of games played by j:
$$V_j = \sum_{i=1}^n R_{j,i} \qquad \text{and} \qquad G_j = \sum_{i=1}^n \big(R_{i,j} + R_{j,i}\big).$$
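The construction of P from R and the computation of its invariant measure can be sketched as follows; the 3 × 3 head-to-head matrix R below is invented for illustration:

```python
import numpy as np

# Hypothetical head-to-head matrix: R[i, j] = games won by i against j
R = np.array([[0, 6, 4],
              [2, 0, 5],
              [1, 3, 0]], dtype=float)
n = R.shape[0]

V = R.sum(axis=1)      # V[j] = total number of victories of player j
G = R.sum(axis=0) + V  # G[j] = total number of games played by j

P = R / G                   # off-diagonal: P[i, j] = R[i, j] / G[j]
np.fill_diagonal(P, V / G)  # diagonal: P[j, j] = V[j] / G[j]

assert np.allclose(P.sum(axis=0), 1.0)  # P is column-stochastic

mu = np.full(n, 1.0 / n)
for _ in range(500):
    mu = P @ mu        # power iteration towards the invariant measure

print(np.argsort(-mu))  # players ranked by decreasing score
```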
Federer (14.4%), Djokovic (13.7%), Nadal (13.6%), Murray (5.8%), Ferrer (3.0%), Del Potro
(2.8%), Berdych (2.5%), Roddick (2.4%), Wawrinka (2.3%), Tsonga (2.1%), Nishikori (1.6%),
Nalbandian (1.6%), Hewitt (1.5%), Monfils (1.5%), Davydenko (1.5%), Cilic (1.4%), Soderling
(1.4%), Verdasco (1.2%), Gonzalez (1.2%), Raonic (1.2%), Ljubicic (1.2%), Gasquet (1.2%),
Simon (1.1%), Thiem (1.1%), Isner (1.0%), Zverev (1.0%), Youzhny (1.0%), Robredo (0.9%),
Kohlschreiber (0.9%), Ferrero (0.9%), Stepanek (0.8%), Safin (0.8%), Dimitrov (0.8%), Almagro
(0.7%), Baghdatis (0.7%), Blake (0.7%), Anderson (0.7%), Goffin (0.7%), Coria (0.7%),
Bautista Agut (0.6%), Monaco (0.6%), Fognini (0.6%), Querrey (0.6%), Melzer (0.6%),
Dolgopolov (0.5%), Coric (0.5%), Pouille (0.4%), Tsitsipas (0.4%), Sock (0.4%), Paire (0.3%),
Medvedev (0.3%), Khachanov (0.3%), Tomic (0.2%), Basilashvili (0.1%).
References
[1] Lawrence Page, Sergey Brin, Rajeev Motwani, and Terry Winograd. The PageRank citation
ranking: bringing order to the web. Technical report, Stanford InfoLab, 1999.
[2] Eugene Seneta. Non-negative matrices and Markov chains. Springer Science & Business
Media, 2006.
Figure 1: Comparison of the ranking by the percentage of wins (on the left, panel (a)) and
the ranking using PageRank (on the right, panel (b)). [Bar charts not reproduced here; each
panel lists the 54 players above.]
Figure 2: head-to-head matrix R between the 54 players (names abbreviated to their first two letters).
Fe Na Dj Mu De Ro Co Zv Fe So Ts Ni Ra Na Wa Be He Ts Mo Go Th Lj Da Ci Po Sa Is Di Me Fe Go Ba So Ga Si Bl Mo Co St Kh Al Ro Ve An Yo Ba Do Ko Fo Me Pa Qu To Ba
Fe 0 15 22 14 18 21 3 3 17 16 11 7 11 11 23 20 18 1 10 12 2 13 19 9 1 10 6 7 3 10 7 8 4 18 7 10 4 4 14 1 5 11 7 6 17 7 5 14 4 4 7 3 4 1
Na 24 0 26 17 11 7 4 5 26 6 8 11 7 5 18 20 7 4 14 7 9 7 5 6 2 2 7 12 0 7 4 3 4 16 8 4 7 2 7 6 15 7 17 5 13 9 7 15 11 3 4 4 3 3
Dj 25 28 0 25 16 4 2 3 16 6 17 16 9 4 19 25 6 1 15 1 6 7 6 17 1 0 9 8 3 2 5 7 1 13 11 3 8 3 13 1 5 7 11 8 7 8 6 10 8 3 1 8 6 2
Mu 11 7 11 0 7 8 0 1 14 3 14 9 9 5 11 11 1 0 4 1 2 4 6 12 4 0 8 8 0 3 6 3 0 8 16 2 5 2 7 1 5 6 13 6 4 5 4 5 4 7 2 7 5 0
De 7 6 4 3 0 4 0 2 7 4 5 6 3 1 4 5 2 1 2 2 4 1 4 11 0 1 8 6 0 1 2 3 1 7 5 2 1 1 3 3 4 2 5 7 4 4 5 7 1 5 1 3 2 1
Ro 3 3 5 3 1 0 5 0 4 2 2 1 1 4 1 6 7 0 3 9 0 7 5 1 0 4 4 1 0 5 0 0 1 3 2 9 1 0 7 0 2 11 10 2 3 3 1 4 2 10 0 6 1 0
Co 0 1 2 0 0 0 0 0 4 1 0 0 0 2 0 2 0 0 0 5 0 3 2 0 0 1 0 0 0 3 0 0 0 1 0 1 2 0 0 0 1 3 2 0 6 0 0 0 0 2 0 0 0 0
Zv 3 0 2 0 0 0 0 0 5 0 1 2 1 0 2 2 0 1 0 0 2 0 0 6 0 0 5 2 4 0 2 4 1 4 4 0 1 1 0 2 1 0 1 4 3 1 1 2 3 0 1 0 0 1
Fe 0 6 5 6 6 7 1 3 0 4 3 4 4 9 7 8 3 0 3 5 1 6 2 4 1 1 7 5 0 7 2 3 2 10 8 1 5 0 8 0 15 8 14 3 4 4 10 11 11 7 3 3 4 1
So 1 2 1 2 1 4 0 0 10 0 5 1 0 1 2 7 2 0 3 5 0 2 7 2 0 0 1 0 0 1 0 0 0 2 5 2 4 0 5 0 5 5 5 0 5 3 0 1 3 2 0 4 0 0
Ts 6 4 6 2 2 1 0 2 1 0 0 3 2 1 3 5 4 1 4 1 2 3 4 2 2 1 2 4 1 3 4 2 3 5 9 3 6 1 3 2 6 1 3 3 3 7 3 11 4 6 3 4 3 1
Ni 3 2 2 2 2 0 0 1 10 0 6 0 5 0 4 5 0 1 4 0 3 0 2 9 1 0 2 5 2 1 3 4 1 3 1 3 1 1 1 2 2 3 5 5 2 1 5 3 2 3 7 6 3 1
Ra 3 2 0 3 2 0 0 2 0 0 5 2 0 1 3 6 1 0 3 0 2 0 1 1 3 0 1 2 0 0 3 5 8 3 5 2 1 1 3 0 2 6 4 1 3 3 2 2 2 1 0 4 5 0
Na 8 2 1 2 3 2 2 0 5 6 1 1 0 0 3 4 3 0 1 5 0 4 7 4 0 3 2 1 0 4 0 0 0 7 2 0 3 0 2 0 4 6 0 0 2 2 0 2 2 1 1 0 0 0
Wa 3 3 5 8 3 3 0 0 7 2 5 7 5 6 0 11 2 1 3 0 3 3 1 12 1 3 1 5 0 3 3 1 1 1 4 3 4 3 3 2 6 3 3 5 3 6 2 5 5 2 9 5 1 0
Be 6 4 3 6 4 5 1 4 8 3 8 1 3 1 5 0 3 0 6 3 2 2 3 6 1 1 7 3 0 2 2 4 2 9 8 3 7 3 3 0 9 7 11 12 6 4 4 9 3 5 4 5 5 0
He 9 4 1 0 3 7 2 0 1 3 0 2 1 3 2 0 0 0 2 2 0 1 4 1 0 6 4 1 0 6 0 0 1 2 0 8 1 0 3 0 1 1 0 1 5 3 0 2 0 7 0 3 0 0
Ts 1 1 1 0 0 0 0 2 0 0 0 0 0 0 0 0 0 0 2 0 2 0 0 0 1 0 0 0 0 0 4 1 0 1 0 0 0 1 0 2 0 1 1 2 0 0 0 2 2 0 0 0 0 1
Mo 4 2 0 2 0 5 0 3 3 0 4 1 3 3 3 1 2 1 0 2 0 3 2 4 2 4 7 4 1 0 2 3 1 10 2 3 2 2 6 0 3 2 3 5 1 4 4 14 4 4 0 3 2 1
Go 1 3 2 2 3 3 1 0 5 4 1 0 1 3 5 4 5 0 0 0 0 4 0 1 0 6 1 0 0 3 0 0 0 1 1 7 6 0 1 0 2 3 3 1 4 1 1 0 0 2 0 2 0 0
Th 4 4 3 1 0 0 0 5 1 0 0 2 1 0 1 0 0 3 5 0 0 0 0 1 0 0 1 2 2 0 3 1 3 0 9 0 1 3 1 1 2 0 0 2 2 0 2 1 3 1 2 3 2 0
Lj 3 2 2 3 1 4 2 0 1 3 3 2 0 5 3 3 0 0 4 4 0 0 4 2 0 2 0 0 0 3 0 0 0 0 3 4 2 0 1 0 3 5 3 0 8 2 1 3 0 0 2 2 0 0
Da 2 6 2 4 3 1 1 0 4 4 2 0 0 5 2 9 0 0 2 6 0 4 0 2 0 4 3 0 0 3 1 0 0 2 3 1 4 0 7 0 1 5 7 1 4 2 2 3 4 6 1 1 0 0
Ci 1 2 2 3 2 2 0 1 2 0 6 6 2 2 2 6 1 1 0 1 0 1 3 0 2 0 8 4 0 2 3 4 0 2 1 1 2 7 1 1 2 4 10 6 5 5 2 5 3 7 4 6 3 0
Po 0 1 0 1 1 0 0 0 2 0 2 0 1 0 0 0 0 0 0 0 2 0 0 0 0 0 0 0 4 0 3 2 0 4 1 0 1 2 0 1 1 0 1 0 0 1 0 2 2 0 3 0 1 0
Sa 2 0 2 1 0 3 1 0 1 2 0 0 0 6 1 2 7 0 0 3 0 2 4 1 0 0 1 0 0 6 0 0 0 4 0 2 0 0 1 0 0 4 1 0 3 1 0 0 2 1 0 1 0 0
Is 2 0 2 0 4 2 0 1 2 0 3 1 5 1 3 2 2 2 4 1 1 0 3 3 0 1 0 2 0 0 2 3 5 2 3 3 3 1 2 0 2 3 1 8 2 8 0 4 2 2 1 3 3 0
Di 0 1 1 3 2 0 0 1 1 0 0 1 3 0 4 3 0 0 1 0 2 0 1 2 2 0 1 0 1 0 7 3 2 3 4 0 1 0 1 0 2 2 4 6 1 8 4 2 4 2 1 3 2 1
Me 0 0 1 1 0 0 0 0 0 0 2 2 2 0 1 0 0 4 1 0 0 0 0 0 0 0 0 1 0 0 1 0 2 0 0 0 0 1 0 0 0 0 2 0 0 0 0 1 1 0 1 1 0 0
Fe 3 2 1 0 2 0 3 0 2 1 1 1 0 3 3 0 4 0 3 4 0 3 2 0 0 6 1 0 0 0 0 0 0 1 2 3 2 0 1 0 2 3 3 1 4 1 1 1 0 4 0 2 0 0
Go 1 1 1 0 3 0 0 0 0 0 3 0 2 0 2 1 0 2 3 0 7 0 0 3 1 0 1 1 0 0 0 3 3 1 3 0 0 4 2 4 3 0 3 1 0 2 1 2 0 2 3 2 1 2
Ba 0 0 3 1 2 0 0 2 1 0 3 1 0 0 1 3 0 0 1 0 3 0 0 2 3 0 1 1 1 0 2 0 2 2 1 0 0 4 1 3 1 1 1 1 2 2 2 2 3 1 6 3 1 2
So 0 0 0 0 1 0 0 2 2 0 0 2 3 0 0 0 0 0 0 0 1 0 0 3 1 0 3 3 0 0 0 2 0 4 1 1 0 1 1 1 1 0 2 2 0 1 1 1 1 1 1 1 2 0
Ga 2 0 1 3 1 2 0 0 3 3 4 7 1 0 2 8 0 1 7 0 2 2 6 2 1 2 3 5 1 0 1 1 0 0 8 2 0 1 1 0 4 2 7 7 6 1 3 2 2 3 6 3 8 0
Si 2 1 1 2 3 2 0 0 2 2 3 0 1 1 3 7 4 0 7 0 2 2 5 6 3 0 0 5 1 0 2 5 2 1 0 1 4 2 2 1 3 1 2 1 4 3 1 5 5 4 6 6 3 3
Bl 1 3 0 1 2 3 1 0 2 1 0 1 0 2 0 2 1 0 2 3 0 2 7 2 0 2 0 0 0 1 0 0 0 2 2 0 2 0 4 0 1 4 2 1 2 1 0 1 1 1 0 6 0 0
Mo 0 1 0 2 1 1 0 0 4 1 0 2 2 1 1 0 0 0 5 0 0 1 1 2 1 2 1 3 0 2 1 1 1 1 3 0 0 0 2 0 5 3 7 2 1 1 0 2 3 7 3 5 1 0
Co 2 2 0 2 1 0 0 3 0 0 0 0 0 0 1 1 0 0 0 0 1 0 0 0 2 0 1 0 3 0 0 3 2 1 0 0 0 0 0 1 1 1 1 1 1 1 1 2 0 0 2 1 1 3
St 2 0 1 2 3 1 0 0 3 2 1 0 0 1 4 1 1 0 3 4 0 2 4 3 0 2 3 0 0 3 0 0 0 3 2 3 5 0 0 0 4 2 4 2 4 3 2 6 2 0 1 1 3 0
Kh 0 0 1 0 1 0 0 1 2 0 0 2 0 0 0 2 0 0 0 0 1 0 0 0 2 0 4 0 1 0 1 2 0 0 2 0 0 0 0 0 0 1 0 0 0 1 0 1 0 0 0 2 0 2
Al 0 1 0 1 0 0 1 0 1 3 1 1 0 2 3 4 2 0 3 0 0 2 3 1 0 3 2 1 0 4 3 0 2 3 2 1 6 0 1 0 0 8 5 1 1 2 3 7 4 3 2 5 0 0
Ro 1 0 2 2 0 0 1 2 2 1 1 0 0 3 6 4 1 0 3 2 1 1 2 3 0 6 1 3 0 2 1 0 1 3 5 3 3 0 5 0 1 0 6 1 3 3 0 5 5 4 1 4 0 1
Ve 0 3 4 3 1 3 0 2 7 2 2 2 3 3 3 4 0 0 3 3 4 1 2 5 1 0 1 3 1 3 3 4 0 8 3 0 5 0 3 2 7 5 0 5 2 0 3 4 4 6 0 1 1 0
An 1 0 1 2 0 2 0 0 3 0 0 4 1 0 4 0 2 1 1 0 7 0 1 1 1 0 4 2 0 0 0 0 2 4 4 0 1 4 1 1 0 0 4 0 1 2 2 4 4 2 1 8 1 3
Yo 0 4 3 0 0 2 2 0 5 1 4 1 1 2 3 6 2 1 3 1 0 1 2 5 0 0 2 0 1 1 0 0 0 5 8 2 4 0 4 0 5 2 0 2 0 4 3 3 1 5 0 0 2 0
Ba 1 1 0 3 2 1 0 1 2 2 0 4 1 3 0 3 2 0 1 0 1 4 2 1 1 1 0 1 0 2 4 1 0 3 3 2 1 0 2 0 1 3 3 1 5 0 1 5 1 1 3 2 1 1
Do 0 2 0 0 0 0 0 0 4 1 3 1 1 1 2 2 0 0 0 1 1 0 1 1 0 0 0 1 1 1 1 0 1 0 3 2 1 2 1 2 2 1 0 2 1 4 0 1 5 1 0 2 6 0
Ko 0 1 2 1 2 2 0 3 3 4 1 0 1 1 0 2 2 0 2 0 2 0 2 7 1 0 4 0 1 3 1 3 1 2 5 2 2 2 2 3 3 3 6 0 7 3 3 0 7 4 4 2 1 1
Fo 0 4 0 3 1 0 0 1 0 0 2 1 0 0 1 2 2 0 4 1 1 0 1 1 2 1 0 2 1 0 1 7 1 2 0 0 2 2 2 0 3 5 3 1 3 3 1 2 0 0 1 1 2 0
Me 1 1 1 0 1 0 0 1 2 0 0 1 2 1 2 2 0 0 1 2 1 5 1 3 0 4 3 0 0 3 3 2 0 2 2 0 1 0 2 0 3 4 3 1 2 1 1 0 3 0 1 1 0 0
Pa 0 0 1 0 1 0 0 0 1 0 1 2 1 0 3 0 0 1 1 0 0 0 1 1 1 0 0 2 0 1 3 0 0 1 3 1 1 0 0 1 0 4 0 0 2 2 1 3 3 0 0 0 2 3
Qu 0 1 2 2 1 2 0 0 1 0 4 4 2 0 2 1 2 0 0 0 1 0 1 1 2 0 5 0 0 0 1 0 0 1 2 1 1 0 2 0 2 1 4 8 3 3 4 2 0 2 0 0 0 2
To 0 0 0 0 0 0 0 0 2 1 0 2 0 0 1 0 1 0 0 0 0 0 1 1 0 0 0 0 0 0 2 1 1 2 0 1 0 1 1 0 0 0 6 4 0 0 4 2 2 0 0 3 0 0
Ba 0 0 0 0 1 0 0 0 0 0 1 0 0 0 0 1 0 0 0 0 1 0 0 0 0 0 0 0 0 0 0 1 2 0 1 0 0 2 0 1 0 0 1 0 1 0 0 1 0 0 0 0 0 0