6761 4 MarkovChains
Markov Chains
1. Introduction
2. Chapman-Kolmogorov Equations
3. Types of States
4. Limiting Probabilities
5. Gambler’s Ruin
6. First Passage Times
7. Branching Processes
8. Time-Reversibility
4. Markov Chains
4.1. Introduction
0: X_{i−1} = R, X_i = R
1: X_{i−1} = S, X_i = R
2: X_{i−1} = R, X_i = S
3: X_{i−1} = S, X_i = S
P_{i,i+1} = p,       i = 1, 2, . . . , N − 1
P_{i,i−1} = 1 − p,   i = 1, 2, . . . , N − 1
P_{0,0} = P_{N,N} = 1
Proof: By definition,

P_{ij}^{(n+m)} = Pr(X_{n+m} = j | X_0 = i)
= Σ_{k=0}^∞ Pr(X_{n+m} = j ∩ X_n = k | X_0 = i)    (total probability)
= Σ_{k=0}^∞ Pr(X_{n+m} = j | X_0 = i ∩ X_n = k) Pr(X_n = k | X_0 = i)
    (since Pr(A ∩ C | B) = Pr(A | B ∩ C) Pr(C | B))
= Σ_{k=0}^∞ Pr(X_{n+m} = j | X_n = k) Pr(X_n = k | X_0 = i)    (Markov property)
= Σ_{k=0}^∞ P_{ik}^{(n)} P_{kj}^{(m)}.    ♦
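As a quick numerical sanity check (not part of the original slides), the Chapman-Kolmogorov equations P^{(n+m)} = P^{(n)} P^{(m)} can be verified by multiplying matrix powers; the 2-state transition matrix below is made up for illustration.

```python
# Numerical check of Chapman-Kolmogorov on a hypothetical 2-state chain.

def mat_mult(A, B):
    """Multiply two square matrices stored as lists of lists."""
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def mat_pow(P, n):
    """n-step transition matrix P^(n) via repeated multiplication."""
    Q = [[float(i == j) for j in range(len(P))] for i in range(len(P))]
    for _ in range(n):
        Q = mat_mult(Q, P)
    return Q

P = [[0.7, 0.3],
     [0.4, 0.6]]

lhs = mat_pow(P, 5)                            # P^(5)
rhs = mat_mult(mat_pow(P, 2), mat_pow(P, 3))   # P^(2) P^(3)
assert all(abs(lhs[i][j] - rhs[i][j]) < 1e-12
           for i in range(2) for j in range(2))
```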
so that P_{00}^{(4)} = 0.5749. ♦
Unconditional Probabilities

Let α_i ≡ Pr(X_0 = i), i = 0, 1, . . . . Then

Pr(X_n = j) = Σ_{i=0}^∞ Pr(X_n = j ∩ X_0 = i)
= Σ_{i=0}^∞ Pr(X_n = j | X_0 = i) Pr(X_0 = i)
= Σ_{i=0}^∞ P_{ij}^{(n)} α_i.
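A short sketch of this computation (numbers made up): push an assumed initial distribution α through the chain n times, which is exactly Σ_i α_i P_{ij}^{(n)} evaluated step by step.

```python
# Unconditional law of X_n from an assumed initial distribution alpha.

def push(dist, P):
    """One step: Pr(X_{m+1} = j) = sum_i Pr(X_m = i) * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

P = [[0.7, 0.3],
     [0.4, 0.6]]
alpha = [0.5, 0.5]        # alpha_i = Pr(X_0 = i), an assumption

dist = alpha
for _ in range(3):        # after the loop, dist[j] = Pr(X_3 = j)
    dist = push(dist, P)

assert abs(sum(dist) - 1.0) < 1e-12   # still a probability distribution
```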
Definition: If P_{ij}^{(n)} > 0 for some n ≥ 0, state j is accessible from i. Notation: i → j.

Definition: If i and j are accessible from each other, then i and j communicate. Notation: i ↔ j.
We have f_0 = f_1 = 1, f_2 = 1/4, f_3 = 1. ♦
Theorem: i is recurrent iff Σ_{n=1}^∞ P_{ii}^{(n)} = ∞. (So i is transient iff Σ_{n=1}^∞ P_{ii}^{(n)} < ∞.)

Proof: Let A_n = 1 if X_n = i, and A_n = 0 otherwise. Then

Σ_{n=1}^∞ P_{ii}^{(n)} = Σ_{n=1}^∞ Pr(X_n = i | X_0 = i)
= Σ_{n=1}^∞ E[A_n | X_0 = i]    (indicator trick)
= E[ Σ_{n=1}^∞ A_n | X_0 = i ]
= E[N | X_0 = i]    (N = number of returns to i),

and E[N | X_0 = i] = ∞ ⇔ i is recurrent (by the previous theorem). ♦
Example: Consider

      [ 1/4   0    0   3/4 ]
P  =  [  1    0    0    0  ]
      [  0    1    0    0  ]
      [  0    0    1    0  ] .

Loop: 0 → 3 → 2 → 1 → 0. Thus, all states communicate; so they're all recurrent. ♦
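A numerical check of the loop argument (not in the original slides): every diagonal entry of P^{(4)} should be positive, since each state can return to itself in exactly 4 steps along the loop.

```python
# Check that each state of the example chain can return to itself in 4 steps.

def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

P = [[0.25, 0.0, 0.0, 0.75],
     [1.0,  0.0, 0.0, 0.0],
     [0.0,  1.0, 0.0, 0.0],
     [0.0,  0.0, 1.0, 0.0]]

P4 = P
for _ in range(3):           # P4 = P^(4)
    P4 = mat_mult(P4, P)

assert all(P4[i][i] > 0 for i in range(4))   # every state returns in 4 steps
```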
Example: Consider

      [ 1/4   0   3/4   0    0  ]
      [  0   1/2   0   1/2   0  ]
P  =  [ 1/2   0   1/2   0    0  ]
      [  0   1/2   0   1/2   0  ]
      [ 1/5  1/5   0    0   3/5 ] .
P_{i,i+1} = p
P_{i,i−1} = q = 1 − p
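For this walk, the return probabilities are P_{00}^{(2n)} = C(2n, n) p^n q^n (a standard fact), so the series criterion above decides recurrence. A quick numerical check of transience for p ≠ 1/2, using the known closed form Σ_{n≥1} C(2n, n)(pq)^n = 1/|p − q| − 1 (an outside identity, not derived on these slides):

```python
from math import comb

# Asymmetric simple random walk: P_00^(2n) = C(2n, n) p^n q^n.
# For p != 1/2 the series converges, so state 0 is transient.
p = 0.75
q = 1 - p
partial = sum(comb(2 * n, n) * (p * q) ** n for n in range(1, 200))

# Compare with the closed form 1/|p - q| - 1 (here, = 1).
assert abs(partial - (1 / abs(p - q) - 1)) < 1e-9
```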
Definition: Suppose that P_{ii}^{(n)} = 0 whenever n is not divisible by d, and suppose that d is the largest integer with this property. Then state i has period d. Think of d as the greatest common divisor of all n values for which P_{ii}^{(n)} > 0.
Example:

      [  0    1    0    0  ]
P  =  [  1    0    0    0  ]
      [ 1/4  1/4  1/4  1/4 ]
      [  0    0   1/2  1/2 ] .

Here, states 0 and 1 have period 2, while states 2 and 3 are aperiodic. ♦
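The gcd characterization suggests a direct computation: take the gcd of all n (up to a truncation) with (P^n)_{ii} > 0. A sketch on the example matrix above (the truncation at n = 12 is an assumption that suffices for this small chain):

```python
from math import gcd

def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def period(P, i, max_n=12):
    """gcd of all n <= max_n with (P^n)_ii > 0 (finite truncation)."""
    d = 0
    Q = P                       # Q = P^(n) as n runs from 1 to max_n
    for n in range(1, max_n + 1):
        if Q[i][i] > 0:
            d = gcd(d, n)
        Q = mat_mult(Q, P)
    return d

P = [[0.0,  1.0,  0.0,  0.0],
     [1.0,  0.0,  0.0,  0.0],
     [0.25, 0.25, 0.25, 0.25],
     [0.0,  0.0,  0.5,  0.5]]

assert period(P, 0) == 2 and period(P, 1) == 2   # states 0, 1: period 2
assert period(P, 2) == 1 and period(P, 3) == 1   # states 2, 3: aperiodic
```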
and π_0 + π_1 + π_2 = 1. Get π = {21/62, 23/62, 18/62}. ♦
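In general, π solves π = πP together with Σ_j π_j = 1; one simple numerical approach is power iteration. The 3-state matrix below is made up for illustration (it is not the example's matrix):

```python
# Stationary distribution by power iteration on a hypothetical 3-state chain.

P = [[0.5, 0.4, 0.1],
     [0.3, 0.4, 0.3],
     [0.2, 0.3, 0.5]]

pi = [1 / 3, 1 / 3, 1 / 3]          # any starting distribution works here
for _ in range(500):
    pi = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]

# pi now satisfies the balance equations pi = pi P to machine precision.
bal = [sum(pi[i] * P[i][j] for i in range(3)) for j in range(3)]
assert all(abs(pi[j] - bal[j]) < 1e-12 for j in range(3))
assert abs(sum(pi) - 1.0) < 1e-9
```

For an irreducible aperiodic chain like this one, the iterates converge geometrically, so 500 steps is far more than enough.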
Further, P_{0,0} = 1 = P_{N,N}.
Since p + q = 1, we have
P_i = p P_{i+1} + q P_{i−1}
iff p P_i + q P_i = p P_{i+1} + q P_{i−1}
iff P_{i+1} − P_i = (q/p)(P_i − P_{i−1}), i = 1, 2, . . . , N − 1.
Since P_0 = 0, we have

P_2 − P_1 = (q/p) P_1
P_3 − P_2 = (q/p)(P_2 − P_1) = (q/p)² P_1
...
P_i − P_{i−1} = (q/p)(P_{i−1} − P_{i−2}) = (q/p)^{i−1} P_1.

Summing up the LHS terms and the RHS terms,

Σ_{j=2}^{i} (P_j − P_{j−1}) = P_i − P_1 = P_1 Σ_{j=1}^{i−1} (q/p)^j.
Thus,

P_1 = [1 − (q/p)] / [1 − (q/p)^N]   if p ≠ 1/2,
P_1 = 1/N                           if p = 1/2,

so that

P_i = [1 − (q/p)^i] / [1 − (q/p)^N]   if p ≠ 1/2,
P_i = i/N                             if p = 1/2. ♦

By the way, as N → ∞,

P_i → 1 − (q/p)^i   if p > 1/2,
P_i → 0             if p ≤ 1/2. ♦
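The closed form for P_i can be sanity-checked against the recursion P_i = p P_{i+1} + q P_{i−1} and the boundary conditions P_0 = 0, P_N = 1:

```python
# Gambler's ruin: closed-form win probability, checked against the recursion.

def win_prob(i, N, p):
    """P_i = probability of reaching fortune N before 0, starting from i."""
    if p == 0.5:
        return i / N
    r = (1 - p) / p                      # r = q/p
    return (1 - r ** i) / (1 - r ** N)

p, N = 0.6, 10
# Boundary conditions: P_0 = 0 and P_N = 1.
assert win_prob(0, N, p) == 0.0 and abs(win_prob(N, N, p) - 1.0) < 1e-12
# Interior recursion: P_i = p P_{i+1} + q P_{i-1}.
for i in range(1, N):
    lhs = win_prob(i, N, p)
    rhs = p * win_prob(i + 1, N, p) + (1 - p) * win_prob(i - 1, N, p)
    assert abs(lhs - rhs) < 1e-12
```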
P_{ij}^{(n)} ≡ Pr(X_n = j | X_0 = i)
f_{ij}^{(n)} ≡ Pr(X_n = j, X_k ≠ j, k = 1, . . . , n − 1 | X_0 = i).

Remarks:
(1) By definition, f_{ij}^{(1)} = P_{ij}^{(1)} = P_{ij}.
(2) P_{ij}^{(n)} = prob. of going from i to j in n steps;
f_{ij}^{(k)} = prob. of going from i to j for the first time in k steps;
P_{jj}^{(n−k)} = prob. of going from j to j in the remaining n − k steps.
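These three quantities combine in the standard decomposition P_{ij}^{(n)} = Σ_{k=1}^n f_{ij}^{(k)} P_{jj}^{(n−k)}, which can be inverted to compute first-passage probabilities numerically. A sketch on a made-up 2-state chain:

```python
# First-passage probabilities via
#   f_ij^(n) = P_ij^(n) - sum_{k=1}^{n-1} f_ij^(k) * P_jj^(n-k).

def mat_mult(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def first_passage(P, i, j, N):
    """Return [f_ij^(1), ..., f_ij^(N)]."""
    powers = [None, P]                       # powers[n] = P^(n), n >= 1
    for _ in range(2, N + 1):
        powers.append(mat_mult(powers[-1], P))
    f = []
    for n in range(1, N + 1):
        val = powers[n][i][j] - sum(f[k - 1] * powers[n - k][j][j]
                                    for k in range(1, n))
        f.append(val)
    return f

P = [[0.7, 0.3],
     [0.4, 0.6]]
f = first_passage(P, 0, 1, 50)
assert abs(f[0] - 0.3) < 1e-12     # f_01^(1) = P_01
assert abs(sum(f) - 1.0) < 1e-6    # recurrent chain: first passage is certain
```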
(n) (n)
Same procedure as before but divide each f0N , f0N 0
by the probs. of being trapped. So probs. of first
passage times to N , N 0 in n steps are
(n) (n)
f0N f0N 0
P∞ (k)
and P∞ (k)
.
f
k=1 0N f
k=1 0N 0
66
4. Markov Chains
67
4. Markov Chains
Remarks:
(1) 0 is recurrent since P_{00} = 1.
(2) If P_0 > 0, then all other states are transient.
(Proof: If P_0 > 0, then P_{i0} = P_0^i > 0. If i were recurrent, we'd eventually get absorbed in state 0. Contradiction.)
Denote µ ≡ Σ_{j=0}^∞ j P_j, the mean number of offspring of a particular individual, and σ² ≡ Σ_{j=0}^∞ (j − µ)² P_j, the variance. Then

E[X_n] = µ^n.
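The identity E[X_n] = µ^n can be checked exactly for small n by composing probability generating functions (the pgf of X_n is the n-fold composition of the offspring pgf, a standard branching-process fact not derived on these slides); the offspring pmf below is an arbitrary example.

```python
# Exact check of E[X_n] = mu^n via pgf (polynomial) composition.

def compose(outer, inner):
    """Coefficients of the polynomial outer(inner(s))."""
    result = [0.0]
    power = [1.0]                        # inner(s)**j, starting with j = 0
    for c in outer:
        for k, v in enumerate(power):    # result += c * power
            if k >= len(result):
                result.append(0.0)
            result[k] += c * v
        new = [0.0] * (len(power) + len(inner) - 1)
        for a, x in enumerate(power):    # power *= inner
            for b, y in enumerate(inner):
                new[a + b] += x * y
        power = new
    return result

pmf = [0.2, 0.3, 0.5]                    # assumed offspring pmf P_0, P_1, P_2
mu = sum(j * p for j, p in enumerate(pmf))

pgf = pmf                                # pgf of X_1 (starting from X_0 = 1)
for _ in range(2):                       # compose twice more: pgf of X_3
    pgf = compose(pgf, pmf)

mean = sum(j * p for j, p in enumerate(pgf))
assert abs(mean - mu ** 3) < 1e-12       # E[X_3] = mu^3
```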
Similarly,

Var(X_n) = σ² µ^{n−1} (µ^n − 1)/(µ − 1)   if µ ≠ 1,
Var(X_n) = n σ²                           if µ = 1.
Fact: If µ ≤ 1 (and P_1 < 1), then π_0 = 1.
where π_0^j is the probability that the families started by the j members of the first generation all die out (indep'ly).
Summary:

π_0 = Σ_{j=0}^∞ π_0^j P_j    (∗)

For µ > 1, π_0 is the smallest positive number satisfying (∗).
Example: Suppose P_0 = 1/4, P_1 = 1/4, P_2 = 1/2. Then

µ = Σ_{j=0}^∞ j P_j = 0 · (1/4) + 1 · (1/4) + 2 · (1/2) = 5/4 > 1,

so by (∗), π_0 = (1/4) + (1/4) π_0 + (1/2) π_0²
⇔ 2π_0² − 3π_0 + 1 = 0
⇔ (2π_0 − 1)(π_0 − 1) = 0,

and the smallest positive root is π_0 = 1/2. ♦
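Numerically, the smallest positive root of (∗) can also be found by fixed-point iteration started at 0, which converges upward to the smallest root:

```python
# Extinction probability for the example P_0 = 1/4, P_1 = 1/4, P_2 = 1/2:
# iterate pi0 <- sum_j P_j * pi0^j starting from 0.

P = [0.25, 0.25, 0.5]              # offspring pmf from the example
pi0 = 0.0
for _ in range(200):
    pi0 = sum(p * pi0 ** j for j, p in enumerate(P))

# Smallest root of 2*pi0^2 - 3*pi0 + 1 = 0 is 1/2.
assert abs(pi0 - 0.5) < 1e-9
```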