B 2 Stochastic Processes 2020
Uploaded by ARISINA BANERJEE

BII Introduction to Stochastic Processes

Problem Set 1

1. For a Markov chain show that P (X0 = x0 , X1 = x1 , X2 = x2 ) = P (X0 = x0 )P (X1 = x1 |X0 = x0 )P (X2 =
x2 |X1 = x1 ). (This is important for thinking about the probability of a given path.)
2. Using the notation p_ij(n1 + n2) = P(X_{n1+n2} = j | X_0 = i), show that p_ij(n1 + n2) = Σ_{k∈S} p_ik(n1) p_kj(n2).
(Hint: Use the previous problem to group the terms with the same X_{n1}.)
3. The above equation is called the Chapman–Kolmogorov equation. Show that it is the same as the matrix
relation P^(n1+n2) = P^(n1) P^(n2).
4. For a Markov chain Xn , n ≥ 0, show that P (X0 = x0 |X1 = x1 , . . . , Xn = xn ) = P (X0 = x0 |X1 = x1 ).

5. Ehrenfest chain: We have two boxes 1 and 2 and d balls labelled 1, 2, . . . , d. Initially some of these balls
are in box 1 and some are in box 2. An integer is selected uniformly at random (with replacement) from 1, 2, . . . , d and the
ball labelled by that integer is transferred from its box to the other one. This procedure is repeated.
Let X_n denote the number of balls in box 1 after the nth trial. Show that (a) X_n, n ≥ 0 is a MC on
S = {0, 1, 2, . . . , d}, and (b) the transition function is given by

P(x, y) =  x/d        if y = x − 1,
           1 − x/d    if y = x + 1,
           0          otherwise.

6. Show that in the Ehrenfest chain, if X_0 has the binomial distribution

P(X_0 = x) = C(d, x)/2^d,  x = 0, 1, 2, . . . , d,

where C(d, x) is the binomial coefficient, then X_1 has the same distribution. Such a distribution is called a stationary distribution for the given transition matrix.
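Problem 6 can be checked numerically. The following is a minimal sketch (the choice d = 10 and the variable names are illustrative, not from the problem set): build the Ehrenfest transition matrix, start X_0 binomial, and verify that the distribution of X_1 is unchanged.

```python
from math import comb

d = 10  # number of balls (illustrative choice)

# Ehrenfest transition matrix P(x, y) on S = {0, 1, ..., d}
P = [[0.0] * (d + 1) for _ in range(d + 1)]
for x in range(d + 1):
    if x >= 1:
        P[x][x - 1] = x / d        # a ball in box 1 is chosen and removed
    if x <= d - 1:
        P[x][x + 1] = 1 - x / d    # a ball in box 2 is chosen and added

# Binomial(d, 1/2) distribution for X0
pi = [comb(d, x) / 2**d for x in range(d + 1)]

# Distribution of X1: the row vector pi multiplied by P on the right
pi1 = [sum(pi[x] * P[x][y] for x in range(d + 1)) for y in range(d + 1)]

# X1 has the same distribution as X0, i.e. pi is stationary
assert all(abs(pi1[y] - pi[y]) < 1e-12 for y in range(d + 1))
```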
7. Gambler’s ruin: A gambler starts out with a certain initial capital and makes a series of 1 unit bets, with
respective probabilities p and q = 1 − p of winning and losing each bet. If his capital reaches 0 it
remains 0 from then on, and similarly if it reaches d. Show that this is a Markov chain on {0, 1, . . . , d} in which 0 and d are
absorbing states, with transition probabilities for x ≠ 0, d given by

P(x, y) =  q    if y = x − 1,
           p    if y = x + 1,
           0    otherwise.

Write down the transition matrix.


8. Random walk with reflecting barrier: When the random walk reaches 0, it remains there with probability
p or moves to 1 with probability q = 1 − p. Similarly when the random walk reaches d, it remains there
with probability p or moves to d − 1 with probability q = 1 − p. Write down the transition matrix.
9. The following involves thinking about Markov chain paths. In a finite state space MC suppose x_i, x_j are
transient. Then P_ij^(n) ≤ b α^n for some constant b and some 0 < α < 1.
Proof: Let k be the minimum number of steps within which every transient state can reach some ergodic set,
and let β = max over transient x_i of P(not reaching any ergodic set in k steps starting from x_i), so that
β < 1; set α = β^{1/k}. If m = nk, then P_ij^(m) ≤ β^n = α^{nk} = α^m (to be at a transient state at time m
the chain must avoid the ergodic sets in each of the n blocks of k steps). If m = nk + r with 0 < r ≤ k − 1,
then P_ij^(m) ≤ β^n = α^{m−r} = α^m (1/α)^r ≤ α^m (1/α)^{k−1} = b α^m, taking b = (1/α)^{k−1}.
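The geometric decay of transient transition probabilities can be observed numerically. A minimal sketch on a small hand-picked chain (the matrix below is an illustrative example, not from the problem set): states 0 and 1 are transient and leak into the absorbing state 2, and the ratio P^(n+1)(0,1)/P^(n)(0,1) settles near an α < 1.

```python
import numpy as np

# Transient states 0, 1 leak into the absorbing (ergodic) state 2.
P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.0, 0.0, 1.0]])

# Track P^n(0, 1): probability of being at transient state 1 after n steps.
Pn = np.eye(3)
vals = []
for n in range(1, 31):
    Pn = Pn @ P
    vals.append(Pn[0, 1])

# Successive ratios settle near the largest eigenvalue of the transient block,
# confirming a bound of the form P^n(i, j) <= b * alpha^n with alpha < 1.
ratios = [vals[n + 1] / vals[n] for n in range(20, 29)]
alpha = max(abs(np.linalg.eigvals(P[:2, :2])))
assert alpha < 1
assert all(abs(r - alpha) < 1e-3 for r in ratios)
```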

WLLN for chains as proved by Markov: The following is taken from Kemeny and Snell, Finite Markov
Chains, Page 74. This proof proceeds directly without assuming SLLN under first moment.
Suppose {Xi } is an irreducible aperiodic Markov chain with state space S = {α1 , α2 , . . . , αr } consisting of
real numbers. In that case the result is that
E( (1/n) Σ_{i=1}^n X_i − Σ_s α_s π_s )^2 → 0.
Since X_i = Σ_s X_i 1{X_i = α_s}, writing Y_i = X_i 1{X_i = α_s} it is enough to show, for each s (since S is finite),

E( (1/n) Σ_{i=1}^n Y_i − α_s π_s )^2 = (1/n^2) E{ Σ_{i=1}^n (Y_i − α_s π_s) }^2 → 0.

Now, expanding the square, the diagonal sum Σ_{i=1}^n E(Y_i − α_s π_s)^2 is bounded by const·n, and this divided
by n^2 goes to 0. It is in the treatment of the cross terms that Markov’s ideas are used fully.
Consider (here and in what follows, multiplying by P(X_0 = α_t) as necessary)

2 P(X_0 = α_t) Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} { E Y_i Y_j − π_s α_s E Y_i − π_s α_s E Y_j + π_s^2 α_s^2 }
= 2 P(X_0 = α_t) Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} { α_s^2 P^i(α_t, α_s) P^{j−i}(α_s, α_s) − π_s α_s^2 P^i(α_t, α_s) − π_s α_s^2 P^j(α_t, α_s) + π_s^2 α_s^2 }.

At this point, the inequalities m_{k−1} ≤ m_k ≤ M_k ≤ M_{k−1} and M_k − m_k ≤ const·(1 − 2ε)^k (where m_k and
M_k denote the minimum and maximum over t of P^k(α_t, α_s)) imply that P^k(α_t, α_s) for any t, as well as its
limit π_s, lies between m_k and M_k, and hence P^k(α_t, α_s) = π_s + O(r^k) for some 0 < r < 1. Putting these
in, consider the double sum only (omitting α_s^2):

Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} { (π_s + O(r^i))(π_s + O(r^{j−i})) − π_s(π_s + O(r^i)) − π_s(π_s + O(r^j)) + π_s^2 }
= Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} { π_s O(r^{j−i}) + O(r^i) O(r^{j−i}) + π_s O(r^j) }.

Since 0 < r < 1, in each summand one index contributes at most n while the geometric factor sums to at most
1/(1 − r), making the double sum bounded by const·n/(1 − r). This divided by n^2 goes to zero, completing the
proof of the WLLN. Since the chain is a dependent sequence, the main work lies in managing the covariance
terms, and no shortcut is available. These results are later generalized to ‘dependence under mixing conditions’.
A similar but more precise computation is used in the proof of CLT for finite Markov chains (this proof is
probably due to Doeblin).
Remarks: For P^n(α_t, α_s) = π_s + O(r^n), recall that, first raising P to a sufficiently large power N, we proved
M_{kN} − m_{kN} ≤ (1 − 2ε_0)^k. Take r = (1 − 2ε_0)^{1/N}, so that for n = kN, M_n − m_n ≤ r^n. For n = kN + s, 0 ≤ s < N,
we proved M_n − m_n ≤ (1 − 2ε_0)^k = r^{kN+s} · (1/r)^s ≤ r^n · (1/r)^N = b r^n. Hence in general |P^n(α_t, α_s) − π_s| ≤ b·r^n.
You can search for Eugene Seneta, “Markov and the creation of Markov chains”, for Markov’s contraction
inequality and its application to the ‘Google matrix’ used in Google search.
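The geometric bound |P^n(α_t, α_s) − π_s| ≤ b·r^n can be illustrated numerically; a small sketch with an arbitrary 2×2 chain (the entries a = 0.1, b = 0.4 are an illustrative choice): the row-wise error shrinks by exactly the second eigenvalue |1 − a − b| at each step.

```python
import numpy as np

a, b = 0.1, 0.4
P = np.array([[1 - a, a],
              [b, 1 - b]])

# Stationary distribution: left eigenvector of P for eigenvalue 1, normalized.
w, v = np.linalg.eig(P.T)
pi = np.real(v[:, np.argmax(np.real(w))])
pi = pi / pi.sum()   # for this chain pi = (b, a)/(a+b) = (0.8, 0.2)

# max_t |P^n(t, s) - pi_s| should decay like r^n with r = |1 - a - b| = 0.5.
Pn = np.eye(2)
errs = []
for n in range(1, 21):
    Pn = Pn @ P
    errs.append(np.max(np.abs(Pn - pi)))   # each row of P^n compared with pi

# Each step shrinks the error by the second eigenvalue 0.5
assert all(abs(errs[n + 1] / errs[n] - 0.5) < 1e-6 for n in range(15))
```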

BII Introduction to Stochastic Processes
Problem Set 2

1. If a is an absorbing state, show that P^n(x, a) = P_x(T_a ≤ n).


2. Show that (a) P_x(T_y = n + 1) = Σ_{z≠y} P(x, z) P_z(T_y = n); (b) P_x(T_y ≤ n + 1) = P(x, y) + Σ_{z≠y} P(x, z)
P_z(T_y ≤ n); (c) defining ρ_xy = P_x(T_y < ∞), show that ρ_xy = P(x, y) + Σ_{z≠y} P(x, z) ρ_zy.

3. Consider an MC on S = {0, 1, 2, . . .} with p(x, x + 1) = p, p(x, 0) = 1 − p. (a) Show that the chain is
irreducible. (b) Find P0 (T0 = n), n ≥ 1. (c) Show that the chain is recurrent.
4. C is a closed set if ρ_xy = 0 for all x ∈ C, y ∉ C, or equivalently if P^n(x, y) = 0 for all x ∈ C, y ∉ C, n ≥ 1. Show
that P(x, y) = 0 for all x ∈ C, y ∉ C implies P^n(x, y) = 0 for all x ∈ C, y ∉ C, for all n ≥ 1.

5. An MC on S = {0, 1, . . . , d} satisfies

Σ_{y=0}^{d} y P(x, y) = x,  x = 0, 1, . . . , d.

Such a condition can be written as E(X_{n+1} | X_n = x) = x, and a chain satisfying such a condition is said to
be a martingale. Show that (a) 0 and d must be absorbing states; (b) if the Markov chain has no absorbing
states other than 0 and d, then each of the states 1, 2, . . . , d − 1 leads to 0 and hence is a transient
state.
6. Consider an MC on S = {0, 1, 2, . . .} with p(x, x + 1) = p_x, p(x, 0) = 1 − p_x, x = 0, 1, 2, . . .. (a) Show that
P_0(T_0 = n) = u_{n−1} − u_n, where u_0 = 1 and u_n = p_0 p_1 · · · p_{n−1}, n ≥ 1 (u_n is the probability of not
having returned to 0 in the first n steps). (b) Conclude that 0 is recurrent iff lim u_n = 0. (c) Show that if
Σ_{1}^∞ (1 − p_i) < ∞ then every state is transient.
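For the chain with p(x, x + 1) = p_x and p(x, 0) = 1 − p_x, the only path realizing T_0 = n from 0 is 0 → 1 → · · · → n − 1 → 0, so P_0(T_0 = n) can be computed directly and compared with u_{n−1} − u_n for u_n = p_0 · · · p_{n−1}. A minimal sketch (the particular sequence p_x is an illustrative choice):

```python
# Illustrative choice of the p_x sequence (any values in (0, 1) work):
p = [1.0 / (x + 2) + 0.3 for x in range(50)]

def u(n):
    """u_n = p_0 p_1 ... p_{n-1}: probability of no return to 0 in n steps."""
    prod = 1.0
    for x in range(n):
        prod *= p[x]
    return prod

def first_return(n):
    """P_0(T_0 = n): climb n-1 steps up, then fall back to 0."""
    prob = 1.0
    for x in range(n - 1):
        prob *= p[x]
    return prob * (1.0 - p[n - 1])

# The identity P_0(T_0 = n) = u_{n-1} - u_n, term by term
for n in range(1, 40):
    assert abs(first_return(n) - (u(n - 1) - u(n))) < 1e-12

# Telescoping: P_0(T_0 <= N) = 1 - u_N, so 0 is recurrent iff u_n -> 0
assert abs(sum(first_return(n) for n in range(1, 40)) - (1 - u(39))) < 1e-12
```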
7. Consider an MC on S = {0, 1, . . . , 6} with transition matrix

    1/2   0   1/8  1/4  1/8   0    0
     0    0    1    0    0    0    0
     0    0    0    1    0    0    0
     0    1    0    0    0    0    0
     0    0    0    0   1/2   0   1/2
     0    0    0    0   1/2  1/2   0
     0    0    0    0    0   1/2  1/2

Determine (a) the transient and recurrent states and (b) find ρ_0y, y = 0, . . . , 6.
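Part (a) can be checked mechanically: in a finite chain a state is recurrent iff its communicating class is closed, i.e. iff every state reachable from it leads back to it. A sketch using reachability via matrix powers (the variable names are ad hoc):

```python
import numpy as np

P = np.array([
    [1/2, 0,   1/8, 1/4, 1/8, 0,   0  ],
    [0,   0,   1,   0,   0,   0,   0  ],
    [0,   0,   0,   1,   0,   0,   0  ],
    [0,   1,   0,   0,   0,   0,   0  ],
    [0,   0,   0,   0,   1/2, 0,   1/2],
    [0,   0,   0,   0,   1/2, 1/2, 0  ],
    [0,   0,   0,   0,   0,   1/2, 1/2],
])

n = len(P)
# reach[i, j]: state j is accessible from i in some number of steps >= 0
reach = np.linalg.matrix_power(np.eye(n) + P, n) > 0

recurrent = set()
for i in range(n):
    # i is recurrent iff every state reachable from i leads back to i
    if all(reach[j, i] for j in range(n) if reach[i, j]):
        recurrent.add(i)

transient = set(range(n)) - recurrent
assert transient == {0}
assert recurrent == {1, 2, 3, 4, 5, 6}   # classes {1,2,3} and {4,5,6}
```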
8. Consider an MC on S = {0, 1, . . . , 5} with transition matrix

    1/2  1/2   0    0    0    0
    1/3  2/3   0    0    0    0
     0    0   1/8   0   7/8   0
    1/4  1/4   0    0   1/4  1/4
     0    0   3/4   0   1/4   0
     0   1/5   0   1/5  1/5  2/5

Determine (a) the transient and recurrent states and (b) find ρ{0,1} (x), x = 0, . . . , 5.

9. In the previous problem the closed sets are C1 = {0, 1}, C2 = {2, 4}. Are the other states transient?
Recall that ρ_C(x) = P_x(T_C < ∞). What is ρ_C1(4)? Find ρ_C1(3), ρ_C2(3), ρ_C1(5), ρ_C2(5). Verify that
ρ_C1(3) + ρ_C2(3) = 1 and ρ_C1(5) + ρ_C2(5) = 1. Do you think there should be any relation between ρ_C1(3) and
ρ_C1(5)?

Gambler’s ruin on finite and infinite state spaces:
Suppose S = {0, 1, 2, . . . , N} and the transition probabilities are as follows: P(0, 0) = r_0, P(0, 1) = p_0 =
1 − r_0; P(x, x − 1) = q_x, P(x, x) = r_x, P(x, x + 1) = p_x with q_x + r_x + p_x = 1 and q_x, p_x > 0 for 1 ≤ x ≤ N − 1; P(N, N − 1) =
q_N, P(N, N) = r_N = 1 − q_N.
We want to find the probabilities P_x(T_0 < T_N), 1 ≤ x ≤ N − 1. Writing h(x) = P_x(T_0 <
T_N), we get the recursions below (on the LHS one can multiply by q_x + r_x + p_x = 1 and cancel r_x h(x) from both sides):

h(1) = q_1 + r_1 h(1) + p_1 h(2),
or, h(1) − h(2) = (q_1/p_1)(1 − h(1)),
h(2) = q_2 h(1) + r_2 h(2) + p_2 h(3),
or, h(2) − h(3) = (q_2/p_2)(h(1) − h(2)),
··· = ··· ,
h(N − 1) = q_{N−1} h(N − 2) + r_{N−1} h(N − 1),
or, h(N − 1) = (q_{N−1}/p_{N−1})(h(N − 2) − h(N − 1)).

Successively using the even lines we get (the first line below is extra)

1 − h(1) = 1 − h(1),
h(1) − h(2) = (q_1/p_1)(1 − h(1)),
h(2) − h(3) = (q_1 q_2/(p_1 p_2))(1 − h(1)),
··· = ··· ,
h(N − 1) = (q_1 q_2 · · · q_{N−1}/(p_1 p_2 · · · p_{N−1}))(1 − h(1)).

Adding all the above equations we get 1 = (1 + q_1/p_1 + · · · + q_1 q_2 · · · q_{N−1}/(p_1 p_2 · · · p_{N−1}))(1 − h(1)), or

1 − h(1) = 1 / (1 + q_1/p_1 + · · · + q_1 q_2 · · · q_{N−1}/(p_1 p_2 · · · p_{N−1})).

Adding the same equations successively from the first line we get

1 − h(x) = (1 + q_1/p_1 + · · · + q_1 q_2 · · · q_{x−1}/(p_1 p_2 · · · p_{x−1}))(1 − h(1)),  2 ≤ x ≤ N − 1,

giving

h(x) = (q_1 q_2 · · · q_x/(p_1 p_2 · · · p_x) + · · · + q_1 q_2 · · · q_{N−1}/(p_1 p_2 · · · p_{N−1})) / (1 + q_1/p_1 + · · · + q_1 q_2 · · · q_{N−1}/(p_1 p_2 · · · p_{N−1})),  1 ≤ x ≤ N − 1.
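The closed form for h(x) can be sanity-checked against a direct solve of the linear system h(x) = q_x h(x−1) + r_x h(x) + p_x h(x+1) with boundary values h(0) = 1, h(N) = 0. A sketch with arbitrary illustrative interior probabilities (the particular q_x, p_x values are not from the notes):

```python
import numpy as np

N = 10
# Illustrative interior probabilities q_x, r_x, p_x for x = 1, ..., N-1
q = [0.0] + [0.3 + 0.02 * x for x in range(1, N)]
p = [0.0] + [0.5 - 0.01 * x for x in range(1, N)]
r = [0.0] + [1 - q[x] - p[x] for x in range(1, N)]

# Closed form: h(x) = (sum_{y=x}^{N-1} g_y) / (sum_{y=0}^{N-1} g_y),
# where g_y = (q_1 ... q_y)/(p_1 ... p_y) and g_0 = 1 (the leading "1").
g = [1.0]
for y in range(1, N):
    g.append(g[-1] * q[y] / p[y])
h_closed = [sum(g[x:]) / sum(g) for x in range(1, N)]

# Direct solve of h(x) = q_x h(x-1) + r_x h(x) + p_x h(x+1), h(0)=1, h(N)=0
A = np.zeros((N - 1, N - 1))
b = np.zeros(N - 1)
for x in range(1, N):
    A[x - 1, x - 1] = 1 - r[x]
    if x > 1:
        A[x - 1, x - 2] = -q[x]
    if x < N - 1:
        A[x - 1, x] = -p[x]
    if x == 1:
        b[0] = q[1]            # the q_1 * h(0) term with h(0) = 1
h_solved = np.linalg.solve(A, b)

assert np.allclose(h_closed, h_solved)
```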

When the state space is infinite, one can let N ↑ ∞ in the formula for 1 − h(1) and examine the cases where

1 + q_1/p_1 + · · · + q_1 q_2 · · · q_{n−1}/(p_1 p_2 · · · p_{n−1})

converges or diverges as n ↑ ∞, leading to h(1) < 1 or h(1) = 1, respectively. These are used in the tran-
sience/recurrence of birth and death chains, with the useful reminder that there h(1) = P_1(T_0 < T_∞) = P_1(T_0 < ∞).
Later, to use martingale techniques in this problem, 0 and N can be made absorbing and then
the new techniques give the same formulas. The fact is, the paths contributing to the required
events depend only on the states 1, 2, . . . , N − 1, thus changing the end probabilities does not alter
the required probabilities.

BII Introduction to Stochastic Processes
Problem Set 3

1. (a) Let f_ij^n = P_i(T_j = n), and set F_ii(s) = Σ_{n=0}^∞ f_ii^n s^n, P_ii(s) = Σ_{n=0}^∞ P_ii^n s^n, with the notational
convention P_ii^0 = 1, f_ii^0 = 0. Show that F_ii(s) P_ii(s) = P_ii(s) − 1 for |s| < 1, and hence P_ii(s) = 1/(1 − F_ii(s)).
(b) Consider a random walk on the integers with P_{i,i+1} = p, P_{i,i−1} = q = 1 − p, 0 < p < 1. Show that
P_00^(2m) = C(2m, m) p^m q^m and P_00^(2m+1) = 0.
(c) Find P_00(x) by using C(2n, n) = (−1)^n C(−1/2, n) 2^(2n), where C(a, n) = a(a − 1) · · · (a − n + 1)/n!. (Answer:
(1 − 4pq x^2)^(−1/2).)
(d) Find F_00(x) and find conditions on p for recurrence or transience by letting x → 1. Which result
about power series are you using here? Is the chain irreducible?
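The answer in 1(c) can be verified numerically by comparing partial sums of P_00(x) = Σ_m C(2m, m)(pq)^m x^(2m) with the closed form (1 − 4pq x^2)^(−1/2). A small sketch (the values of p and x are illustrative):

```python
from math import comb

p, q, x = 0.3, 0.7, 0.9
pq = p * q

# Partial sum of P00(x) = sum_m C(2m, m) (pq)^m x^(2m); the terms decay
# geometrically like (4*pq*x^2)^m, so 200 terms are far more than enough.
partial = sum(comb(2 * m, m) * pq**m * x**(2 * m) for m in range(200))

closed = (1 - 4 * pq * x * x) ** -0.5
assert abs(partial - closed) < 1e-10
```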
2. Two indistinguishable fair coins are tossed independently, simultaneously and repeatedly. Let En be the
event that at the nth toss the cumulative number of heads on both tallies are equal. Can you relate the
recurrence times of En to the recurrence times of a symmetric random walk on Z (or a subset of Z)? No
further calculations are needed.
 
3. Show that if

P = [ 1−a   a  ]
    [  b   1−b ],   0 < a, b < 1,

then

P^n = 1/(a+b) [ b  a ]  +  (1 − a − b)^n/(a+b) [  a  −a ]
              [ b  a ]                         [ −b   b ].

What does this say about the convergence of P^n? Find the left eigenvector π of P corresponding to
eigenvalue 1, satisfying π_0 + π_1 = 1, and show geometrically that for any initial distribution q, q − π is
proportional to the other left eigenvector of P. Use this to prove that qP^n → π (this provides another
way of doing it).
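The spectral formula in problem 3 can be checked numerically for arbitrary a, b; a small sketch (the values a = 0.27, b = 0.55 are an illustrative choice):

```python
import numpy as np

a, b = 0.27, 0.55
P = np.array([[1 - a, a],
              [b, 1 - b]])

A = np.array([[b, a], [b, a]]) / (a + b)      # rank-one limit matrix
B = np.array([[a, -a], [-b, b]]) / (a + b)    # second-eigenvalue component

# P^n = A + (1 - a - b)^n B for every n >= 0
for n in range(15):
    lhs = np.linalg.matrix_power(P, n)
    rhs = A + (1 - a - b) ** n * B
    assert np.allclose(lhs, rhs)

# Consequence: P^n converges to the matrix with both rows pi = (b, a)/(a+b)
pi = np.array([b, a]) / (a + b)
assert np.allclose(np.linalg.matrix_power(P, 60), np.vstack([pi, pi]))
```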
4. A gambler playing roulette makes a series of 1 dollar bets. He has probability 9/19 and 10/19 of winning
and losing respectively. He decides to quit as soon as he is either 1 dollar ahead or has lost his capital. (a)
Find the probability that when he quits he has lost his capital of 1000 dollars. Note that in this problem
he quits when he reaches either a = 0 or b = 1001. (b) What is his expected return?
5. Consider a birth and death chain on {0, 1, 2, . . .} defined by p_x = (x + 2)/(2(x + 1)) and q_x = x/(2(x + 1)), x ≥ 0.
(a) Compute P_x(T_a < T_b) for a < x < b. (b) Compute ρ_x0, x > 0. (Here a = 0, b = ∞.)
6. Consider a birth and death chain on the nonnegative integers with p_x > 0, x ≥ 0, and q_x > 0, x ≥ 1, and set
γ_y = (q_1 · · · q_y)/(p_1 · · · p_y), γ_0 = 1 (the usual notation). (a) Show that
if Σ_{y=0}^∞ γ_y = ∞ then ρ_x0 = 1, x ≥ 1. (b) Show that if Σ_{y=0}^∞ γ_y < ∞ then ρ_x0 = Σ_{y=x}^∞ γ_y / Σ_{y=0}^∞ γ_y, x ≥ 1.
7. Consider a gambler’s ruin chain on {0, 1, 2, . . .}. (a) Show that if q ≥ p, then ρ_x0 = 1, x ≥ 1. (b) Show
that if q < p then ρ_x0 = (q/p)^x, x ≥ 1.
8. Consider an irreducible birth and death chain on the nonnegative integers. Show that if px ≤ qx for x ≥ 1
then the chain is recurrent.
9. Consider an irreducible birth and death chain on the nonnegative integers such that q_x/p_x = (x/(x + 1))^2, x ≥ 1.
Show that the chain is transient and find ρ_x0, x ≥ 1.
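For this chain γ_y = Π_{x=1}^{y} q_x/p_x = 1/(y + 1)^2, so Σ γ_y = π^2/6 < ∞ and the candidate hitting probabilities are ρ_x0 = Σ_{y≥x} γ_y / Σ_{y≥0} γ_y. A numerical sketch (taking r_x = 0 and p_x + q_x = 1, an illustrative normalization compatible with the given ratio): the candidate ρ satisfies the hitting-probability recursion ρ(x) = q_x ρ(x−1) + p_x ρ(x+1) exactly, even with truncated tail sums.

```python
from math import pi as PI

M = 200_000  # truncation level for the tail sums (illustrative)

# gamma_y = prod_{x=1}^{y} q_x/p_x = 1/(y+1)^2 when q_x/p_x = (x/(x+1))^2
gamma = [1.0 / (y + 1) ** 2 for y in range(M)]

# Tail sums T(x) = sum_{y=x}^{M-1} gamma_y, computed back to front
T = [0.0] * (M + 1)
for y in range(M - 1, -1, -1):
    T[y] = T[y + 1] + gamma[y]

rho = [T[x] / T[0] for x in range(M)]   # candidate rho_x0, with rho[0] = 1

# With r_x = 0: p_x = (x+1)^2 / (x^2 + (x+1)^2), q_x = 1 - p_x
for x in range(1, 100):
    p = (x + 1) ** 2 / (x**2 + (x + 1) ** 2)
    q = 1 - p
    # rho satisfies the hitting-probability recursion (q*gamma_{x-1} = p*gamma_x)
    assert abs(rho[x] - (q * rho[x - 1] + p * rho[x + 1])) < 1e-12

# rho_10 is close to 1 - 6/pi^2 since sum gamma_y = pi^2/6
assert abs(rho[1] - (1 - 6 / PI**2)) < 1e-3
```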
10. In a population each male has 3 children, each child being male with probability 1/2. (a) Find the probability
of extinction of the male line. (b) If a given man has two boys and one girl, what is the probability that his
male line will continue forever?
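In problem 10 the male offspring distribution is Binomial(3, 1/2), so the extinction probability is the smallest nonnegative root of s = Φ(s) with Φ(s) = ((1 + s)/2)^3, and fixed-point iteration from 0 converges to it. A small sketch (the closed-form root √5 − 2 follows from factoring s^3 + 3s^2 − 5s + 1 = (s − 1)(s^2 + 4s − 1)):

```python
# Offspring generating function for Binomial(3, 1/2): Phi(s) = ((1 + s)/2)^3
def phi(s):
    return ((1 + s) / 2) ** 3

# Iterating s <- Phi(s) from 0 converges to the smallest nonnegative fixed
# point, which is the extinction probability starting from one male.
s = 0.0
for _ in range(200):
    s = phi(s)

# Root of s^2 + 4s - 1 = 0 in (0, 1): s = sqrt(5) - 2 ~ 0.236
assert abs(s - (5 ** 0.5 - 2)) < 1e-9

# (b): with two boys, the male line continues forever iff not both lines
# die out, i.e. with probability 1 - s^2 ~ 0.944
prob_forever = 1 - s * s
assert abs(prob_forever - 0.94427) < 1e-4
```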
11. Consider a branching chain such that f (1) < 1. Show that every state other than 0 is transient.
12. Consider a branching chain with f (x) = p(1 − p)x , x ≥ 0, where 0 < p < 1. Show that ρ = 1 if p ≥ 1/2
and ρ = p/(1 − p) if p < 1/2.
13. Let X_n, n ≥ 0 be a branching chain. (a) Show that E_x X_n = x μ^n. (b) If the offspring random variable
has finite variance σ^2, show that E{X_{n+1}^2 | X_n = x} = x σ^2 + x^2 μ^2. (c) Show that E_x X_{n+1}^2 =
x μ^n σ^2 + μ^2 E_x X_n^2.
