
Assignment 1 Answers


E1 222 Stochastic Models and Applications

Problem Sheet #1

1. A fair coin is tossed n times. What is the probability that the difference
between the number of heads and the number of tails is equal to n − 3?

Answer: If there are k heads, the difference between heads and tails is |k − (n −
k)| = |n − 2k|. Since k = 0, 1, · · · , n, the difference can only take the
values n, n − 2, n − 4, · · · , all of which have the same parity as n. Since
n − 3 has the opposite parity, the answer is zero.

2. A box contains coupons labelled 1, 2, 3, · · · , n. Two coupons are drawn
from the box with replacement. Let a, b denote the numbers on the
two coupons. Find the probability that one of a, b divides the other.

Answer: Take a number a between 1 and n. The number of multiples of it
between 1 and n is ⌊n/a⌋ (that is, the greatest integer smaller than or
equal to n/a). Here, we are counting a also as a multiple of a. Let

A = {(a, b) : 1 ≤ a, b ≤ n, a, b integers, b is a multiple of a}

Then |A| = Σ_{k=1}^{n} ⌊n/k⌋. Suppose B is the set of (a, b) where now a is
a multiple of b. The cardinality of B is the same as that of A. The event we
want is A ∪ B. We know A ∩ B is the set of pairs (a, a), which has
cardinality n. Hence, the needed probability is

(2/n²) Σ_{k=1}^{n} ⌊n/k⌋ − 1/n
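As a quick numerical check of this formula (the function names here are ours, purely for illustration), we can compare it against a brute-force count over all n² equally likely ordered pairs:

```python
def divides_prob_brute(n):
    # Direct count over all n^2 equally likely ordered pairs (a, b)
    hits = sum(1 for a in range(1, n + 1) for b in range(1, n + 1)
               if b % a == 0 or a % b == 0)
    return hits / n**2

def divides_prob_formula(n):
    # Formula from the answer: (2/n^2) * sum_{k=1}^{n} floor(n/k) - 1/n
    return 2 * sum(n // k for k in range(1, n + 1)) / n**2 - 1 / n
```

The two agree for every n, confirming that |A ∪ B| = 2 Σ ⌊n/k⌋ − n.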

3. A fair die is rolled repeatedly till we get at least one 5 and one 6.
What is the probability that we need five rolls?

Answer: Let us calculate the probability that n rolls are needed. Let A be the
event that n rolls are needed and the last one is a 5 and let B be the
event that n rolls are needed and the last is a 6. These are mutually
exclusive and their union is what we want. Let Dk be the event of
getting exactly k 6’s out of n − 1 rolls with the remaining being any of
the other four numbers (other than 5 and 6). Then

P(Dk) = (n−1)Ck (1/6)^k (4/6)^(n−1−k)

The events Dk for different k are mutually exclusive. Their union for
k = 1, · · · , n − 1, gives us the event that there are no 5's and at least
one 6 in the first n − 1 rolls. This, along with a 5 on the next roll, gives
us the event A. Hence we have
P(A) = (1/6) Σ_{k=1}^{n−1} P(Dk)

     = (1/6) Σ_{k=1}^{n−1} (n−1)Ck (1/6)^k (4/6)^(n−1−k)

     = (1/6) [ Σ_{k=0}^{n−1} (n−1)Ck (1/6)^k (4/6)^(n−1−k) − (4/6)^(n−1) ]

     = (1/6) [ (1/6 + 4/6)^(n−1) − (4/6)^(n−1) ]

     = (1/6) [ (5/6)^(n−1) − (4/6)^(n−1) ]

(The above expression is reasonable. The first term in the square brack-
ets is the probability of not getting a 5 in the n − 1 rolls. Out of that
we should remove the possibility of getting only the other four numbers,
i.e., no 6 either, on all n − 1 rolls.)
It is easy to see that P(B) would also be the same. Hence the needed
probability is

(1/3) [ (5/6)^(n−1) − (4/6)^(n−1) ]

For n = 5 this gives (1/3)[(5/6)^4 − (4/6)^4] = 41/432 ≈ 0.095, which
answers the question asked.
Can we do some check for the answer? If we sum the above for n =
2, · · · , ∞, then we should get 1. (Note that we need at least two rolls
to get a 5 and a 6.) Indeed, the two geometric series sum to 5 and 2
respectively, so the total is (1/3)(5 − 2) = 1.
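The n = 5 value is easy to confirm by simulation; here is a short Monte Carlo sketch (the function name and seed are ours):

```python
import random

def rolls_needed(rng):
    # Roll a fair die until both a 5 and a 6 have appeared;
    # return how many rolls that took.
    seen5 = seen6 = False
    n = 0
    while not (seen5 and seen6):
        n += 1
        r = rng.randint(1, 6)
        seen5 = seen5 or r == 5
        seen6 = seen6 or r == 6
    return n

rng = random.Random(0)
trials = 100_000
est = sum(rolls_needed(rng) == 5 for _ in range(trials)) / trials
exact = (1/3) * ((5/6)**4 - (4/6)**4)   # = 41/432, about 0.095
```

With 100,000 trials the empirical frequency lands well within sampling error of 41/432.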

4. Suppose E and F are mutually exclusive events of a random experiment.
This random experiment is repeated till either E or F occurs.
Show that the probability that E occurs before F is P(E)/(P(E) +
P(F)).

Answer: We want to split the required event into a union of mutually exclusive
events. Let Bn be the event: "E occurs before F and n repetitions are
needed". These are mutually exclusive for different n. The union over all
n is the event we want. For Bn to occur, neither E nor F should occur in
the first n − 1 repetitions and on the nth trial E should occur. It is

easy to see that P(Bn) = (1 − P(E ∪ F))^(n−1) P(E). Hence the needed
probability is

Σ_{n=1}^{∞} (1 − P(E ∪ F))^(n−1) P(E) = P(E) / (P(E) + P(F))

where we have summed the geometric series and used the fact that
P(E ∪ F) = P(E) + P(F) since E and F are mutually exclusive.
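The result can be checked by simulation; the sketch below uses hypothetical values P(E) = 0.2 and P(F) = 0.3 (our choice, not from the problem):

```python
import random

def e_before_f(p_e, p_f, rng):
    # Repeat the experiment until E or F occurs; report whether E came first.
    while True:
        u = rng.random()
        if u < p_e:
            return True          # E occurred on this repetition
        if u < p_e + p_f:
            return False         # F occurred first
        # neither E nor F happened: repeat the experiment

rng = random.Random(1)
p_e, p_f = 0.2, 0.3              # hypothetical probabilities, for illustration
trials = 100_000
est = sum(e_before_f(p_e, p_f, rng) for _ in range(trials)) / trials
theory = p_e / (p_e + p_f)       # = 0.4
```

The empirical fraction matches P(E)/(P(E) + P(F)) = 0.4 to within sampling error.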

5. Suppose n men put all their hats together in a heap and then each man
selects a hat at random. Show that the probability that none of the n
men selects his own hat is
1/2! − 1/3! + 1/4! − · · · + (−1)ⁿ/n!

Answer: Let Ei denote the event of the ith man getting his own hat. Then
P(Ei) = (n−1)!/n! for all i. This is because, after giving the ith man his
hat, we have (n − 1)! ways of arranging the remaining ones. It is easy to
see that P(Ei ∩ Ej) = (n−2)!/n! because after giving i and j their hats,
we can arrange the remaining in (n − 2)! ways. By the same argument,
P(Ei ∩ Ej ∩ Ek) = (n−3)!/n!, and so on.
Now consider E = E1 ∪ · · · ∪ En . This denotes the event of at least
one of the men getting his own hat. Hence, the probability we want
is (1 − P (E)). We can calculate P (E) using the general formula for
unions.
P(∪_{i=1}^{n} Ei) = Σ_i P(Ei) − Σ_{i<j} P(Ei Ej) + Σ_{i<j<k} P(Ei Ej Ek) − · · ·

= n (n−1)!/n! − nC2 (n−2)!/n! + nC3 (n−3)!/n! − · · ·

= 1 − 1/2! + 1/3! − · · ·
The last term in the above would involve P(E1 E2 · · · En), which is 1/n!.
The probability we want is 1 − P(E) = 1/2! − 1/3! + 1/4! − · · · + (−1)ⁿ/n!,
as required.
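For small n the alternating series can be checked against a brute-force enumeration of all n! hat assignments (the function names below are ours):

```python
from itertools import permutations
from math import factorial

def no_own_hat_prob_brute(n):
    # Fraction of the n! hat assignments in which no man gets his own hat
    count = sum(all(p[i] != i for i in range(n))
                for p in permutations(range(n)))
    return count / factorial(n)

def no_own_hat_prob_formula(n):
    # Alternating series from the answer: 1/2! - 1/3! + ... + (-1)^n/n!
    return sum((-1)**k / factorial(k) for k in range(2, n + 1))
```

For n = 3, both give 2/6 = 1/3: of the six arrangements, only the two 3-cycles leave no man with his own hat.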

6. Suppose there are three special dice, A, B, C which have the following
numbers on their six faces:

A: 1, 1, 6, 6, 8, 8
B: 2, 2, 4, 4, 9, 9
C: 3, 3, 5, 5, 7, 7

The dice are fair in the sense that each of the faces has the same
probability of coming up.
(i) Suppose we roll dice A and B. What is the probability that the
number that comes up on A is less than the one that comes up on B?
(ii) Suppose your friend, with whom you go out for dinner often, offers
you the following. At the end of each dinner, you choose any one of the
three dice that you want. She/he would then choose one of the two
dice that are remaining. Then both of you roll your respective dice.
Whoever gets the smaller number would pay for the dinner. Would you
take the offer?

Answer: (i). The number on A would be less than that on B if we get a 1 on A
(and anything on B) or if we get a 6 or 8 on A and a 9 on B. Hence this
probability is (1/3) + (2/3)(1/3) = 5/9.
(ii). For the second part, what you should be worried about is whether
you would be paying for more than 50% of the dinners. Let [A < B]
denote the event that when we roll A and B, the number on A is less
than that on B, and similarly for the others. Then it is easy to see that

P [A < B] = P [B < C] = P [C < A] = 5/9

Hence, if you choose A your friend can choose B, if you choose B she/he
can choose C and if you choose C, she/he can choose A to ensure that
you have more than 50% chance of paying for the dinner.
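The non-transitive cycle A < B, B < C, C < A is easy to verify by enumerating all 36 equally likely face pairs (the helper name is ours):

```python
from itertools import product

# Faces of the three special dice from the problem
A = [1, 1, 6, 6, 8, 8]
B = [2, 2, 4, 4, 9, 9]
C = [3, 3, 5, 5, 7, 7]

def p_less(x, y):
    # P[number on die x < number on die y] for independent fair rolls
    return sum(a < b for a, b in product(x, y)) / (len(x) * len(y))
```

Each of p_less(A, B), p_less(B, C), and p_less(C, A) comes out to 5/9, so whichever die you pick, your friend has a die that beats it more than half the time.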

7. Consider a communication system. The transmitter sends one of two
waveforms. One waveform represents the symbol 0 and the other
represents the symbol 1. Due to the noise in the channel, the receiver
cannot say with certainty what was sent. The receiver is designed so
that, after sensing signal coming out of the noisy channel, it puts out
one of the three symbols: a, b, c. The following statistical parameters
of the system are determined (either through modeling or experimen-
tation):
P [a|1] = 0.6, P [b|1] = 0.2, P [c|1] = 0.2
P [a|0] = 0.3, P [b|0] = 0.4, P [c|0] = 0.3

Here, P[a|0] denotes the probability of the receiver putting out symbol
a when the symbol transmitted is 0 and similarly for all others. The
transmitter sends the two symbols with probabilities: P [0] = 0.4 and
P [1] = 0.6. Find P [1|a] and P [0|a]. When receiver puts out a what
should we conclude about the symbol sent? We would like to build a
decision device that will observe the receiver output (that is, a, b, or
c) and decide whether a 0 was sent or a 1 was sent. An error occurs
if the decision device says 1 when a 0 was sent or vice versa. Find
a decision rule that minimizes the probability of error. What is the
resulting (minimum) probability of error?

Answer: By Bayes rule we have

P[1|a] = P[a|1]P[1] / (P[a|1]P[1] + P[a|0]P[0]) = (0.6 × 0.6) / (0.6 × 0.6 + 0.3 × 0.4) = 3/4

and hence P[0|a] = 1/4. Similarly we can calculate that P[1|b] = 3/7
and P[1|c] = 1/2. Hence, intuitively, if we observe a we should say a 1
is sent; if we observe b we should say a 0 is sent; and when we observe
c we can say either 0 or 1.
Let h : {a, b, c} → {0, 1} denote a decision function (decision device).
That means when we observe a we output h(a) as the bit received. We
want to calculate probability of error for this decision function. For a
decision function h, given we observe a, the probability of error is same
as probability of sent bit being complement of h(a) (given we observe
a). We have

P [error] = P [error|a]P [a] + P [error|b]P [b] + P [error|c]P [c]


= P [(h(a))c |a]P [a] + P [(h(b))c |b]P [b] + P [(h(c))c |c]P (c)

This is the probability of error for any decision function. We need
to choose h to reduce this. We note the following. In the first term
P [(h(a))c |a] would be either P [1|a] or P [0|a] based on our choice of h.
All we can do is to choose h to reduce this contribution to error. That
is, if P [0|a] < P [1|a] then we want (h(a))c to be 0. So, the optimal
decision function is:

h∗(a) = 1 if P[1|a] > P[0|a];  h∗(a) = 0 if P[1|a] ≤ P[0|a],
and similarly for h∗ (b) and h∗ (c). Thus, h∗ (a) = 1, h∗ (b) = 0 and
h∗ (c) = 0. The probability of error for this decision function is

P [error] = P [0|a]P [a] + P [1|b]P [b] + P [1|c]P [c]

We can calculate P[a] = P[a|1]P[1] + P[a|0]P[0] = 0.48, and similarly
P[b] = 0.28 and P[c] = 0.24. This gives the probability of error as
(1/4)(0.48) + (3/7)(0.28) + (1/2)(0.24) = 0.36.
Note that the optimal decision function here is not unique. If we had
taken h∗ (c) = 1, then also we get the same probability of error.
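The whole computation, Bayes rule plus the MAP decision rule, fits in a few lines of Python (variable and function names are ours; breaking the tie at c in favour of 0 matches the choice h∗(c) = 0 above):

```python
# Channel statistics from the problem: P[output symbol | sent bit]
likelihood = {
    ('a', 1): 0.6, ('b', 1): 0.2, ('c', 1): 0.2,
    ('a', 0): 0.3, ('b', 0): 0.4, ('c', 0): 0.3,
}
prior = {0: 0.4, 1: 0.6}

def posterior(bit, symbol):
    # P[bit | symbol] via Bayes rule
    num = likelihood[(symbol, bit)] * prior[bit]
    den = sum(likelihood[(symbol, b)] * prior[b] for b in (0, 1))
    return num / den

# MAP rule: for each symbol decide the bit with the larger posterior.
# (max returns the first argument on ties, so the tie at c resolves to 0.)
h = {s: max((0, 1), key=lambda b: posterior(b, s)) for s in 'abc'}

p_symbol = {s: sum(likelihood[(s, b)] * prior[b] for b in (0, 1)) for s in 'abc'}
p_error = sum(posterior(1 - h[s], s) * p_symbol[s] for s in 'abc')
```

Running this reproduces h∗(a) = 1, h∗(b) = 0, h∗(c) = 0 and the minimum error probability 0.36.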

8. At a telephone exchange, the probability of receiving k calls in a time
interval of two minutes is given by the function h(2, k). Assume that
the event of receiving k1 calls in a time interval I1 is independent of
the event of receiving k2 calls in a time interval I2 , for all k1 and k2
whenever the intervals I1 and I2 do not overlap. Find an expression
for the probability of receiving s calls in 4 minutes in terms of h(2, k).
Now suppose h(2, k) is given by

h(2, k) = (2a)^k e^(−2a) / k!
Now find the probability of s calls in 4 minutes.

Answer: The event of getting k calls in a 4-minute interval can be written as
a mutually exclusive union of the events "getting s calls in the first
two minutes and k − s calls in the next two minutes" for s = 0, · · · , k.
We know events of calls in non-overlapping intervals are independent.
Hence
P[k calls in 4 min] = Σ_{s=0}^{k} P[s calls in 1st 2 min and k − s calls in next 2 min]

= Σ_{s=0}^{k} h(2, s) h(2, k − s)

Now, substituting for h(2, s),

P[k calls in 4 min] = Σ_{s=0}^{k} [(2a)^s e^(−2a)/s!] [(2a)^(k−s) e^(−2a)/(k − s)!]

= e^(−4a) (2a)^k Σ_{s=0}^{k} (1/(s!(k − s)!)) (k!/k!)

= e^(−4a) (2a)^k (1/k!) Σ_{s=0}^{k} k!/(s!(k − s)!)

= e^(−4a) (2a)^k (1/k!) (1 + 1)^k = (4a)^k e^(−4a)/k!
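The identity, that the convolution of two Poisson pmfs with mean 2a is Poisson with mean 4a, can be verified numerically; the sketch below uses an arbitrary a = 1.5 and function names of our choosing:

```python
from math import exp, factorial

def h(t, k, a):
    # Poisson pmf: probability of k calls in t minutes, mean a*t,
    # so h(2, k, a) matches the problem's h(2, k) = (2a)^k e^(-2a)/k!
    return (a * t)**k * exp(-a * t) / factorial(k)

def four_min_prob(k, a):
    # Convolution over the two non-overlapping 2-minute sub-intervals
    return sum(h(2, s, a) * h(2, k - s, a) for s in range(k + 1))
```

For every k, four_min_prob(k, a) equals h(4, k, a) = (4a)^k e^(−4a)/k!, as derived above.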

9. There is a component manufacturing facility where 5% of the products
may be faulty. The factory wants to pack the components into boxes
so that it can guarantee that 99% of the boxes have at least 100 good
components. What is the minimum number of components they should
put into each box?

Answer: Suppose we put 100 + x components in each box. Let p denote the
probability of any component being bad. We have p = 0.05. Assuming
that the events of different components being bad are independent, this
is like having 100 + x independent tosses and wanting at most x tails
(of a coin with probability of tails being p).
P[at most x bad ones] = Σ_{k=0}^{x} (100+x)Ck p^k (1 − p)^(100+x−k)

We need to fix x so that the above is greater than or equal to 0.99.
This is computationally hard by hand. Later we will see how to approximate
the binomial distribution with the Poisson or normal so that we can compute
such things easily.
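Though tedious by hand, the smallest x is easy to find by direct computation; here is a short sketch (the function name is ours):

```python
from math import comb

def p_at_most_x_bad(x, p=0.05):
    # P[at most x bad components in a box of 100 + x], binomial model
    n = 100 + x
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(x + 1))

# smallest x meeting the 99% guarantee
x = 0
while p_at_most_x_bad(x) < 0.99:
    x += 1
```

The loop stops at the first x for which the guarantee holds, so each box should contain 100 + x components.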
