
184 General Random Variables Chap. 3

P R O B L E M S

SECTION 3.1. Continuous Random Variables and PDFs

Problem 1. Let X be uniformly distributed in the unit interval [0, 1]. Consider the
random variable Y = g(X), where

g(x) = { 1,  if x ≤ 1/3,
       { 2,  if x > 1/3.

Find the expected value of Y by first deriving its PMF. Verify the result using the
expected value rule.

Problem 2. Laplace random variable. Let X have the PDF

f_X(x) = (λ/2) e^{−λ|x|},

where λ is a positive scalar. Verify that f_X satisfies the normalization condition, and
evaluate the mean and variance of X .

Problem 3.* Show that the expected value of a discrete or continuous random variable X satisfies

E[X] = ∫₀^∞ P(X > x) dx − ∫₀^∞ P(X < −x) dx.

Solution. Suppose that X is continuous. We then have

∫₀^∞ P(X > x) dx = ∫₀^∞ ∫_x^∞ f_X(y) dy dx = ∫₀^∞ ∫₀^y dx f_X(y) dy = ∫₀^∞ y f_X(y) dy,

where for the second equality we have reversed the order of integration by writing the
set {(x, y) | 0 ≤ x < ∞, x ≤ y < ∞} as {(x, y) | 0 ≤ x ≤ y, 0 ≤ y < ∞}. Similarly, we
can show that

∫₀^∞ P(X < −x) dx = −∫_{−∞}^0 y f_X(y) dy.

Combining the two relations above, we obtain the desired result.


If X is discrete, we have

∫₀^∞ P(X > x) dx = ∫₀^∞ Σ_{y>x} p_X(y) dx
                 = Σ_{y>0} ( ∫₀^y p_X(y) dx )
                 = Σ_{y>0} p_X(y) ( ∫₀^y dx )
                 = Σ_{y>0} y p_X(y),

and the rest of the argument is similar to the continuous case.
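As an illustration (ours, not the book's), the identity can be checked numerically for a specific distribution. The sketch below takes X uniform on [−1, 2], for which E[X] = 1/2, and approximates both tail integrals with a Riemann sum; the grid step is an arbitrary choice.

```python
# Numerical check (illustrative sketch) of
#   E[X] = integral_0^inf P(X > x) dx - integral_0^inf P(X < -x) dx
# for X uniform on [-1, 2], whose mean is 1/2.

def tail_positive(x):
    # P(X > x) for X ~ Uniform[-1, 2] and x >= 0
    return max(0.0, (2.0 - x) / 3.0)

def tail_negative(x):
    # P(X < -x) for the same X and x >= 0
    return max(0.0, (1.0 - x) / 3.0)

dx = 1e-4
grid = [i * dx for i in range(30_000)]  # covers [0, 3); both tails vanish beyond
estimate = sum((tail_positive(x) - tail_negative(x)) * dx for x in grid)
print(estimate)  # close to 0.5
```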


Problem 4.* Establish the validity of the expected value rule

E[g(X)] = ∫_{−∞}^∞ g(x) f_X(x) dx,

where X is a continuous random variable with PDF f_X.


Solution. Let us express the function g as the difference of two nonnegative functions,

g(x) = g⁺(x) − g⁻(x),

where g⁺(x) = max{g(x), 0} and g⁻(x) = max{−g(x), 0}. In particular, for any t ≥ 0,
we have g(x) > t if and only if g⁺(x) > t.

We will use the result

E[g(X)] = ∫₀^∞ P(g(X) > t) dt − ∫₀^∞ P(g(X) < −t) dt

from the preceding problem. The first term in the right-hand side is equal to

∫₀^∞ ∫_{{x | g(x) > t}} f_X(x) dx dt = ∫_{−∞}^∞ ∫_{{t | 0 ≤ t < g⁺(x)}} f_X(x) dt dx = ∫_{−∞}^∞ g⁺(x) f_X(x) dx.

By a symmetrical argument, the second term in the right-hand side is given by

∫₀^∞ P(g(X) < −t) dt = ∫_{−∞}^∞ g⁻(x) f_X(x) dx.

Combining the above equalities, we obtain

E[g(X)] = ∫_{−∞}^∞ g⁺(x) f_X(x) dx − ∫_{−∞}^∞ g⁻(x) f_X(x) dx = ∫_{−∞}^∞ g(x) f_X(x) dx.



SECTION 3.2. Cumulative Distribution Functions


Problem 5. Consider a triangle and a point chosen within the triangle according to
the uniform probability law. Let X be the distance from the point to the base of the
triangle. Given the height of the triangle, find the CDF and the PDF of X.

Problem 6. Calamity Jane goes to the bank to make a withdrawal, and is equally
likely to find 0 or 1 customers ahead of her. The service time of the customer ahead,
if present, is exponentially distributed with parameter λ. What is the CDF of Jane's
waiting time?

Problem 7. Alvin throws darts at a circular target of radius r and is equally likely
to hit any point in the target. Let X be the distance of Alvin's hit from the center.
(a) Find the PDF, the mean, and the variance of X.
(b) The target has an inner circle of radius t. If X ≤ t, Alvin gets a score of S = 1/X.
Otherwise his score is S = 0. Find the CDF of S. Is S a continuous random
variable?

Problem 8. Consider two continuous random variables Y and Z, and a random
variable X that is equal to Y with probability p and to Z with probability 1 − p.
(a) Show that the PDF of X is given by

f_X(x) = p f_Y(x) + (1 − p) f_Z(x).

(b) Calculate the CDF of the two-sided exponential random variable that has PDF
given by

f_X(x) = { (1 − p)λ e^{λx},  if x < 0,
         { p λ e^{−λx},      if x ≥ 0,

where λ > 0 and 0 < p < 1.


Problem 9.* Mixed random variables. Probabilistic models sometimes involve
random variables that can be viewed as a mixture of a discrete random variable Y and
a continuous random variable Z. By this we mean that the value of X is obtained
according to the probability law of Y with a given probability p, and according to the
probability law of Z with the complementary probability 1 − p. Then, X is called a
mixed random variable and its CDF is given, using the total probability theorem, by

F_X(x) = P(X ≤ x)
       = p P(Y ≤ x) + (1 − p) P(Z ≤ x)
       = p F_Y(x) + (1 − p) F_Z(x).

Its expected value is defined in a way that conforms to the total expectation theorem:

E[X] = p E[Y] + (1 − p) E[Z].

The taxi stand and the bus stop near Al's home are in the same location. Al goes
there at a given time and if a taxi is waiting (this happens with probability 2/3) he
boards it. Otherwise he waits for a taxi or a bus to come, whichever comes first. The
next taxi will arrive in a time that is uniformly distributed between 0 and 10 minutes,
while the next bus will arrive in exactly 5 minutes. Find the CDF and the expected
value of Al's waiting time.
Solution. Let A be the event that Al will find a taxi waiting or will be picked up by
the bus after 5 minutes. Note that the probability of boarding the next bus, given that
Al has to wait, is

P(a taxi will take more than 5 minutes to arrive) = 1/2.

Al's waiting time, call it X, is a mixed random variable. With probability

P(A) = 2/3 + (1/3)·(1/2) = 5/6,

it is equal to its discrete component Y (corresponding to either finding a taxi waiting,
or boarding the bus), which has PMF

p_Y(y) = { 2/(3P(A)),  if y = 0,      = { 12/15,  if y = 0,
         { 1/(6P(A)),  if y = 5,        { 3/15,   if y = 5.

[This equation follows from the calculation

p_Y(0) = P(Y = 0 | A) = P(Y = 0, A)/P(A) = 2/(3P(A)).

The calculation for p_Y(5) is similar.] With the complementary probability 1 − P(A),
the waiting time is equal to its continuous component Z (corresponding to boarding a
taxi after having to wait for some time less than 5 minutes), which has PDF

f_Z(z) = { 1/5,  if 0 ≤ z ≤ 5,
         { 0,    otherwise.

The CDF is given by F_X(x) = P(A)F_Y(x) + (1 − P(A))F_Z(x), from which

F_X(x) = { 0,                             if x < 0,
         { (5/6)·(12/15) + (1/6)·(x/5),   if 0 ≤ x < 5,
         { 1,                             if 5 ≤ x.

The expected value of the waiting time is

E[X] = P(A)E[Y] + (1 − P(A))E[Z] = (5/6)·(3·5/15) + (1/6)·(5/2) = 15/12.
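The value 15/12 can be corroborated with a small Monte Carlo experiment (our own sketch, not part of the text; the sample size and seed are arbitrary choices).

```python
import random

# Simulate Al's waiting time: with probability 2/3 a taxi is already waiting
# (wait 0); otherwise the next taxi arrives Uniform[0, 10] minutes from now,
# the bus arrives in exactly 5 minutes, and Al takes whichever comes first.
random.seed(0)
n = 200_000
total = 0.0
for _ in range(n):
    if random.random() < 2 / 3:
        wait = 0.0                    # taxi waiting on arrival
    else:
        taxi = random.uniform(0, 10)  # next taxi arrival time
        wait = min(taxi, 5.0)         # bus comes at t = 5
    total += wait
avg = total / n
print(avg)  # should be near 15/12 = 1.25
```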

Problem 10.* Simulating a continuous random variable. A computer has a
subroutine that can generate values of a random variable U that is uniformly distributed
in the interval [0, 1]. Such a subroutine can be used to generate values of a continuous
random variable with given CDF F(x) as follows. If U takes a value u, we let the value
of X be a number x that satisfies F(x) = u. For simplicity, we assume that the given
CDF is strictly increasing over the range S of values of interest, where S = {x | 0 <
F(x) < 1}. This condition guarantees that for any u ∈ (0, 1), there is a unique x that
satisfies F(x) = u.
(a) Show that the CDF of the random variable X thus generated is indeed equal to
the given CDF.
(b) Describe how this procedure can be used to simulate an exponential random
variable with parameter λ.
(c) How can this procedure be generalized to simulate a discrete integer-valued random variable?
Solution. (a) By definition, the random variables X and U satisfy the relation F(X) =
U. Since F is strictly increasing, we have for every x,

X ≤ x if and only if F(X) ≤ F(x).

Therefore,

P(X ≤ x) = P(F(X) ≤ F(x)) = P(U ≤ F(x)) = F(x),

where the last equality follows because U is uniform. Thus, X has the desired CDF.
(b) The exponential CDF has the form F(x) = 1 − e^{−λx} for x ≥ 0. Thus, to generate
values of X, we should generate values u ∈ (0, 1) of a uniformly distributed random
variable U, and set X to the value for which 1 − e^{−λx} = u, or x = −ln(1 − u)/λ.
(c) Let again F be the desired CDF. To any u ∈ (0, 1), there corresponds a unique
integer x_u such that F(x_u − 1) < u ≤ F(x_u). This correspondence defines a random
variable X as a function of the random variable U. We then have, for every integer k,

P(X = k) = P(F(k − 1) < U ≤ F(k)) = F(k) − F(k − 1).

Therefore, the CDF of X is equal to F, as desired.
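Part (b) translates directly into code. The sketch below (our illustration; the names, parameter, and seed are arbitrary) draws exponential samples by inverting the CDF and checks the sample mean against 1/λ.

```python
import math
import random

def exponential_sample(lam, rng):
    # Inverse-transform step from part (b): solve 1 - exp(-lam * x) = u for x.
    u = rng.random()                  # u ~ Uniform[0, 1)
    return -math.log(1.0 - u) / lam

rng = random.Random(1)
lam = 2.0
samples = [exponential_sample(lam, rng) for _ in range(100_000)]
mean = sum(samples) / len(samples)
print(mean)  # should approach 1/lam = 0.5
```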

SECTION 3.3. Normal Random Variables


Problem 11. Let X and Y be normal random variables with means 0 and 1, respectively, and variances 1 and 4, respectively.
(a) Find P(X ≤ 1.5) and P(X ≤ −1).
(b) Find the PDF of (Y − 1)/2.
(c) Find P(−1 ≤ Y ≤ 1).

Problem 12. Let X be a normal random variable with zero mean and standard
deviation σ. Use the normal tables to compute the probabilities of the events {X ≥ kσ}
and {|X| ≤ kσ} for k = 1, 2, 3.

Problem 13. A city's temperature is modeled as a normal random variable with mean
and standard deviation both equal to 10 degrees Celsius. What is the probability that
the temperature at a randomly chosen time will be less than or equal to 59 degrees
Fahrenheit?

Problem 14.* Show that the normal PDF satisfies the normalization property. Hint:
The integral ∫_{−∞}^∞ e^{−x²/2} dx is equal to the square root of

∫_{−∞}^∞ ∫_{−∞}^∞ e^{−x²/2} e^{−y²/2} dx dy,

and the latter integral can be evaluated by transforming to polar coordinates.


Solution. We note that

( ∫_{−∞}^∞ (1/√(2π)) e^{−x²/2} dx )²
   = ∫_{−∞}^∞ ∫_{−∞}^∞ (1/(2π)) e^{−x²/2} e^{−y²/2} dx dy
   = (1/(2π)) ∫_{−∞}^∞ ∫_{−∞}^∞ e^{−(x²+y²)/2} dx dy
   = (1/(2π)) ∫₀^{2π} ∫₀^∞ e^{−r²/2} r dr dθ
   = ∫₀^∞ e^{−r²/2} r dr
   = ∫₀^∞ e^{−u} du
   = 1,

where for the third equality, we use a transformation into polar coordinates, and for
the fifth equality, we use the change of variables u = r²/2. Thus, we have

∫_{−∞}^∞ (1/√(2π)) e^{−x²/2} dx = 1,

because the integral is positive. Using the change of variables u = (x − μ)/σ, it follows
that

∫_{−∞}^∞ f_X(x) dx = ∫_{−∞}^∞ (1/(√(2π)σ)) e^{−(x−μ)²/(2σ²)} dx = ∫_{−∞}^∞ (1/√(2π)) e^{−u²/2} du = 1.

SECTION 3.4. Joint PDFs of Multiple Random Variables


Problem 15. A point is chosen at random (according to a uniform PDF) within a
semicircle of the form {(x, y) | x² + y² ≤ r², y ≥ 0}, for some given r > 0.
(a) Find the joint PDF of the coordinates X and Y of the chosen point.

(b) Find the marginal PDF of Y and use it to find E[Y].
(c) Check your answer in (b) by computing E[Y] directly without using the marginal
PDF of Y.

Problem 16. Consider the following variant of Buffon's needle problem (Example
3.11), which was investigated by Laplace. A needle of length l is dropped on a plane
surface that is partitioned in rectangles by horizontal lines that are a apart and vertical
lines that are b apart. Suppose that the needle's length l satisfies l < a and l < b. What
is the expected number of rectangle sides crossed by the needle? What is the probability
that the needle will cross at least one side of some rectangle?

Problem 17.* Estimating an expected value by simulation using samples of
another random variable. Let Y₁, . . . , Yₙ be independent random variables drawn
from a common and known PDF f_Y. Let S be the set of all possible values of Y_i,
S = {y | f_Y(y) > 0}. Let X be a random variable with known PDF f_X, such that
f_X(y) = 0, for all y ∉ S. Consider the random variable

Z = (1/n) Σ_{i=1}^n Y_i f_X(Y_i)/f_Y(Y_i).

Show that
E[Z] = E[X].
Solution. We have

E[ Y_i f_X(Y_i)/f_Y(Y_i) ] = ∫_S y ( f_X(y)/f_Y(y) ) f_Y(y) dy = ∫_S y f_X(y) dy = E[X].

Thus,

E[Z] = (1/n) Σ_{i=1}^n E[ Y_i f_X(Y_i)/f_Y(Y_i) ] = (1/n) Σ_{i=1}^n E[X] = E[X].
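A numerical illustration of this estimator (ours, with distributions chosen for convenience): take f_X exponential with parameter 1, so E[X] = 1, and draw the Y_i from an exponential with parameter 1/2.

```python
import math
import random

def f_X(y):
    return math.exp(-y)               # target PDF: Exponential(1), E[X] = 1

def f_Y(y):
    return 0.5 * math.exp(-0.5 * y)   # sampling PDF: Exponential(1/2)

rng = random.Random(2)
n = 200_000
z = 0.0
for _ in range(n):
    y = rng.expovariate(0.5)          # draw Y_i from f_Y
    z += y * f_X(y) / f_Y(y)          # weighted term Y_i f_X(Y_i)/f_Y(Y_i)
z /= n
print(z)  # should be near E[X] = 1
```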
SECTION 3.5. Conditioning
Problem 18. Let X be a random variable with PDF

f_X(x) = { x/4,  if 1 < x ≤ 3,
         { 0,    otherwise,

and let A be the event {X ≥ 2}.
(a) Find E[X], P(A), f_{X|A}(x), and E[X | A].
(b) Let Y = X². Find E[Y] and var(Y).

Problem 19. The random variable X has the PDF

f_X(x) = { cx^{−2},  if 1 ≤ x ≤ 2,
         { 0,        otherwise.

(a) Determine the value of c.
(b) Let A be the event {X > 1.5}. Calculate P(A) and the conditional PDF of X
given that A has occurred.
(c) Let Y = X². Calculate the conditional expectation and the conditional variance
of Y given A.

Problem 20. An absent-minded professor schedules two student appointments for the
same time. The appointment durations are independent and exponentially distributed
with mean thirty minutes. The first student arrives on time, but the second student
arrives five minutes late. What is the expected value of the time between the arrival
of the first student and the departure of the second student?

Problem 21. We start with a stick of length ℓ. We break it at a point which is chosen
according to a uniform distribution and keep the piece, of length Y, that contains the
left end of the stick. We then repeat the same process on the piece that we were left
with, and let X be the length of the remaining piece after breaking for the second time.
(a) Find the joint PDF of Y and X.
(b) Find the marginal PDF of X.
(c) Use the PDF of X to evaluate E[X].
(d) Evaluate E[X], by exploiting the relation X = Y · (X/Y).

Problem 22. We have a stick of unit length, and we consider breaking it in three
pieces using one of the following three methods.
(i) We choose randomly and independently two points on the stick using a uniform
PDF, and we break the stick at these two points.
(ii) We break the stick at a random point chosen by using a uniform PDF, and then
we break the piece that contains the right end of the stick, at a random point
chosen by using a uniform PDF.
(iii) We break the stick at a random point chosen by using a uniform PDF, and then
we break the larger of the two pieces at a random point chosen by using a uniform
PDF.
For each of the methods (i), (ii), and (iii), what is the probability that the three pieces
we are left with can form a triangle?

Problem 23. Let the random variables X and Y have a joint PDF which is uniform
over the triangle with vertices at (0, 0), (0, 1), and (1, 0).
(a) Find the joint PDF of X and Y.
(b) Find the marginal PDF of Y.
(c) Find the conditional PDF of X given Y.
(d) Find E[X | Y = y], and use the total expectation theorem to find E[X] in terms
of E[Y].
(e) Use the symmetry of the problem to find the value of E[X].

Problem 24. Let X and Y be two random variables that are uniformly distributed
over the triangle formed by the points (0, 0), (1, 0), and (0, 2) (this is an asymmetric
version of the PDF in the previous problem). Calculate E[X] and E[Y] by following
the same steps as in the previous problem.

Problem 25. The coordinates X and Y of a point are independent zero mean normal
random variables with common variance σ². Given that the point is at a distance of
at least c from the origin, find the conditional joint PDF of X and Y.

Problem 26.* Let X₁, . . . , Xₙ be independent random variables. Show that

var(∏_{i=1}^n X_i) / ∏_{i=1}^n (E[X_i])² = ∏_{i=1}^n ( var(X_i)/(E[X_i])² + 1 ) − 1.

Solution. We have

var(∏_{i=1}^n X_i) = E[ (∏_{i=1}^n X_i)² ] − ( ∏_{i=1}^n E[X_i] )²
                   = ∏_{i=1}^n E[X_i²] − ∏_{i=1}^n (E[X_i])²
                   = ∏_{i=1}^n ( var(X_i) + (E[X_i])² ) − ∏_{i=1}^n (E[X_i])².

The desired result follows by dividing both sides by ∏_{i=1}^n (E[X_i])².

Problem 27.* Conditioning multiple random variables on events. Let X
and Y be continuous random variables with joint PDF f_{X,Y}, let A be a subset of the
two-dimensional plane, and let C = {(X, Y) ∈ A}. Assume that P(C) > 0, and define

f_{X,Y|C}(x, y) = { f_{X,Y}(x, y)/P(C),  if (x, y) ∈ A,
                  { 0,                   otherwise.

(a) Show that f_{X,Y|C} is a legitimate joint PDF.
(b) Consider a partition of the two-dimensional plane into disjoint subsets A_i, i =
1, . . . , n, let C_i = {(X, Y) ∈ A_i}, and assume that P(C_i) > 0 for all i. Derive
the following version of the total probability theorem:

f_{X,Y}(x, y) = Σ_{i=1}^n P(C_i) f_{X,Y|C_i}(x, y).

Problem 28.* Consider the following two-sided exponential PDF

f_X(x) = { pλ e^{−λx},      if x ≥ 0,
         { (1 − p)λ e^{λx}, if x < 0,

where λ and p are scalars with λ > 0 and p ∈ [0, 1]. Find the mean and the variance
of X in two ways:
(a) By straightforward calculation of the associated expected values.
(b) By using a divide-and-conquer strategy, and the mean and variance of the (one-sided) exponential random variable.
Solution. (a) We have

E[X] = ∫_{−∞}^∞ x f_X(x) dx
     = ∫_{−∞}^0 x(1 − p)λ e^{λx} dx + ∫₀^∞ x pλ e^{−λx} dx
     = −(1 − p)/λ + p/λ
     = (2p − 1)/λ,

E[X²] = ∫_{−∞}^∞ x² f_X(x) dx
      = ∫_{−∞}^0 x²(1 − p)λ e^{λx} dx + ∫₀^∞ x² pλ e^{−λx} dx
      = 2(1 − p)/λ² + 2p/λ²
      = 2/λ²,

and

var(X) = 2/λ² − ((2p − 1)/λ)².
(b) Let A be the event {X ≥ 0}, and note that P(A) = p. Conditioned on A, the
random variable X has a (one-sided) exponential distribution with parameter λ. Also,
conditioned on A^c, the random variable −X has the same one-sided exponential distribution. Thus,

E[X | A] = 1/λ,   E[X² | A] = 2/λ²,

and

E[X | A^c] = −1/λ,   E[X² | A^c] = 2/λ².

It follows that

E[X] = P(A)E[X | A] + P(A^c)E[X | A^c] = p/λ − (1 − p)/λ = (2p − 1)/λ,

E[X²] = P(A)E[X² | A] + P(A^c)E[X² | A^c] = 2p/λ² + 2(1 − p)/λ² = 2/λ²,

and

var(X) = 2/λ² − ((2p − 1)/λ)².
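Both answers can be checked by simulation (our sketch; p, λ, the seed, and the sample size are arbitrary choices): draw a one-sided exponential and flip its sign with probability 1 − p.

```python
import random

rng = random.Random(3)
p, lam = 0.7, 2.0
n = 200_000
total = total_sq = 0.0
for _ in range(n):
    draw = rng.expovariate(lam)            # one-sided Exponential(lam)
    x = draw if rng.random() < p else -draw
    total += x
    total_sq += x * x
mean = total / n
var = total_sq / n - mean ** 2
print(mean, var)  # near (2p-1)/lam = 0.2 and 2/lam**2 - 0.2**2 = 0.46
```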

Problem 29.* Let X, Y, and Z be three random variables with joint PDF f_{X,Y,Z}.
Show the multiplication rule:

f_{X,Y,Z}(x, y, z) = f_{X|Y,Z}(x | y, z) f_{Y|Z}(y | z) f_Z(z).

Solution. We have, using the definition of conditional density,

f_{X|Y,Z}(x | y, z) = f_{X,Y,Z}(x, y, z)/f_{Y,Z}(y, z),

and

f_{Y,Z}(y, z) = f_{Y|Z}(y | z) f_Z(z).

Combining these two relations, we obtain the multiplication rule.
Problem 30.* The Beta PDF. The beta PDF with parameters α > 0 and β > 0
has the form

f_X(x) = { (1/B(α, β)) x^{α−1}(1 − x)^{β−1},  if 0 ≤ x ≤ 1,
         { 0,                                 otherwise.

The normalizing constant is

B(α, β) = ∫₀^1 x^{α−1}(1 − x)^{β−1} dx,

and is known as the Beta function.
(a) Show that for any m > 0, the mth moment of X is given by

E[X^m] = B(α + m, β)/B(α, β).

(b) Assume that α and β are integer. Show that

B(α, β) = (α − 1)! (β − 1)!/(α + β − 1)!,

so that

E[X^m] = α(α + 1)···(α + m − 1)/((α + β)(α + β + 1)···(α + β + m − 1)).

(Recall here the convention that 0! = 1.)


Solution. (a) We have

E[X^m] = (1/B(α, β)) ∫₀^1 x^m x^{α−1}(1 − x)^{β−1} dx = B(α + m, β)/B(α, β).

(b) In the special case where α = 1 or β = 1, we can carry out the straightforward
integration in the definition of B(α, β), and verify the result. We will now deal with
the general case. Let Y, Y₁, . . . , Y_{α+β} be independent random variables, uniformly distributed over the interval [0, 1], and let A be the event

A = { Y₁ ≤ ··· ≤ Y_α ≤ Y ≤ Y_{α+1} ≤ ··· ≤ Y_{α+β} }.

Then,

P(A) = 1/(α + β + 1)!,

because all ways of ordering these α + β + 1 random variables are equally likely.
Consider the following two events:

B = { max{Y₁, . . . , Y_α} ≤ Y },   C = { Y ≤ min{Y_{α+1}, . . . , Y_{α+β}} }.

We have, using the total probability theorem,

P(B ∩ C) = ∫₀^1 P(B ∩ C | Y = y) f_Y(y) dy
         = ∫₀^1 P( max{Y₁, . . . , Y_α} ≤ y ≤ min{Y_{α+1}, . . . , Y_{α+β}} ) dy
         = ∫₀^1 P( max{Y₁, . . . , Y_α} ≤ y ) P( y ≤ min{Y_{α+1}, . . . , Y_{α+β}} ) dy
         = ∫₀^1 y^α (1 − y)^β dy.

We also have

P(A | B ∩ C) = 1/(α! β!),

because given the events B and C, all α! possible orderings of Y₁, . . . , Y_α are equally
likely, and all β! possible orderings of Y_{α+1}, . . . , Y_{α+β} are equally likely.

By writing the equation

P(A) = P(B ∩ C) P(A | B ∩ C)

in terms of the preceding relations, we finally obtain

1/(α + β + 1)! = (1/(α! β!)) ∫₀^1 y^α (1 − y)^β dy,

or

∫₀^1 y^α (1 − y)^β dy = α! β!/(α + β + 1)!.

This equation can be written as

B(α + 1, β + 1) = α! β!/(α + β + 1)!,   for all integer α ≥ 0, β ≥ 0.
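For integer parameters the moment formula in part (b) is easy to verify numerically. The sketch below (our own check, not part of the text) compares the product formula for E[X^m] against direct midpoint-rule integration of x^m times the beta PDF, using the factorial expression for B(α, β).

```python
import math

def beta_moment_formula(alpha, beta, m):
    # E[X^m] = alpha (alpha+1) ... (alpha+m-1)
    #          / ((alpha+beta)(alpha+beta+1) ... (alpha+beta+m-1))
    num = den = 1.0
    for k in range(m):
        num *= alpha + k
        den *= alpha + beta + k
    return num / den

def beta_moment_numeric(alpha, beta, m, steps=100_000):
    # B(alpha, beta) = (alpha-1)! (beta-1)! / (alpha+beta-1)!  (integer case)
    b = (math.factorial(alpha - 1) * math.factorial(beta - 1)
         / math.factorial(alpha + beta - 1))
    dx = 1.0 / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * dx  # midpoint rule on [0, 1]
        total += x ** (m + alpha - 1) * (1.0 - x) ** (beta - 1) * dx
    return total / b

print(beta_moment_formula(2, 3, 1), beta_moment_numeric(2, 3, 1))
```

For example, with α = 2, β = 3, m = 1 both computations give E[X] = 2/5.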

Problem 31.* Estimating an expected value by simulation. Let f_X(x) be a
PDF such that for some nonnegative scalars a, b, and c, we have f_X(x) = 0 for all x
outside the interval [a, b], and x f_X(x) ≤ c for all x. Let Y_i, i = 1, . . . , n, be independent
random variables with values generated as follows: a point (V_i, W_i) is chosen at random
(according to a uniform PDF) within the rectangle whose corners are (a, 0), (b, 0), (a, c),
and (b, c), and if W_i ≤ V_i f_X(V_i), the value of Y_i is set to 1, and otherwise it is set to 0.
Consider the random variable

Z = (Y₁ + ··· + Yₙ)/n.

Show that

E[Z] = E[X]/(c(b − a)),

and

var(Z) ≤ 1/(4n).

In particular, we have var(Z) → 0 as n → ∞.

Solution. We have

P(Y_i = 1) = P( W_i ≤ V_i f_X(V_i) )
           = ∫_a^b ∫₀^{v f_X(v)} (1/(c(b − a))) dw dv
           = (1/(c(b − a))) ∫_a^b v f_X(v) dv
           = E[X]/(c(b − a)).

The random variable Z has mean P(Y_i = 1) and variance

var(Z) = P(Y_i = 1)(1 − P(Y_i = 1))/n.

Since 0 ≤ (1 − 2p)² = 1 − 4p(1 − p), we have p(1 − p) ≤ 1/4 for any p in [0, 1], so it
follows that var(Z) ≤ 1/(4n).
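A concrete instance of this hit-or-miss scheme (our choice of example, not the book's): f_X uniform on [0, 1], so a = 0, b = 1, and c = 1 bounds x f_X(x); then c(b − a)Z estimates E[X] = 1/2.

```python
import random

rng = random.Random(4)
a, b, c = 0.0, 1.0, 1.0

def f_X(x):
    return 1.0                 # uniform PDF on [0, 1]

n = 200_000
hits = 0
for _ in range(n):
    v = rng.uniform(a, b)      # V_i: horizontal coordinate
    w = rng.uniform(0.0, c)    # W_i: vertical coordinate
    if w <= v * f_X(v):        # Y_i = 1 when the point falls under x f_X(x)
        hits += 1
z = hits / n
estimate = c * (b - a) * z     # rescale: E[Z] = E[X] / (c (b - a))
print(estimate)  # near E[X] = 0.5
```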
Problem 32.* Let X and Y be continuous random variables with joint PDF f_{X,Y}.
Suppose that for any subsets A and B of the real line, the events {X ∈ A} and {Y ∈ B}
are independent. Show that the random variables X and Y are independent.

Solution. For any two real numbers x and y, using the independence of the events
{X ≤ x} and {Y ≤ y}, we have

F_{X,Y}(x, y) = P(X ≤ x, Y ≤ y) = P(X ≤ x)P(Y ≤ y) = F_X(x)F_Y(y).

Taking derivatives of both sides, we obtain

f_{X,Y}(x, y) = (∂²F_{X,Y}/∂x∂y)(x, y) = (∂F_X/∂x)(x) (∂F_Y/∂y)(y) = f_X(x) f_Y(y),

which establishes that X and Y are independent.


Problem 33.* The sum of a random number of random variables. You visit
a random number N of stores and in the ith store, you spend a random amount of
money X_i. Let

T = X₁ + X₂ + ··· + X_N

be the total amount of money that you spend. We assume that N is a positive integer
random variable with a given PMF, and that the X_i are random variables with the
same mean E[X] and variance var(X). Furthermore, we assume that N and all the X_i
are independent. Show that

E[T] = E[X]E[N],   and   var(T) = var(X)E[N] + (E[X])² var(N).

Solution. We have for all i,

E[T | N = i] = iE[X],

since conditional on N = i, you will visit exactly i stores, and you will spend an
expected amount of money E[X] in each.
We now apply the total expectation theorem. We have

E[T] = Σ_{i=1}^∞ P(N = i) E[T | N = i]
     = Σ_{i=1}^∞ P(N = i) iE[X]
     = E[X] Σ_{i=1}^∞ iP(N = i)
     = E[X]E[N].

Similarly, using also the independence of the X_i, which implies that E[X_i X_j] = (E[X])²
if i ≠ j, the second moment of T is calculated as

E[T²] = Σ_{i=1}^∞ P(N = i) E[T² | N = i]
      = Σ_{i=1}^∞ P(N = i) ( iE[X²] + i(i − 1)(E[X])² )
      = E[X²] Σ_{i=1}^∞ iP(N = i) + (E[X])² Σ_{i=1}^∞ i(i − 1)P(N = i)
      = E[X²]E[N] + (E[X])²( E[N²] − E[N] )
      = var(X)E[N] + (E[X])² E[N²].

The variance is then obtained by

var(T) = E[T²] − (E[T])²
       = var(X)E[N] + (E[X])² E[N²] − (E[X])²(E[N])²
       = var(X)E[N] + (E[X])²( E[N²] − (E[N])² ),

so finally

var(T) = var(X)E[N] + (E[X])² var(N).

Note: The formulas for E[T] and var(T) will also be obtained in Chapter 4, using a
more abstract approach.
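The two identities can be verified by simulation; in the sketch below (our example, not the book's) N is geometric on {1, 2, . . .} with parameter q = 0.4 and the X_i are uniform on [0, 1], so E[T] = (1/2)(1/q) = 1.25 and var(T) = (1/12)(1/q) + (1/4)(1 − q)/q² ≈ 1.146.

```python
import random

rng = random.Random(5)
q = 0.4                        # P(N = k) = q (1 - q)**(k - 1), k = 1, 2, ...
trials = 200_000
totals = []
for _ in range(trials):
    n = 1
    while rng.random() >= q:   # geometric number of stores visited
        n += 1
    totals.append(sum(rng.random() for _ in range(n)))  # X_i ~ Uniform[0, 1]

mean_T = sum(totals) / trials
var_T = sum((t - mean_T) ** 2 for t in totals) / trials
print(mean_T, var_T)  # near 1.25 and about 1.146
```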
SECTION 3.6. The Continuous Bayes' Rule
Problem 34. A defective coin minting machine produces coins whose probability of
heads is a random variable P with PDF

f_P(p) = { pe^p,  if p ∈ [0, 1],
         { 0,     otherwise.
A coin produced by this machine is selected and tossed repeatedly, with successive
tosses assumed independent.
(a) Find the probability that a coin toss results in heads.
(b) Given that a coin toss resulted in heads, find the conditional PDF of P.
(c) Given that the first coin toss resulted in heads, find the conditional probability
of heads on the next toss.
Problem 35.* Let X and Y be independent continuous random variables with PDFs
f_X and f_Y, respectively, and let Z = X + Y.
(a) Show that f_{Z|X}(z | x) = f_Y(z − x). Hint: Write an expression for the conditional
CDF of Z given X, and differentiate.
(b) Assume that X and Y are exponentially distributed with parameter λ. Find the
conditional PDF of X, given that Z = z.
(c) Assume that X and Y are normal random variables with mean zero and variances
σ_x² and σ_y², respectively. Find the conditional PDF of X, given that Z = z.
Solution. (a) We have

P(Z ≤ z | X = x) = P(X + Y ≤ z | X = x)
                 = P(x + Y ≤ z | X = x)
                 = P(x + Y ≤ z)
                 = P(Y ≤ z − x),

where the third equality follows from the independence of X and Y. By differentiating
both sides with respect to z, the result follows.
(b) We have, for 0 ≤ x ≤ z,

f_{X|Z}(x | z) = f_{Z|X}(z | x) f_X(x)/f_Z(z) = f_Y(z − x) f_X(x)/f_Z(z) = λe^{−λ(z−x)} λe^{−λx}/f_Z(z) = λ²e^{−λz}/f_Z(z).

Since this is the same for all x, it follows that the conditional distribution of X is
uniform on the interval [0, z], with PDF f_{X|Z}(x | z) = 1/z.
(c) We have

f_{X|Z}(x | z) = f_Y(z − x) f_X(x)/f_Z(z).

We focus on the terms in the exponent. By completing the square, we find that the
negative of the exponent is of the form

(z − x)²/(2σ_y²) + x²/(2σ_x²) = ((σ_x² + σ_y²)/(2σ_x²σ_y²)) ( x − zσ_x²/(σ_x² + σ_y²) )² + z²/(2(σ_x² + σ_y²)).

Thus, the conditional PDF of X is of the form

f_{X|Z}(x | z) = c(z) · exp{ −((σ_x² + σ_y²)/(2σ_x²σ_y²)) ( x − zσ_x²/(σ_x² + σ_y²) )² },

where c(z) does not depend on x and plays the role of a normalizing constant. We
recognize this as a normal distribution with mean

E[X | Z = z] = σ_x² z/(σ_x² + σ_y²),

and variance

var(X | Z = z) = σ_x²σ_y²/(σ_x² + σ_y²).
