PROBLEMS
Problem 2. Find the PDF of e^X in terms of the PDF of X. Specialize the answer
to the case where X is uniformly distributed between 0 and 1.
Problem 3. Find the PDFs of |X|^{1/3} and |X|^{1/4} in terms of the PDF of X.
Problem 4. The metro train arrives at the station near your home every quarter
hour starting at 6:00 a.m. You walk into the station every morning between 7:10 and
7:30 a.m., with the time in this interval being a random variable with given PDF (cf.
Example 3.14, in Chapter 3). Let X be the elapsed time, in minutes, between 7:10 and
the time of your arrival. Let Y be the time that you have to wait until you board a
train. Calculate the CDF of Y in terms of the CDF of X and differentiate to obtain a
formula for the PDF of Y.
Problem 7. Two points are chosen randomly and independently from the interval
[0, 1] according to a uniform distribution. Show that the expected distance between
the two points is 1/3.
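A quick Monte Carlo check of the 1/3 claim is easy to run; the sketch below is not part of the text, and the trial count and seed are arbitrary choices.

```python
import random

# Monte Carlo check: for U1, U2 independent uniform on [0, 1],
# E[|U1 - U2|] should be close to 1/3.
def mean_distance(trials=200_000, seed=0):
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        total += abs(rng.random() - rng.random())
    return total / trials

print(mean_distance())  # close to 1/3
```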
The independent random variables X and Y have the PMFs

    p_X(x) = 1/3 if x = 1, 2, 3, and 0 otherwise;
    p_Y(y) = 1/2 if y = 0, 1/3 if y = 1, 1/6 if y = 2, and 0 otherwise.
Find the PMF of Z = X + Y, using the convolution formula.
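The convolution formula p_Z(z) = Σ_x p_X(x) p_Y(z − x) can be checked mechanically; the sketch below (not part of the text) applies it to the two PMFs above using exact rational arithmetic.

```python
from fractions import Fraction as F

# Discrete convolution: p_Z(z) = sum over x of p_X(x) * p_Y(z - x).
p_X = {1: F(1, 3), 2: F(1, 3), 3: F(1, 3)}
p_Y = {0: F(1, 2), 1: F(1, 3), 2: F(1, 6)}

p_Z = {}
for x, px in p_X.items():
    for y, py in p_Y.items():
        p_Z[x + y] = p_Z.get(x + y, F(0)) + px * py

for z in sorted(p_Z):
    print(z, p_Z[z])
```

The resulting probabilities sum to 1, as they must.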
Problems 247
Problem 11. Use the convolution formula to establish that the sum of two independent
Poisson random variables with parameters λ and μ, respectively, is Poisson with
parameter λ + μ.
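A numerical sketch of this fact (not a proof, and not part of the text): convolve two Poisson PMFs for arbitrary test parameters and compare with the Poisson PMF of the summed parameter.

```python
import math

# Convolve Poisson(lam1) with Poisson(lam2) and compare with
# Poisson(lam1 + lam2); lam1, lam2 are arbitrary test values.
def poisson_pmf(lam, k):
    return math.exp(-lam) * lam**k / math.factorial(k)

lam1, lam2 = 2.0, 3.5
for z in range(10):
    conv = sum(poisson_pmf(lam1, k) * poisson_pmf(lam2, z - k)
               for k in range(z + 1))
    direct = poisson_pmf(lam1 + lam2, z)
    assert abs(conv - direct) < 1e-12
print("convolution matches Poisson(lam1 + lam2)")
```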
Problem 12. The random variables X, Y, and Z are independent and uniformly
distributed between zero and one. Find the PDF of X + Y + Z.
Problem 13. Consider a PDF that is positive only within an interval [a, b] and is
symmetric around the mean (a + b)/2. Let X and Y be independent random variables
that both have this PDF. Suppose that you have calculated the PDF of X + Y. How
can you easily obtain the PDF of X - Y?
Problem 14. Competing exponentials. The lifetimes of two light bulbs are
modeled as independent and exponential random variables X and Y, with parameters
λ and μ, respectively. The time at which a light bulb first burns out is

    Z = min{X, Y}.
    f_Y(y) = 1/(π(1 + y²)),   −∞ < y < ∞.

(Y is called a Cauchy random variable.)
(b) Let Y be a Cauchy random variable. Find the PDF of the random variable X,
which is equal to the angle between −π/2 and π/2 whose tangent is Y.
Solution. (a) We first note that Y is a continuous, strictly monotonically increasing
function of X, which takes values between −∞ and ∞ as X ranges over the interval
[−1/2, 1/2]. Therefore, we have for all scalars y,

    F_Y(y) = P(tan(πX) ≤ y) = P(X ≤ (1/π) arctan y) = 1/2 + (1/π) arctan y,

and differentiation yields the PDF given above.
(b) We first compute the CDF of X and then differentiate to obtain its PDF. We have,
for −π/2 ≤ x ≤ π/2,

    P(X ≤ x) = P(arctan Y ≤ x)
             = P(Y ≤ tan x)
             = (1/π) ∫_{−∞}^{tan x} dy/(1 + y²)
             = (1/π) [arctan y]_{−∞}^{tan x}
             = (1/π)(x + π/2).

For x < −π/2, we have P(X ≤ x) = 0, and for π/2 < x, we have P(X ≤ x) = 1.
Taking the derivative of the CDF P(X ≤ x), we find that X is uniformly distributed
on the interval [−π/2, π/2].
Note: An interesting property of the Cauchy random variable is that it satisfies

    ∫_0^∞ y · 1/(π(1 + y²)) dy = ∞,

as can be easily verified. As a result, the Cauchy random variable does not have a well-defined
expected value, despite the symmetry of its PDF around 0; see the footnote in
Section 3.1 on the definition of the expected value of a continuous random variable.
Problem 16.* The polar coordinates of two independent normal random
variables. Let X and Y be independent standard normal random variables. The pair
(X, Y) can be described in polar coordinates in terms of random variables R ≥ 0 and
Θ ∈ [0, 2π], so that

    X = R cos Θ,   Y = R sin Θ.

(a) Show that Θ is uniformly distributed in [0, 2π], that R has the PDF

    f_R(r) = r e^{−r²/2},   r ≥ 0,

and that R and Θ are independent. (The random variable R is said to have a
Rayleigh distribution.)
(b) Show that R² has an exponential distribution with parameter 1/2.
Note: Using the results in this problem, we see that samples of a normal random
variable can be generated using samples of independent uniform and exponential random
variables.
Solution. (a) The joint PDF of X and Y is

    f_{X,Y}(x, y) = (1/2π) e^{−(x²+y²)/2}.

We first find the joint CDF of R and Θ. Fix some r > 0 and some θ ∈ [0, 2π], and
let A be the set of points (x, y) whose polar coordinates (r̃, θ̃) satisfy 0 ≤ r̃ ≤ r and
0 ≤ θ̃ ≤ θ; note that the set A is a sector of a circle of radius r, with angle θ. We have

    F_{R,Θ}(r, θ) = P((X, Y) ∈ A) = ∫∫_A (1/2π) e^{−(x²+y²)/2} dx dy = (θ/2π)(1 − e^{−r²/2}),

where the last equality follows by transforming to polar coordinates.
Hence,

    f_{R,Θ}(r, θ) = ∂²F_{R,Θ}/∂r∂θ (r, θ) = (r/2π) e^{−r²/2},   r ≥ 0, θ ∈ [0, 2π].
Thus,

    f_R(r) = ∫_0^{2π} f_{R,Θ}(r, θ) dθ = r e^{−r²/2},   r ≥ 0.
Furthermore,

    f_{Θ|R}(θ | r) = f_{R,Θ}(r, θ)/f_R(r) = 1/(2π),   θ ∈ [0, 2π].
Since the conditional PDF f_{Θ|R} of Θ is unaffected by the value of the conditioning
variable R, it follows that it is also equal to the unconditional PDF f_Θ. In particular,
f_{R,Θ}(r, θ) = f_R(r) f_Θ(θ), so that R and Θ are independent.
(b) Let t ≥ 0. We have

    P(R² ≥ t) = P(R ≥ √t) = ∫_{√t}^∞ r e^{−r²/2} dr = ∫_{t/2}^∞ e^{−u} du = e^{−t/2},

using the substitution u = r²/2. Thus,

    f_{R²}(t) = (1/2) e^{−t/2},   t ≥ 0.
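The note to this problem says normal samples can be produced from independent uniform and exponential samples. A minimal sketch of that recipe (not from the text; sample size and seed are arbitrary): draw R² exponential with parameter 1/2, Θ uniform on [0, 2π], and set X = R cos Θ.

```python
import math, random

# R^2 ~ exponential(1/2) and Theta ~ uniform[0, 2*pi] yield a
# standard normal X = R * cos(Theta).
def normal_sample(rng):
    r = math.sqrt(rng.expovariate(0.5))   # R^2 exponential with parameter 1/2
    theta = rng.uniform(0.0, 2.0 * math.pi)
    return r * math.cos(theta)

rng = random.Random(1)
xs = [normal_sample(rng) for _ in range(100_000)]
mean = sum(xs) / len(xs)
var = sum(x * x for x in xs) / len(xs) - mean**2
print(mean, var)  # approximately 0 and 1
```

This is exactly the Box-Muller construction viewed through the present problem.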
E[W] = E[X] = E[Y] = E[Z] = 0,
and let

    Y = a + bX + cX².

Find the correlation coefficient ρ(X, Y).
250 Further Topics on Random Variables Chap. 4
Problem 20.* Schwarz inequality. Show that for any random variables X and Y,
we have

    (E[XY])² ≤ E[X²] E[Y²].

Solution. Assuming E[Y²] ≠ 0, we have

    0 ≤ E[(X − (E[XY]/E[Y²]) Y)²]
      = E[X² − 2 (E[XY]/E[Y²]) XY + (E[XY]/E[Y²])² Y²]
      = E[X²] − 2 (E[XY]/E[Y²]) E[XY] + (E[XY]/E[Y²])² E[Y²]
      = E[X²] − (E[XY])²/E[Y²],

which yields the desired inequality. (If E[Y²] = 0, then Y = 0 with probability 1, so
E[XY] = 0 and the inequality holds trivially.)
Problem 21.* Correlation coefficient. Consider the correlation coefficient

    ρ(X, Y) = cov(X, Y)/√(var(X) var(Y))

of two random variables X and Y that have positive variances. Show that:
(a) |ρ(X, Y)| ≤ 1. Hint: Use the Schwarz inequality from the preceding problem.
(b) If Y − E[Y] is a positive (or negative) multiple of X − E[X], then ρ(X, Y) = 1
[or ρ(X, Y) = −1, respectively].
(c) If ρ(X, Y) = 1 [or ρ(X, Y) = −1], then, with probability 1, Y − E[Y] is a positive
(or negative, respectively) multiple of X − E[X].
Solution. (a) Let X̃ = X − E[X] and Ỹ = Y − E[Y]. Using the Schwarz inequality,
we obtain

    (ρ(X, Y))² = (E[X̃Ỹ])² / (E[X̃²] E[Ỹ²]) ≤ 1,

and hence |ρ(X, Y)| ≤ 1.
(b) If Ỹ = aX̃, then

    ρ(X, Y) = E[X̃ · aX̃] / √(E[X̃²] E[(aX̃)²]) = a/|a|,

which is 1 if a > 0 and −1 if a < 0.
(c) If |ρ(X, Y)| = 1, then the Schwarz inequality holds with equality, so the second
moment of the random variable

    X̃ − (E[X̃Ỹ]/E[Ỹ²]) Ỹ

is equal to zero. It follows that, with probability 1,

    X̃ = (E[X̃Ỹ]/E[Ỹ²]) Ỹ;

the sign of the constant ratio of X̃ and Ỹ is determined by the sign of ρ(X, Y).
Problem 22. Consider a gambler who at each gamble either wins or loses his bet with
probabilities p and 1 − p, independent of earlier gambles. When p > 1/2, a popular
gambling system, known as the Kelly strategy, is to always bet the fraction 2p − 1 of
the current fortune. Compute the expected fortune after n gambles, starting with x
units and employing the Kelly strategy.
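Under the Kelly strategy, each gamble multiplies the fortune by 1 + (2p − 1) with probability p and by 1 − (2p − 1) otherwise, so the expected multiplier per gamble is 1 + (2p − 1)², giving expected fortune x(1 + (2p − 1)²)^n. The sketch below (not part of the text; p, x, n, trial count, and seed are arbitrary test choices) compares this formula against simulation.

```python
import random

# Simulate the Kelly strategy and compare the empirical mean fortune
# with the closed form x * (1 + (2p - 1)**2)**n.
def simulate(p, x, n, trials=100_000, seed=2):
    rng = random.Random(seed)
    f = 2 * p - 1                       # bet fraction
    total = 0.0
    for _ in range(trials):
        fortune = x
        for _ in range(n):
            fortune *= (1 + f) if rng.random() < p else (1 - f)
        total += fortune
    return total / trials

p, x, n = 0.6, 1.0, 5
exact = x * (1 + (2 * p - 1) ** 2) ** n
print(exact, simulate(p, x, n))  # the two values should be close
```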
Problem 23. Pat and Nat are dating, and all of their dates are scheduled to start at
9 p.m. Nat always arrives promptly at 9 p.m. Pat is highly disorganized and arrives at
a time that is uniformly distributed between 8 p.m. and 10 p.m. Let X be the time in
hours between 8 p.m. and the time when Pat arrives. If Pat arrives before 9 p.m., their
(a) What is the expected number of hours Nat waits for Pat to arrive?
(c) What is the expected number of dates they will have before breaking up?
Problem 24. A retired professor comes to the office at a time which is uniformly
distributed between 9 a.m. and 1 p.m., performs a single task, and leaves when the task
is completed. The duration of the task is exponentially distributed with parameter
λ(y) = 1/(5 − y), where y is the length of the time interval between 9 a.m. and the
time of his arrival.
(a) What is the expected amount of time that the professor devotes to the task?
(b) What is the expected time at which the task is completed?
(c) The professor has a Ph.D. student who on a given day comes to see him at a
time that is uniformly distributed between 9 a.m. and 5 p.m. If the student does
not find the professor, he leaves and does not return. If he finds the professor, he
spends an amount of time that is uniformly distributed between 0 and 1 hour.
The professor will spend the same total amount of time on his task regardless of
whether he is interrupted by the student. What is the expected amount of time
that the professor will spend with the student and what is the expected time at
which he will leave his office?
Problem 25.* Show that for a discrete or continuous random variable X, and any
function g(Y) of another random variable Y, we have E[X g(Y) | Y] = g(Y) E[X | Y].
Solution. Assume that X is continuous. From a version of the expected value rule for
conditional expectations given in Chapter 3, we have

    E[X g(Y) | Y = y] = ∫_{−∞}^{∞} x g(y) f_{X|Y}(x | y) dx
                      = g(y) ∫_{−∞}^{∞} x f_{X|Y}(x | y) dx
                      = g(y) E[X | Y = y].

This shows that the realized values E[X g(Y) | Y = y] and g(y) E[X | Y = y] of the
random variables E[X g(Y) | Y] and g(Y) E[X | Y] are always equal. Hence these two
random variables are equal. The proof is similar if X is discrete.
Problem 26.* Let X and Y be independent random variables. Use the law of total
variance to show that

    var(XY) = (E[X])² var(Y) + (E[Y])² var(X) + var(X) var(Y).

Solution. Let Z = XY. We have

    E[Z | X] = E[XY | X] = X E[Y],

so that

    var(E[Z | X]) = var(X E[Y]) = (E[Y])² var(X).

Furthermore,

    var(Z | X) = X² var(Y),

so that

    E[var(Z | X)] = E[X²] var(Y) = (E[X])² var(Y) + var(X) var(Y).

By the law of total variance, var(Z) is the sum of the last two quantities, which is the
desired formula.
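The identity var(XY) = (E[X])² var(Y) + (E[Y])² var(X) + var(X) var(Y) can be verified exactly on small discrete distributions; the sketch below (not part of the text; the PMFs are arbitrary test data) does so with rational arithmetic.

```python
import itertools
from fractions import Fraction as F

# Exact check of the variance-of-a-product identity for independent X, Y.
p_X = {0: F(1, 4), 1: F(1, 2), 3: F(1, 4)}
p_Y = {-1: F(1, 3), 2: F(2, 3)}

def mean(p):
    return sum(v * q for v, q in p.items())

def var(p):
    m = mean(p)
    return sum((v - m) ** 2 * q for v, q in p.items())

# Distribution of Z = XY under independence.
p_Z = {}
for (x, px), (y, py) in itertools.product(p_X.items(), p_Y.items()):
    p_Z[x * y] = p_Z.get(x * y, F(0)) + px * py

lhs = var(p_Z)
rhs = mean(p_X) ** 2 * var(p_Y) + mean(p_Y) ** 2 * var(p_X) + var(p_X) * var(p_Y)
assert lhs == rhs
print(lhs)
```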
Problem 27.* We toss n times a biased coin whose probability of heads, denoted by
q, is the value of a random variable Q with given mean μ and positive variance σ². Let
X_i be a Bernoulli random variable that models the outcome of the ith toss (i.e., X_i = 1
if the ith toss is a head). We assume that X₁, ..., X_n are conditionally independent,
given Q = q. Let X be the number of heads obtained in the n tosses.
(a) Use the law of iterated expectations to find E[X_i] and E[X].
(b) Find cov(X_i, X_j). Are X₁, ..., X_n independent?
(c) Use the law of total variance to find var(X). Verify your answer using the
covariance result of part (b).
Solution. (a) We have, from the law of iterated expectations and the fact E[X_i | Q] = Q,

    E[X_i] = E[E[X_i | Q]] = E[Q] = μ,

and

    E[X] = E[X₁] + ··· + E[X_n] = nμ.

Thus, since X_i is Bernoulli,

    var(X_i) = E[X_i²] − (E[X_i])² = E[X_i] − (E[X_i])² = μ − μ².
(c) Using the law of total variance, and the conditional independence of X₁, ..., X_n,
we have

    var(X) = E[var(X | Q)] + var(E[X | Q]) = E[nQ(1 − Q)] + var(nQ)
           = n(μ − μ² − σ²) + n²σ² = n(μ − μ²) + n(n − 1)σ².

To verify the result using the covariance formulas of part (b), we write

    var(X) = var(X₁ + ··· + X_n)
           = Σ_{i=1}^{n} var(X_i) + Σ_{(i,j): i≠j} cov(X_i, X_j)
           = n var(X₁) + n(n − 1) cov(X₁, X₂)
           = n(μ − μ²) + n(n − 1)σ².
Problem 28.* The Bivariate Normal PDF. The (zero mean) bivariate normal
PDF is of the form

    f_{X,Y}(x, y) = c e^{−q(x,y)},

where the exponent term q(x, y) is a quadratic function of x and y,

    q(x, y) = (1/(2(1 − ρ²))) (x²/σ_x² − 2ρxy/(σ_x σ_y) + y²/σ_y²),

σ_x and σ_y are positive constants, ρ is a constant that satisfies −1 < ρ < 1, and c is a
normalizing constant.
(a) By completing the square, rewrite q(x, y) in the form (αx − βy)² + γy², for some
constants α, β, and γ.
(d) Show that the conditional PDF of X given that Y = y is normal, and identify
its conditional mean and variance.
(e) Show that the correlation coefficient of X and Y is equal to ρ.
(f) Show that X and Y are independent if and only if they are uncorrelated.
(g) Show that the estimation error E[X | Y] − X is normal with mean zero and
variance (1 − ρ²)σ_x², and is independent of Y.
Solution. (a) We have

    q(x, y) = q₁(x, y) + q₂(y),

where

    q₁(x, y) = (1/(2(1 − ρ²))) (x/σ_x − ρ y/σ_y)²,
    q₂(y) = y²/(2σ_y²).
(b) We have

    f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx = c e^{−q₂(y)} ∫_{−∞}^{∞} e^{−q₁(x,y)} dx.

Evaluating the normal integral over x, we obtain

    f_Y(y) = c σ_x √(1 − ρ²) √(2π) e^{−y²/(2σ_y²)}.

We recognize this as a normal PDF with mean zero and variance σ_y². The result for
the random variable X follows by symmetry.
(c) The normalizing constant for the PDF of Y must be equal to 1/(√(2π) σ_y). It follows
that

    c = 1/(2π σ_x σ_y √(1 − ρ²)).
(d) Since

    f_{X,Y}(x, y) = (1/(2π σ_x σ_y √(1 − ρ²))) e^{−q₁(x,y)} e^{−q₂(y)},

and

    f_Y(y) = (1/(√(2π) σ_y)) e^{−y²/(2σ_y²)},

we obtain

    f_{X|Y}(x | y) = f_{X,Y}(x, y)/f_Y(y)
                   = (1/(√(2π) σ_x √(1 − ρ²))) exp{−(x − ρ(σ_x/σ_y) y)² / (2σ_x²(1 − ρ²))}.
For any fixed y, we recognize this as a normal PDF with mean ρ(σ_x/σ_y) y and variance
σ_x²(1 − ρ²). In particular, E[X | Y = y] = ρ(σ_x/σ_y) y, and E[X | Y] = ρ(σ_x/σ_y) Y.
(e) Using the expected value rule and the law of iterated expectations, we have

    E[XY] = E[E[XY | Y]]
          = E[Y E[X | Y]]
          = E[Y · ρ(σ_x/σ_y) Y]
          = ρ (σ_x/σ_y) E[Y²]
          = ρ σ_x σ_y.

Hence,

    ρ(X, Y) = cov(X, Y)/(σ_x σ_y) = E[XY]/(σ_x σ_y) = ρ.
(f) If X and Y are uncorrelated, then ρ = 0, and the joint PDF satisfies f_{X,Y}(x, y) =
f_X(x) f_Y(y), so that X and Y are independent. Conversely, if X and Y are independent,
then they are automatically uncorrelated.
(g) From part (d), we know that conditioned on Y = y, X is normal with mean
E[X | Y = y] and variance (1 − ρ²)σ_x². Therefore, conditioned on Y = y, the estimation
error X̃ = E[X | Y = y] − X is normal with mean zero and variance (1 − ρ²)σ_x².
Since this conditional PDF of X̃ does not depend on the value y of Y, it follows that
X̃ is independent of Y, and the above conditional PDF is also the unconditional PDF
of X̃.
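A standard way to construct such a pair in simulation is X = ρ(σ_x/σ_y) Y + √(1 − ρ²) σ_x Z with independent normals Y and Z; the sketch below (not part of the text; parameter values, sample size, and seed are arbitrary) then checks that the correlation is near ρ and that the estimation error is uncorrelated with Y.

```python
import math, random

# Build (X, Y) with bivariate normal law and test parts (e) and (g) numerically.
rho, sx, sy = 0.7, 2.0, 1.5
rng = random.Random(3)
n = 100_000
sum_xy = sum_err_y = 0.0
for _ in range(n):
    y = rng.gauss(0.0, sy)
    z = rng.gauss(0.0, 1.0)
    x = rho * (sx / sy) * y + math.sqrt(1 - rho**2) * sx * z
    err = rho * (sx / sy) * y - x        # estimation error E[X | Y] - X
    sum_xy += x * y
    sum_err_y += err * y
print(sum_xy / n / (sx * sy))   # approximately rho
print(sum_err_y / n)            # approximately 0: error uncorrelated with Y
```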
Find the transform associated with X and use it to obtain the first three moments,
E[X], E[X²], E[X³].
Problem 30. Calculate E[X³] and E[X⁴] for a standard normal random variable X.
Problem 31. Find the third, fourth, and fifth moments of an exponential random
variable with parameter λ.
Problem 32. A nonnegative integer-valued random variable X has one of the following
two expressions as its transform:

    1. M(s) = e^{2(e^{e^s − 1} − 1)},
    2. M(s) = e^{2(e^{e^s} − 1)}.

(a) Explain why one of the two cannot possibly be the transform.
(b) Use the true transform to find P(X = 0).
Problem 33. Find the PDF of the continuous random variable X associated with
the transform

    M(s) = (1/3) · 2/(2 − s) + (2/3) · 3/(3 − s).
Problem 34. A soccer team has three designated players who take turns striking
penalty shots. The ith player has probability of success p_i, independent of the successes
of the other players. Let X be the number of successful penalty shots after each player
has had one turn. Use convolution to calculate the PMF of X. Confirm your answer
by first calculating the transform associated with X and then obtaining the PMF from
the transform.
Problem 35. Let X be a random variable that takes nonnegative integer values, and
is associated with a transform of the form

    M_X(s) = c · (3 + 4e^{2s} + 2e^{3s})/(3 − e^s),

where c is some scalar. Find E[X], p_X(1), and E[X | X ≠ 0].
Problem 37. A pizza parlor serves n different types of pizza, and is visited by a
number K of customers in a given period of time, where K is a nonnegative integer
random variable with a known associated transform M_K(s) = E[e^{sK}]. Each customer
orders a single pizza, with all types of pizza being equally likely, independent of the
number of other customers and the types of pizza they order. Give a formula, in terms
of M_K(·), for the expected number of different types of pizzas ordered.
Problem 38.* Let X be a discrete random variable taking nonnegative integer values.
Let M(s) be the transform associated with X.
(a) Show that

    P(X = 0) = lim_{s→−∞} M(s).
Solution. (a) We have

    M(s) = Σ_{k=0}^{∞} P(X = k) e^{ks}.

As s → −∞, all the terms e^{ks} with k > 0 tend to 0, so we obtain lim_{s→−∞} M(s) =
P(X = 0).
(b) In the case of the binomial, we have from the transform tables

    P(X = k) = lim_{s→−∞} e^{−sk} M(s).
(b) Find the transform associated with a continuous random variable X that is uniformly
distributed in the range [a, b].
Solution. (a) The PMF of X is

    p_X(k) = 1/(b − a + 1), if k = a, a + 1, ..., b,
             0, otherwise.
The transform is

    M(s) = Σ_{k=−∞}^{∞} e^{sk} P(X = k)
         = (1/(b − a + 1)) Σ_{k=a}^{b} e^{sk}
         = (e^{sa}/(b − a + 1)) Σ_{k=0}^{b−a} e^{sk}
         = (e^{sa}/(b − a + 1)) · (e^{s(b−a+1)} − 1)/(e^s − 1).
(b) We have

    M(s) = E[e^{sX}] = ∫_a^b (1/(b − a)) e^{sx} dx = (e^{sb} − e^{sa})/(s(b − a)).
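Both transform formulas are easy to sanity-check numerically; the sketch below (not part of the text; a, b, the s values, and the grid size are arbitrary test choices) compares each closed form against a direct sum or a midpoint Riemann sum.

```python
import math

# Check the discrete and continuous uniform transforms at a few s values.
a, b = 2, 6
for s in (-1.0, -0.3, 0.5):
    # Discrete uniform on {a, ..., b}: direct sum vs closed form.
    direct = sum(math.exp(s * k) for k in range(a, b + 1)) / (b - a + 1)
    closed = math.exp(s * a) * (math.exp(s * (b - a + 1)) - 1) / (
        (b - a + 1) * (math.exp(s) - 1))
    assert abs(direct - closed) < 1e-12
    # Continuous uniform on [a, b]: closed form vs midpoint Riemann sum.
    integral = (math.exp(s * b) - math.exp(s * a)) / (s * (b - a))
    m = 100_000
    riemann = sum(math.exp(s * (a + (b - a) * (i + 0.5) / m))
                  for i in range(m)) / m
    assert abs(integral - riemann) < 1e-6
print("transform formulas check out")
```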
Problem 40.* Suppose that the transform associated with a discrete random variable
X has the form

    M(s) = A(e^s)/B(e^s),

where A(t) and B(t) are polynomials of the generic variable t. Assume that A(t) and
B(t) have no common roots and that the degree of A(t) is smaller than the degree of
B(t). Assume also that B(t) has distinct, real, and nonzero roots that have absolute
value greater than 1. Then it can be seen that M(s) can be written in the form

    M(s) = Σ_{i=1}^{m} a_i/(1 − r_i e^s),

where 1/r₁, ..., 1/r_m are the roots of B(t) and the a_i are constants that are equal to

    a_i = lim_{e^s → 1/r_i} (1 − r_i e^s) M(s),   i = 1, ..., m.
(a) Show that the PMF of X has the form

    P(X = k) = Σ_{i=1}^{m} a_i r_i^k, if k = 0, 1, ...,
               0, otherwise.

Note: For large k, the PMF of X can be approximated by a_l r_l^k, where l is the
index corresponding to the largest |r_i| (assuming l is unique).
(b) Extend the result of part (a) to the case where M(s) = e^{bs} A(e^s)/B(e^s) and b is
an integer.
Solution. (a) We have, for all s such that |r_i| e^s < 1 for all i,

    1/(1 − r_i e^s) = Σ_{k=0}^{∞} r_i^k e^{ks}.

Therefore,

    M(s) = Σ_{k=0}^{∞} (Σ_{i=1}^{m} a_i r_i^k) e^{ks},

so that

    P(X = k) = Σ_{i=1}^{m} a_i r_i^k

for k ≥ 0, and P(X = k) = 0 for k < 0. Note that if the coefficients a_i are nonnegative,
this PMF is a mixture of geometric PMFs.
(b) In this case, M(s) corresponds to the translation by b of a random variable whose
transform is A(e^s)/B(e^s) (cf. Example 4.25), so we have

    P(X = k) = Σ_{i=1}^{m} a_i r_i^{k−b}, for k = b, b + 1, ...,

and P(X = k) = 0 otherwise.
Problem 42. Construct an example to show that the sum of a random number of
independent normal random variables is not normal (even though a fixed sum is).
Problem 43. A motorist goes through 4 lights, each of which is found to be red with
probability 1/2. The waiting times at each light are modeled as independent normal
random variables with mean 1 minute and standard deviation 1/2 minute. Let X be
the total waiting time at the red lights.
(a) Use the total probability theorem to find the PDF and the transform associated
with X, and the probability that X exceeds 4 minutes. Is X normal?
(b) Find the transform associated with X by viewing X as a sum of a random number
of random variables.
Problem 44. Consider the calculation of the mean and variance of a sum

    Y = X₁ + ··· + X_N,

(a) Derive formulas for E[N] and var(N) in terms of E[M], var(M), E[K], var(K).
(b) Derive formulas for E[Y] and var(Y) in terms of E[M], var(M), E[K], var(K),
E[X], var(X).
Problem 45.* Use transforms to show that the sum of a Poisson-distributed number
of independent, identically distributed Bernoulli random variables is Poisson.
Let N be a Poisson random variable with parameter λ, let X₁, X₂, ... be independent
Bernoulli random variables with parameter p, and let L = X₁ + ··· + X_N
be the corresponding sum. The transform associated with L is found by starting with
the transform associated with N, which is

    M_N(s) = e^{λ(e^s − 1)},

and replacing each occurrence of e^s by the transform associated with each X_i,

    M_X(s) = 1 − p + p e^s.

We obtain

    M_L(s) = e^{λ(1 − p + p e^s − 1)} = e^{λp(e^s − 1)}.

This is the transform associated with a Poisson random variable with parameter λp.
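The same conclusion can be checked directly on the PMFs: summing P(N = n) times the binomial probability of k successes over n should reproduce the Poisson(λp) PMF. The sketch below is not part of the text; λ and p are arbitrary test values, and the sum over n is truncated at a point where the Poisson tail is negligible.

```python
import math

# P(L = k) = sum over n of P(N = n) * C(n, k) p^k (1-p)^(n-k),
# which should equal the Poisson(lam * p) PMF.
def poisson_pmf(lam, k):
    # log-space evaluation avoids overflow in factorial for larger k
    return math.exp(-lam + k * math.log(lam) - math.lgamma(k + 1))

def binom_pmf(n, p, k):
    return math.comb(n, k) * p**k * (1 - p) ** (n - k)

lam, p = 4.0, 0.3
for k in range(8):
    total = sum(poisson_pmf(lam, n) * binom_pmf(n, p, k)
                for n in range(k, 100))
    assert abs(total - poisson_pmf(lam * p, k)) < 1e-12
print("sum is Poisson with parameter lam * p")
```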