PROBLEMS
Problem 1. Let X be uniformly distributed in the unit interval [0, 1]. Consider the
random variable Y = g(X), where

g(x) = { 2, if x ≤ 1/3,
         1, if x > 1/3.
Find the expected value of Y by first deriving its PMF. Verify the result using the
expected value rule.
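A Monte Carlo sketch (not part of the original problem) can confirm the answer. Reading the garbled cases as g(x) = 2 for x ≤ 1/3 and g(x) = 1 for x > 1/3 (an assumption), the PMF of Y is P(Y = 2) = 1/3 and P(Y = 1) = 2/3, so E[Y] = 2·(1/3) + 1·(2/3) = 4/3.

```python
import random

random.seed(0)

# g as read from the (garbled) problem statement: 2 on [0, 1/3], 1 otherwise.
def g(x):
    return 2 if x <= 1/3 else 1

# Estimate E[Y] = E[g(X)] with X uniform on [0, 1]; exact value is 4/3.
n = 200_000
estimate = sum(g(random.random()) for _ in range(n)) / n
```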
Problem 2. The PDF of the random variable X is of the form

fX(x) = (λ/2) e^{−λ|x|},

where λ is a positive scalar. Verify that fX satisfies the normalization condition, and
evaluate the mean and variance of X.
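A numeric sketch (with the illustrative choice λ = 2, an assumption) checks that this two-sided exponential PDF integrates to 1, has mean 0 by symmetry, and has variance 2/λ² (here 0.5):

```python
import math

lam = 2.0
f = lambda x: (lam / 2) * math.exp(-lam * abs(x))

# Midpoint rule on [-20, 20]; the tail mass beyond +/-20 is negligible (~e^-40).
n = 200_000
dx = 40 / n
mids = [-20 + (i + 0.5) * dx for i in range(n)]
total = sum(f(x) for x in mids) * dx            # should be ~1
mean = sum(x * f(x) for x in mids) * dx          # should be ~0
second_moment = sum(x * x * f(x) for x in mids) * dx  # should be ~2/lam**2 = 0.5
```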
Problem 3.* Show that the expected value of a discrete or continuous random variable X satisfies

E[X] = ∫₀^∞ P(X > x) dx − ∫₀^∞ P(X < −x) dx.

Solution. In the continuous case, we have

∫₀^∞ P(X > x) dx = ∫₀^∞ ( ∫_x^∞ fX(y) dy ) dx = ∫₀^∞ ( ∫₀^y fX(y) dx ) dy = ∫₀^∞ y fX(y) dy,

where for the second equality we have reversed the order of integration by writing the
set {(x, y) | 0 ≤ x < ∞, x ≤ y < ∞} as {(x, y) | 0 ≤ x ≤ y, 0 ≤ y < ∞}. Similarly, we
can show that

∫₀^∞ P(X < −x) dx = −∫_{−∞}^0 y fX(y) dy.
Problems 185
In the discrete case, the corresponding calculation is

∫₀^∞ P(X > x) dx = Σ_{y>0} ( ∫₀^y pX(y) dx ) = Σ_{y>0} y pX(y).
where g⁺(x) = max{g(x), 0}, and g⁻(x) = max{−g(x), 0}. In particular, for any t ≥ 0,
Problem 6. Calamity Jane goes to the bank to make a withdrawal, and is equally
likely to find 0 or 1 customers ahead of her. The service time of the customer ahead,
if present, is exponentially distributed with parameter λ. What is the CDF of Jane's
waiting time?
Problem 7. Alvin throws darts at a circular target of radius r and is equally likely
to hit any point in the target. Let X be the distance of Alvin's hit from the center.
(a) Find the PDF, the mean, and the variance of X.
(b) The target has an inner circle of radius t. If X ≤ t, Alvin gets a score of S = 1/X.
Otherwise his score is S = 0. Find the CDF of S. Is S a continuous random
variable?
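A simulation sketch for part (a), with r = 1 taken for concreteness: for a uniform hit in the disk, P(X ≤ x) = x²/r², so X can be sampled as r·sqrt(U) with U uniform on (0, 1); the PDF f(x) = 2x/r² then gives E[X] = 2r/3 and var(X) = r²/18.

```python
import math
import random

random.seed(1)

r = 1.0
n = 200_000
# X = r*sqrt(U) has CDF x**2 / r**2 on [0, r], i.e., a uniform hit's distance.
samples = [r * math.sqrt(random.random()) for _ in range(n)]
mean = sum(samples) / n                       # should be near 2r/3
var = sum((s - mean) ** 2 for s in samples) / n  # should be near r**2/18
```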
fX(x) = p fY(x) + (1 − p) fZ(x).
(b) Calculate the CDF of the two-sided exponential random variable that has PDF
given by

fX(x) = { pλ e^{λx},       if x < 0,
          (1 − p)λ e^{−λx}, if x ≥ 0,

where λ > 0 and p ∈ [0, 1].
X is a mixed random variable and its CDF is given, using the total probability theorem, by

FX(x) = P(X ≤ x)
      = p P(Y ≤ x) + (1 − p) P(Z ≤ x)
      = p FY(x) + (1 − p) FZ(x).

Its expected value is defined in a way that conforms to the total expectation theorem:

E[X] = p E[Y] + (1 − p) E[Z].
The taxi stand and the bus stop near Al's home are in the same location. Al goes
there at a given time and if a taxi is waiting (this happens with probability 2/3) he
boards it. Otherwise he waits for a taxi or a bus to come, whichever comes first. The
next taxi will arrive in a time that is uniformly distributed between 0 and 10 minutes,
while the next bus will arrive in exactly 5 minutes. Find the CDF and the expected
value of Al's waiting time.
Solution. Let A be the event that Al will find a taxi waiting or will be picked up by
the bus after 5 minutes. Note that the probability of boarding the next bus, given that
Al has to wait, is the probability that the next taxi arrives after 5 minutes, which is 1/2, so

P(A) = 2/3 + (1/3)(1/2) = 5/6.

Conditioned on A, Al's waiting time Y has PMF

pY(y) = { (2/3)/P(A), if y = 0,     { 12/15, if y = 0,
          (1/6)/P(A), if y = 5,   = { 3/15,  if y = 5.

Conditioned on the complement of A, Al's waiting time Z is the arrival time of the
next taxi, which is uniformly distributed between 0 and 5 minutes.
The CDF is given by FX(x) = P(A)FY(x) + (1 − P(A))FZ(x), from which

FX(x) = { 0,                            if x < 0,
          (5/6)·(12/15) + (1/6)·(x/5),  if 0 ≤ x < 5,
          1,                            if 5 ≤ x.

The expected value of the waiting time is

E[X] = P(A)E[Y] + (1 − P(A))E[Z] = (5/6)·1 + (1/6)·(5/2) = 15/12.
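A simulation sketch of Al's waiting time: with probability 2/3 a taxi is already waiting (waiting time 0); otherwise the next taxi arrives uniformly within 10 minutes and the bus in exactly 5, so Al waits min(taxi, 5). The exact expected value works out to 15/12 = 5/4.

```python
import random

random.seed(2)

def waiting_time():
    if random.random() < 2/3:          # taxi already waiting
        return 0.0
    return min(random.uniform(0, 10), 5.0)  # first of next taxi and the 5-min bus

n = 200_000
avg = sum(waiting_time() for _ in range(n)) / n  # should be near 1.25
```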
188 General Random Variables Chap. 3
where the last equality follows because U is uniform. Thus, X has the desired CDF.
(b) The exponential CDF has the form F(x) = 1 − e^{−λx} for x ≥ 0. Thus, to generate
values of X, we should generate values u ∈ (0, 1) of a uniformly distributed random
variable U, and set X to the value for which 1 − e^{−λx} = u, or x = −ln(1 − u)/λ.
(c) Let again F be the desired CDF. To any u ∈ (0, 1), there corresponds a unique
integer x_u such that F(x_u − 1) < u ≤ F(x_u). This correspondence defines a random
variable X as a function of the random variable U. We then have, for every integer k,

P(X = k) = P( F(k − 1) < U ≤ F(k) ) = F(k) − F(k − 1).
Therefore, the CDF of X is equal to F, as desired.
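A sketch of the inverse-transform recipes in parts (b) and (c): draw u uniform in (0, 1); set x = −ln(1 − u)/λ for an exponential(λ), or take the smallest integer k with F(k) ≥ u for a discrete CDF F. Here λ = 0.5 and a geometric(p) CDF F(k) = 1 − (1−p)^k with p = 0.3 are illustrative choices, not from the text.

```python
import math
import random

random.seed(3)

# Part (b): exponential via x = -ln(1 - u)/lam; sample mean should be near 1/lam.
lam = 0.5
n = 100_000
exp_mean = sum(-math.log(1 - random.random()) / lam for _ in range(n)) / n

# Part (c): smallest k with F(k) >= u, for the geometric CDF F(k) = 1-(1-p)**k.
p = 0.3
def geometric(u):
    k = 1
    while 1 - (1 - p) ** k < u:
        k += 1
    return k

geo_mean = sum(geometric(random.random()) for _ in range(n)) / n  # near 1/p
```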
Problem 12. Let X be a normal random variable with zero mean and standard
deviation σ. Use the normal tables to compute the probabilities of the events {X ≤ kσ}
and {|X| ≤ kσ} for k = 1, 2, 3.
Problem 13. A city's temperature is modeled as a normal random variable with mean
and standard deviation both equal to 10 degrees Celsius. What is the probability that
the temperature at a randomly chosen time will be less than or equal to 59 degrees
Fahrenheit?
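A worked check of the numbers: 59°F converts to (59 − 32)·5/9 = 15°C, so with mean 10 and standard deviation 10 the standardized value is z = (15 − 10)/10 = 0.5, and the answer is Φ(0.5) ≈ 0.6915.

```python
import math

celsius = (59 - 32) * 5 / 9          # 15 degrees Celsius
z = (celsius - 10) / 10              # standardized value, 0.5
phi = 0.5 * (1 + math.erf(z / math.sqrt(2)))  # standard normal CDF at z
```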
Problem 14.* Show that the normal PDF satisfies the normalization property. Hint:
The integral ∫_{−∞}^∞ e^{−x²/2} dx is equal to the square root of

∫_{−∞}^∞ ∫_{−∞}^∞ e^{−x²/2} e^{−y²/2} dx dy.

Solution. We have

(1/2π) ∫_{−∞}^∞ ∫_{−∞}^∞ e^{−x²/2} e^{−y²/2} dx dy = (1/2π) ∫_{−∞}^∞ ∫_{−∞}^∞ e^{−(x²+y²)/2} dx dy
                                                   = (1/2π) ∫₀^{2π} ∫₀^∞ e^{−r²/2} r dr dθ
                                                   = ∫₀^∞ e^{−r²/2} r dr
                                                   = ∫₀^∞ e^{−u} du
                                                   = 1,

where for the second equality, we use a transformation into polar coordinates, and for
the fourth equality, we use the change of variables u = r²/2. Thus, we have

∫_{−∞}^∞ (1/√(2π)) e^{−x²/2} dx = 1,

because the integral is positive. Using the change of variables u = (x − μ)/σ, it follows
that

∫_{−∞}^∞ (1/(√(2π) σ)) e^{−(x−μ)²/(2σ²)} dx = ∫_{−∞}^∞ (1/√(2π)) e^{−u²/2} du = 1.
Problem 16. Consider the following variant of Buffon's needle problem (Example
3.11), which was investigated by Laplace. A needle of length l is dropped on a plane
surface that is partitioned in rectangles by horizontal lines that are a apart and vertical
lines that are b apart. Suppose that the needle's length l satisfies l < a and l < b. What
is the expected number of rectangle sides crossed by the needle? What is the probability
that the needle will cross at least one side of some rectangle?
Let Y1, Y2, . . . be independent random variables drawn from a common and known PDF fY. Let S be the set of all possible values of Yi,
S = {y | fY(y) > 0}. Let X be a random variable with known PDF fX, such that
fX(y) = 0, for all y ∉ S. Consider the random variable

Z = Y1 fX(Y1) / fY(Y1).

Show that

E[Z] = E[X].
Solution. We have
fX(x) = { x/4, if 1 < x ≤ 3,
          0,   otherwise,

fX(x) = { c x^{−2}, if 1 ≤ x ≤ 2,
          0,        otherwise.
Problem 20. An absent-minded professor schedules two student appointments for the
same time. The appointment durations are independent and exponentially distributed
with mean thirty minutes. The first student arrives on time, but the second student
arrives five minutes late. What is the expected value of the time between the arrival
of the first student and the departure of the second student?
Problem 22. We have a stick of unit length, and we consider breaking it in three
pieces using one of the following three methods.
(i) We choose randomly and independently two points on the stick using a uniform
PDF, and we break the stick at these two points.
(ii) We break the stick at a random point chosen by using a uniform PDF, and then
we break the piece that contains the right end of the stick, at a random point
chosen by using a uniform PDF.
(iii) We break the stick at a random point chosen by using a uniform PDF, and then
we break the larger of the two pieces at a random point chosen by using a uniform
PDF.
For each of the methods (i) , (ii), and (iii) , what is the probability that the three pieces
we are left with can form a triangle?
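A simulation sketch for method (i): break the unit stick at two independent uniform points; the three pieces form a triangle exactly when every piece is shorter than 1/2 (each piece must be shorter than the sum of the other two). The exact probability for method (i) is 1/4.

```python
import random

random.seed(4)

def forms_triangle():
    u, v = sorted((random.random(), random.random()))
    pieces = (u, v - u, 1 - v)
    # Triangle inequality for pieces summing to 1: all pieces < 1/2.
    return max(pieces) < 0.5

n = 200_000
prob = sum(forms_triangle() for _ in range(n)) / n  # should be near 0.25
```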
Problem 23. Let the random variables X and Y have a joint PDF which is uniform
over the triangle with vertices at (0, 0), (0, 1), and (1, 0).
(a) Find the joint PDF of X and Y.
(b) Find the marginal PDF of Y.
(c) Find the conditional PDF of X given Y.
(d) Find E[X | Y = y], and use the total expectation theorem to find E[X] in terms
of E[Y].
(e) Use the symmetry of the problem to find the value of E[X].
Problem 24. Let X and Y be two random variables that are uniformly distributed
over the triangle formed by the points (0, 0), (1, 0), and (0, 2) (this is an asymmetric
version of the PDF in the previous problem). Calculate E[X] and E[Y] by following
the same steps as in the previous problem.
Problem 25. The coordinates X and Y of a point are independent zero mean normal
random variables with common variance σ². Given that the point is at a distance of
at least c from the origin, find the conditional joint PDF of X and Y.
var( ∏_{i=1}^n Xi ) / ∏_{i=1}^n (E[Xi])² = ∏_{i=1}^n ( var(Xi)/(E[Xi])² + 1 ) − 1.
Solution. We have

var( ∏_{i=1}^n Xi ) = ∏_{i=1}^n E[Xi²] − ∏_{i=1}^n (E[Xi])²
                    = ∏_{i=1}^n ( var(Xi) + (E[Xi])² ) − ∏_{i=1}^n (E[Xi])².

The desired result follows by dividing both sides by ∏_{i=1}^n (E[Xi])².
fX,Y|C(x, y) = { fX,Y(x, y) / P(C), if (x, y) ∈ A,
                 0,                 otherwise.
fX(x) = { pλ e^{−λx},      if x ≥ 0,
          (1 − p)λ e^{λx}, if x < 0,

where λ and p are scalars with λ > 0 and p ∈ [0, 1]. Find the mean and the variance
of X in two ways:
(a) By straightforward calculation of the associated expected values.
(b) By using a divide-and-conquer strategy, and the mean and variance of the
(one-sided) exponential random variable.
Solution. (a) We have

E[X] = ∫_{−∞}^∞ x fX(x) dx
     = ∫_{−∞}^0 x (1 − p)λ e^{λx} dx + ∫₀^∞ x pλ e^{−λx} dx
     = −(1 − p)/λ + p/λ
     = (2p − 1)/λ.

Similarly,

E[X²] = ∫_{−∞}^∞ x² fX(x) dx = 2(1 − p)/λ² + 2p/λ² = 2/λ²,

and

var(X) = E[X²] − (E[X])² = ( 2 − (2p − 1)² ) / λ².
(b) Let A be the event {X ≥ 0}, and note that P(A) = p. Conditioned on A, the
random variable X has a (one-sided) exponential distribution with parameter λ. Also,
conditioned on Aᶜ, the random variable −X has the same one-sided exponential
distribution. Thus,

E[X | A] = 1/λ,

and

E[X | Aᶜ] = −1/λ.

It follows that

E[X] = P(A)E[X | A] + P(Aᶜ)E[X | Aᶜ] = p/λ − (1 − p)/λ = (2p − 1)/λ.
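A Monte Carlo sketch (with the illustrative choices λ = 2, p = 0.7, an assumption): draw X as a positive exponential(λ) with probability p and as minus an exponential(λ) with probability 1 − p, and check E[X] = (2p − 1)/λ = 0.2 and var(X) = (2 − (2p − 1)²)/λ² = 0.46.

```python
import random

random.seed(5)

lam, p = 2.0, 0.7
n = 400_000
# Two-sided exponential: +Exp(lam) with prob p, -Exp(lam) with prob 1-p.
xs = [(1 if random.random() < p else -1) * random.expovariate(lam)
      for _ in range(n)]
mean = sum(xs) / n                          # should be near (2p-1)/lam = 0.2
var = sum((x - mean) ** 2 for x in xs) / n  # should be near 0.46
```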
Problem 29.* Let X, Y, and Z be three random variables with joint PDF fX,Y,Z.
Show the multiplication rule:

fX,Y,Z(x, y, z) = fX|Y,Z(x | y, z) fY|Z(y | z) fZ(z).

Solution. By the definition of conditional PDFs, we have

fX|Y,Z(x | y, z) = fX,Y,Z(x, y, z) / fY,Z(y, z),

and

fY,Z(y, z) = fY|Z(y | z) fZ(z).

Combining these two relations, we obtain the multiplication rule.
Problem 30.* The Beta PDF. The beta PDF with parameters α > 0 and β > 0
has the form

fX(x) = { (1/B(α, β)) x^{α−1} (1 − x)^{β−1}, if 0 ≤ x ≤ 1,
          0,                                 otherwise.

For integer α and β, the normalizing constant is

B(α, β) = (α − 1)! (β − 1)! / (α + β − 1)!,
so that

E[X^m] = α(α + 1) · · · (α + m − 1) / ( (α + β)(α + β + 1) · · · (α + β + m − 1) ).
Solution. We have

E[X^m] = (1/B(α, β)) ∫₀¹ x^m x^{α−1} (1 − x)^{β−1} dx = B(α + m, β) / B(α, β).
Then,
P(A) = 1 / (α + β + 1)!,

because all ways of ordering these α + β + 1 random variables are equally likely.
Consider the following two events:

B = { max{Y1, . . . , Yα} ≤ Y },
C = { Y ≤ min{Yα+1, . . . , Yα+β} }.

We have

P(B ∩ C) = ∫₀¹ P(B ∩ C | Y = y) fY(y) dy
         = ∫₀¹ P( max{Y1, . . . , Yα} ≤ y ≤ min{Yα+1, . . . , Yα+β} ) dy.

We also have

P(A | B ∩ C) = 1 / (α! β!),

because given the events B and C, all α! possible orderings of Y1, . . . , Yα are equally
likely, and all β! possible orderings of Yα+1, . . . , Yα+β are equally likely.
Since P(A) = P(A | B ∩ C) P(B ∩ C), we obtain

1 / (α + β + 1)! = (1 / (α! β!)) ∫₀¹ y^α (1 − y)^β dy,
or

∫₀¹ y^α (1 − y)^β dy = α! β! / (α + β + 1)!.
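A numeric sketch checking the integral identity ∫₀¹ y^α (1 − y)^β dy = α! β! / (α + β + 1)! for one integer choice (α = 3, β = 2, an illustrative assumption), via the midpoint rule:

```python
import math

alpha, beta = 3, 2
n = 100_000
dy = 1 / n
# Midpoint-rule approximation of the integral of y**alpha * (1-y)**beta on [0, 1].
integral = sum(((i + 0.5) * dy) ** alpha * (1 - (i + 0.5) * dy) ** beta
               for i in range(n)) * dy
exact = (math.factorial(alpha) * math.factorial(beta)
         / math.factorial(alpha + beta + 1))   # 3! 2! / 6! = 1/60
```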
Let fX be a PDF that is zero outside the interval [a, b], and such that x fX(x) ≤ c
for all x. Let Yi, i = 1, . . . , n, be independent
random variables with values generated as follows: a point (Vi, Wi) is chosen at random
(according to a uniform PDF) within the rectangle whose corners are (a, 0), (b, 0), (a, c),
and (b, c), and if Wi ≤ Vi fX(Vi), the value of Yi is set to 1, and otherwise it is set to 0.
Let

Z = (Y1 + · · · + Yn) / n.

Show that

E[Z] = E[X] / ( c(b − a) ),

and

var(Z) ≤ 1/(4n).

In particular, we have var(Z) → 0 as n → ∞.
Solution. We have

P(Yi = 1) = P( Wi ≤ Vi fX(Vi) )
          = ∫_a^b ∫₀^{v fX(v)} (1/(c(b − a))) dw dv
          = ∫_a^b ( v fX(v) / (c(b − a)) ) dv
          = E[X] / ( c(b − a) ).
The random variable Z has mean P(Yi = 1) and variance

var(Z) = P(Yi = 1)( 1 − P(Yi = 1) ) / n.

Since 0 ≤ (1 − 2p)² = 1 − 4p(1 − p), we have p(1 − p) ≤ 1/4 for any p in [0, 1], so it
follows that var(Z) ≤ 1/(4n).
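A simulation sketch of this estimator for one concrete choice (an assumption, not from the text): X uniform on [a, b] = [0, 1], so fX(x) = 1 on [0, 1], x·fX(x) ≤ c with c = 1, and Z should concentrate around E[X]/(c(b − a)) = 1/2.

```python
import random

random.seed(6)

a, b, c = 0.0, 1.0, 1.0
def f_X(x):
    return 1.0 if a <= x <= b else 0.0   # uniform PDF on [0, 1]

n = 200_000
hits = 0
for _ in range(n):
    v = random.uniform(a, b)   # Vi uniform on [a, b]
    w = random.uniform(0, c)   # Wi uniform on [0, c]
    if w <= v * f_X(v):        # Yi = 1 when the point falls under v*fX(v)
        hits += 1
z = hits / n                   # should be near E[X]/(c(b-a)) = 0.5
```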
Problem 32. * Let X and Y be continuous random variables with joint PDF Ix ,Y .
Suppose that for any subsets A and B of the real line, the events { X E A} and {Y E B}
are independent. Show that the random variables X and Y are independent.
Solution. For any two real numbers x and y, using the independence of the events
{X ≤ x} and {Y ≤ y}, we have FX,Y(x, y) = FX(x)FY(y), so that

fX,Y(x, y) = ∂²FX,Y/(∂x ∂y)(x, y) = (∂FX/∂x)(x) (∂FY/∂y)(y) = fX(x) fY(y),

which establishes that X and Y are independent.
E[T I N = i] = iE[X],
since conditional on N = i, you will visit exactly i stores, and you will spend an
expected amount of money E[X] in each.
We now apply the total expectation theorem. We have

E[T] = Σ_{i=1}^∞ P(N = i) E[T | N = i]
     = Σ_{i=1}^∞ P(N = i) i E[X]
     = E[X] Σ_{i=1}^∞ i P(N = i)
     = E[X] E[N].
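The identity E[T] = E[X]E[N] can be checked by simulation, under illustrative assumptions not from the text: N geometric with parameter 1/2 (so E[N] = 2) and each Xi uniform on [0, 1] (so E[X] = 1/2), giving E[T] = 1.

```python
import random

random.seed(7)

def random_sum():
    # N ~ geometric(1/2): keep adding a term with probability 1/2.
    n_terms = 1
    while random.random() >= 0.5:
        n_terms += 1
    # T = X1 + ... + XN with each Xi uniform on [0, 1].
    return sum(random.random() for _ in range(n_terms))

trials = 200_000
avg = sum(random_sum() for _ in range(trials)) / trials  # should be near 1.0
```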
Similarly, using also the independence of the Xi, which implies that E[XiXj] = (E[X])²
if i ≠ j, the second moment of T is calculated as

E[T²] = Σ_{i=1}^∞ P(N = i) E[T² | N = i]
      = Σ_{i=1}^∞ P(N = i) ( i E[X²] + i(i − 1)(E[X])² )
      = E[X²] Σ_{i=1}^∞ i P(N = i) + (E[X])² Σ_{i=1}^∞ i(i − 1) P(N = i)
      = E[X²] E[N] + (E[X])² ( E[N²] − E[N] )
      = var(X) E[N] + (E[X])² E[N²].
The variance is then obtained by

var(T) = E[T²] − (E[T])² = var(X) E[N] + (E[X])² E[N²] − (E[X])² (E[N])²
       = var(X) E[N] + (E[X])² var(N).
E[X | Z = z] = ( σX² / (σX² + σY²) ) z,

and

var(X | Z = z) = σX² σY² / (σX² + σY²).