
P R O B L E M S

SECTION 4.1. Derived Distributions


Problem 1. If X is a random variable that is uniformly distributed between -1 and
1, find the PDF of √|X| and the PDF of ln |X|.

Problem 2. Find the PDF of e^X in terms of the PDF of X. Specialize the answer
to the case where X is uniformly distributed between 0 and 1.

Problem 3. Find the PDFs of |X|^{1/3} and |X|^{1/4} in terms of the PDF of X.
Problem 4. The metro train arrives at the station near your home every quarter
hour starting at 6:00 a.m. You walk into the station every morning between 7:10 and
7:30 a.m., with the time in this interval being a random variable with given PDF (cf.
Example 3.14, in Chapter 3). Let X be the elapsed time, in minutes, between 7:10 and
the time of your arrival. Let Y be the time that you have to wait until you board a
train. Calculate the CDF of Y in terms of the CDF of X and differentiate to obtain a
formula for the PDF of Y.

Problem 5. Let X and Y be independent random variables, uniformly distributed
in the interval [0, 1]. Find the CDF and the PDF of |X - Y|.

Problem 6. Let X and Y be the Cartesian coordinates of a randomly chosen point
(according to a uniform PDF) in the triangle with vertices at (0, 1), (0, -1), and (1, 0).
Find the CDF and the PDF of |X - Y|.

Problem 7. Two points are chosen randomly and independently from the interval
[0, 1] according to a uniform distribution. Show that the expected distance between
the two points is 1/3.
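
The 1/3 answer is easy to sanity-check numerically. A minimal Monte Carlo sketch (sample size is an arbitrary choice):

```python
import numpy as np

rng = np.random.default_rng(0)
x, y = rng.uniform(0, 1, size=(2, 1_000_000))

# E[|X - Y|] for independent uniforms on [0, 1] should be close to 1/3.
print(np.abs(x - y).mean())  # approximately 0.3333
```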

Problem 8. Find the PDF of Z = X + Y, when X and Y are independent exponential
random variables with common parameter λ.
Problem 9. Consider the same problem as in Example 4.9, but assume that the
random variables X and Y are independent and exponentially distributed with different
parameters λ and μ, respectively. Find the PDF of X - Y.

Problem 10. Let X and Y be independent random variables with PMFs

    p_X(x) = 1/3,  if x = 1, 2, 3,
             0,    otherwise,

    p_Y(y) = 1/2,  if y = 0,
             1/3,  if y = 1,
             1/6,  if y = 2,
             0,    otherwise.

Find the PMF of Z = X + Y, using the convolution formula.
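
For a concrete check, the discrete convolution can be carried out directly. A sketch, assuming the PMFs above, using np.convolve on the probability vectors:

```python
import numpy as np

# p_X on support {0, 1, 2, 3} and p_Y on support {0, 1, 2} as probability vectors.
p_x = np.array([0, 1/3, 1/3, 1/3])   # P(X = 0..3)
p_y = np.array([1/2, 1/3, 1/6])      # P(Y = 0..2)

# The PMF of Z = X + Y is the convolution of the two PMFs.
p_z = np.convolve(p_x, p_y)
for k, p in enumerate(p_z):
    print(f"P(Z = {k}) = {p:.4f}")
```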

Problem 11. Use the convolution formula to establish that the sum of two independent
Poisson random variables with parameters λ and μ, respectively, is Poisson with
parameter λ + μ.

Problem 12. The random variables X, Y, and Z are independent and uniformly
distributed between zero and one. Find the PDF of X + Y + Z.

Problem 13. Consider a PDF that is positive only within an interval [a, b] and is
symmetric around the mean (a + b)/2. Let X and Y be independent random variables
that both have this PDF. Suppose that you have calculated the PDF of X + Y. How
can you easily obtain the PDF of X - Y?

Problem 14. Competing exponentials. The lifetimes of two light bulbs are
modeled as independent and exponential random variables X and Y, with parameters
λ and μ, respectively. The time at which a light bulb first burns out is

    Z = min{X, Y}.

Show that Z is an exponential random variable with parameter λ + μ.
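
A quick empirical check (a sketch; the rates 1.5 and 0.5 are arbitrary choices): the sample mean of min{X, Y} should be close to 1/(λ + μ).

```python
import numpy as np

rng = np.random.default_rng(0)
lam, mu = 1.5, 0.5  # arbitrary example rates

x = rng.exponential(1/lam, 1_000_000)  # numpy parameterizes by the mean 1/lambda
y = rng.exponential(1/mu, 1_000_000)
z = np.minimum(x, y)

print(z.mean(), 1/(lam + mu))  # both approximately 0.5
```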


Problem 15.* Cauchy random variable.
(a) Let X be a random variable that is uniformly distributed between -1/2 and 1/2.
Show that the PDF of Y = tan(πX) is

    f_Y(y) = 1/(π(1 + y²)),    -∞ < y < ∞.

(Y is called a Cauchy random variable.)
(b) Let Y be a Cauchy random variable. Find the PDF of the random variable X,
which is equal to the angle between -π/2 and π/2 whose tangent is Y.
Solution. (a) We first note that Y is a continuous, strictly monotonically increasing
function of X, which takes values between -∞ and ∞ as X ranges over the interval
[-1/2, 1/2]. Therefore, we have for all scalars y,

    F_Y(y) = P(Y ≤ y) = P(tan(πX) ≤ y) = P(πX ≤ tan⁻¹ y) = (1/π) tan⁻¹ y + 1/2,

where the last equality follows using the CDF of X, which is uniformly distributed in the
interval [-1/2, 1/2]. Therefore, by differentiation, using the formula d/dy(tan⁻¹ y) =
1/(1 + y²), we have for all y,

    f_Y(y) = 1/(π(1 + y²)).

(b) We first compute the CDF of X and then differentiate to obtain its PDF. We have
for -π/2 ≤ x ≤ π/2,

    P(X ≤ x) = P(tan⁻¹ Y ≤ x)
             = P(Y ≤ tan x)
             = (1/π) ∫_{-∞}^{tan x} dy/(1 + y²)
             = (1/π) [tan⁻¹ y]_{-∞}^{tan x}
             = x/π + 1/2.

For x < -π/2, we have P(X ≤ x) = 0, and for π/2 < x, we have P(X ≤ x) = 1.
Taking the derivative of the CDF P(X ≤ x), we find that X is uniformly distributed
on the interval [-π/2, π/2].
Note: An interesting property of the Cauchy random variable is that it satisfies

    ∫_{-∞}^{∞} |y| f_Y(y) dy = ∞,

as can be easily verified. As a result, the Cauchy random variable does not have a well-
defined expected value, despite the symmetry of its PDF around 0; see the footnote in
Section 3.1 on the definition of the expected value of a continuous random variable.
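
The transformation in part (a) gives a direct way to draw Cauchy samples from uniform ones. A minimal sketch; the heavy tails show up as a sample mean that never settles down:

```python
import numpy as np

rng = np.random.default_rng(0)
u = rng.uniform(-0.5, 0.5, 1_000_000)
y = np.tan(np.pi * u)  # Cauchy samples via Y = tan(pi * X)

# The sample median converges to 0, but the running sample mean does not,
# reflecting the undefined expected value.
print(np.median(y))
for n in (10**3, 10**4, 10**5, 10**6):
    print(n, y[:n].mean())
```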
Problem 16.* The polar coordinates of two independent normal random
variables. Let X and Y be independent standard normal random variables. The pair
(X, Y) can be described in polar coordinates in terms of random variables R ≥ 0 and
Θ ∈ [0, 2π], so that

    X = R cos Θ,    Y = R sin Θ.

(a) Show that Θ is uniformly distributed in [0, 2π], that R has the PDF

    f_R(r) = r e^{-r²/2},    r ≥ 0,

and that R and Θ are independent. (The random variable R is said to have a
Rayleigh distribution.)
(b) Show that R² has an exponential distribution with parameter 1/2.
Note: Using the results in this problem, we see that samples of a normal random vari-
able can be generated using samples of independent uniform and exponential random
variables.
Solution. (a) The joint PDF of X and Y is

    f_{X,Y}(x, y) = (1/2π) e^{-(x²+y²)/2}.

We first find the joint CDF of R and Θ. Fix some r > 0 and some θ ∈ [0, 2π], and
let A be the set of points (x, y) whose polar coordinates (s, φ) satisfy 0 ≤ s ≤ r and
0 ≤ φ ≤ θ; note that the set A is a sector of a circle of radius r, with angle θ. We have

    F_{R,Θ}(r, θ) = P(R ≤ r, Θ ≤ θ) = P((X, Y) ∈ A) = (θ/2π) ∫_0^r s e^{-s²/2} ds,



where the last equality is obtained by transforming to polar coordinates. We then
differentiate, to find that

    f_{R,Θ}(r, θ) = ∂²F_{R,Θ}/∂r∂θ (r, θ) = (r/2π) e^{-r²/2},    r ≥ 0, θ ∈ [0, 2π].

Thus,

    f_R(r) = ∫_0^{2π} f_{R,Θ}(r, θ) dθ = r e^{-r²/2},    r ≥ 0.

Furthermore,

    f_{Θ|R}(θ | r) = f_{R,Θ}(r, θ)/f_R(r) = 1/(2π),    θ ∈ [0, 2π].

Since the conditional PDF f_{Θ|R} of Θ is unaffected by the value of the conditioning
variable R, it follows that it is also equal to the unconditional PDF f_Θ. In particular,
f_{R,Θ}(r, θ) = f_R(r) f_Θ(θ), so that R and Θ are independent.
(b) Let t ≥ 0. We have

    P(R² ≥ t) = P(R ≥ √t) = ∫_{√t}^{∞} r e^{-r²/2} dr = ∫_{t/2}^{∞} e^{-u} du = e^{-t/2},

where we have used the change of variables u = r²/2. By differentiating, we obtain

    f_{R²}(t) = (1/2) e^{-t/2},    t ≥ 0.
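
The note above suggests a sampling scheme (essentially the Box-Muller method): draw Θ uniform on [0, 2π] and R² exponential with parameter 1/2, then set X = R cos Θ, Y = R sin Θ. A hedged sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

theta = rng.uniform(0, 2*np.pi, n)   # uniform angle
r = np.sqrt(rng.exponential(2, n))   # R^2 ~ exponential with parameter 1/2 (mean 2)
x, y = r*np.cos(theta), r*np.sin(theta)

# Both coordinates should be standard normal: mean ~ 0, variance ~ 1.
print(x.mean(), x.var(), y.mean(), y.var())
```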

SECTION 4.2. Covariance and Correlation


Problem 17. Suppose that X and Y are random variables with the same variance.
Show that X - Y and X + Y are uncorrelated.

Problem 18. Consider four random variables W, X, Y, Z, with

    E[W] = E[X] = E[Y] = E[Z] = 0,

    var(W) = var(X) = var(Y) = var(Z) = 1,

and assume that W, X, Y, Z are pairwise uncorrelated. Find the correlation coefficients
ρ(R, S) and ρ(R, T), where R = W + X, S = X + Y, and T = Y + Z.
Problem 19. Suppose that a random variable X satisfies

    E[X] = 0,    E[X²] = 1,    E[X³] = 0,    E[X⁴] = 3,

and let

    Y = a + bX + cX².

Find the correlation coefficient ρ(X, Y).

Problem 20.* Schwarz inequality. Show that for any random variables X and Y,
we have

    (E[XY])² ≤ E[X²] E[Y²].

Solution. We may assume that E[Y²] ≠ 0; otherwise, we have Y = 0 with probability
1, and hence E[XY] = 0, so the inequality holds. We have

    0 ≤ E[(X - (E[XY]/E[Y²]) Y)²]
      = E[X² - 2 (E[XY]/E[Y²]) XY + (E[XY]/E[Y²])² Y²]
      = E[X²] - 2 (E[XY]/E[Y²]) E[XY] + (E[XY]/E[Y²])² E[Y²]
      = E[X²] - (E[XY])²/E[Y²],

i.e., (E[XY])² ≤ E[X²] E[Y²].
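
The inequality is easy to spot-check empirically. A sketch with an arbitrary dependent pair:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=1_000_000)
y = 0.7*x + rng.exponential(1.0, 1_000_000)  # arbitrary dependent pair

lhs = np.mean(x*y)**2
rhs = np.mean(x**2) * np.mean(y**2)
print(lhs <= rhs, lhs, rhs)  # the inequality should hold
```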

Problem 21.* Correlation coefficient. Consider the correlation coefficient

    ρ(X, Y) = cov(X, Y) / √(var(X) var(Y))

of two random variables X and Y that have positive variances. Show that:
(a) |ρ(X, Y)| ≤ 1. Hint: Use the Schwarz inequality from the preceding problem.
(b) If Y - E[Y] is a positive (or negative) multiple of X - E[X], then ρ(X, Y) = 1
[or ρ(X, Y) = -1, respectively].
(c) If ρ(X, Y) = 1 [or ρ(X, Y) = -1], then, with probability 1, Y - E[Y] is a positive
(or negative, respectively) multiple of X - E[X].

Solution. (a) Let X̃ = X - E[X] and Ỹ = Y - E[Y]. Using the Schwarz inequality,
we get

    (ρ(X, Y))² = (E[X̃Ỹ])² / (E[X̃²] E[Ỹ²]) ≤ 1,

and hence |ρ(X, Y)| ≤ 1.
(b) If Ỹ = aX̃, then

    ρ(X, Y) = E[X̃ · aX̃] / √(E[X̃²] E[(aX̃)²]) = a/|a|.

(c) If (ρ(X, Y))² = 1, the calculation in the solution of Problem 20 yields

    E[(X̃ - (E[X̃Ỹ]/E[Ỹ²]) Ỹ)²] = E[X̃²] - (E[X̃Ỹ])²/E[Ỹ²]
                                = E[X̃²] (1 - (ρ(X, Y))²)
                                = 0.

Thus, with probability 1, the random variable

    X̃ - (E[X̃Ỹ]/E[Ỹ²]) Ỹ

is equal to zero. It follows that, with probability 1,

    X̃ = (E[X̃Ỹ]/E[Ỹ²]) Ỹ,

i.e., the sign of the constant ratio of X̃ and Ỹ is determined by the sign of ρ(X, Y).

SECTION 4.3. Conditional Expectation and Variance Revisited

Problem 22. Consider a gambler who at each gamble either wins or loses his bet with
probabilities p and 1 - p, independent of earlier gambles. When p > 1/2, a popular
gambling system, known as the Kelly strategy, is to always bet the fraction 2p - 1 of
the current fortune. Compute the expected fortune after n gambles, starting with x
units and employing the Kelly strategy.
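
A simulation sketch of the Kelly strategy (p, n, and the starting fortune are arbitrary illustration choices). Each win multiplies the fortune by 1 + (2p - 1) and each loss by 1 - (2p - 1), so averaging many runs estimates the expected fortune:

```python
import numpy as np

rng = np.random.default_rng(0)
p, n, x0, runs = 0.6, 20, 1.0, 200_000
f = 2*p - 1  # Kelly fraction of the current fortune bet each time

wins = rng.random((runs, n)) < p
# Fortune multiplier per gamble: 1 + f on a win, 1 - f on a loss.
fortunes = x0 * np.prod(np.where(wins, 1 + f, 1 - f), axis=1)
print(fortunes.mean())  # Monte Carlo estimate of the expected fortune
```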

Problem 23. Pat and Nat are dating, and all of their dates are scheduled to start at
9 p.m. Nat always arrives promptly at 9 p.m. Pat is highly disorganized and arrives at
a time that is uniformly distributed between 8 p.m. and 10 p.m. Let X be the time in
hours between 8 p.m. and the time when Pat arrives. If Pat arrives before 9 p.m., their
date will last exactly 3 hours. If Pat arrives after 9 p.m., their date will last for a time
that is uniformly distributed between 0 and 3 - X hours. The date starts at the time
they meet. Nat gets irritated when Pat is late and will end the relationship after the
second date on which Pat is late by more than 45 minutes. All dates are independent
of any other dates.

( a) What is the expected number of hours Nat waits for Pat to arrive?

(b) What is the expected duration of any particular date?

(c) What is the expected number of dates they will have before breaking up?

Problem 24. A retired professor comes to the office at a time which is uniformly
distributed between 9 a.m. and 1 p.m., performs a single task, and leaves when the task
is completed. The duration of the task is exponentially distributed with parameter
λ(y) = 1/(5 - y), where y is the length of the time interval between 9 a.m. and the
time of his arrival.

(a) What is the expected amount of time that the professor devotes to the task?
(b) What is the expected time at which the task is completed?
(c) The professor has a Ph.D. student who on a given day comes to see him at a
time that is uniformly distributed between 9 a.m. and 5 p.m. If the student does
not find the professor, he leaves and does not return. If he finds the professor, he
spends an amount of time that is uniformly distributed between 0 and 1 hour.
The professor will spend the same total amount of time on his task regardless of
whether he is interrupted by the student. What is the expected amount of time
that the professor will spend with the student and what is the expected time at
which he will leave his office?

Problem 25.* Show that for a discrete or continuous random variable X, and any
function g(Y) of another random variable Y, we have E[Xg(Y) | Y] = g(Y) E[X | Y].
Solution. Assume that X is continuous. From a version of the expected value rule for
conditional expectations given in Chapter 3, we have

    E[Xg(Y) | Y = y] = ∫_{-∞}^{∞} x g(y) f_{X|Y}(x | y) dx
                     = g(y) ∫_{-∞}^{∞} x f_{X|Y}(x | y) dx
                     = g(y) E[X | Y = y].

This shows that the realized values E[Xg(Y) | Y = y] and g(y) E[X | Y = y] of the
random variables E[Xg(Y) | Y] and g(Y) E[X | Y] are always equal. Hence these two
random variables are equal. The proof is similar if X is discrete.
Problem 26.* Let X and Y be independent random variables. Use the law of total
variance to show that

    var(XY) = (E[X])² var(Y) + (E[Y])² var(X) + var(X) var(Y).

Solution. Let Z = XY. The law of total variance yields

    var(Z) = var(E[Z | X]) + E[var(Z | X)].

We have

    E[Z | X] = E[XY | X] = X E[Y],

so that

    var(E[Z | X]) = var(X E[Y]) = (E[Y])² var(X).

Furthermore,

    var(Z | X) = var(XY | X) = X² var(Y | X) = X² var(Y),

so that

    E[var(Z | X)] = E[X²] var(Y) = (E[X])² var(Y) + var(X) var(Y).

Combining the preceding relations, we obtain

    var(XY) = (E[X])² var(Y) + (E[Y])² var(X) + var(X) var(Y).
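
A quick numerical check of the identity (a sketch with arbitrary independent X and Y):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(2.0, 1.5, 1_000_000)   # arbitrary mean 2, std 1.5
y = rng.exponential(3.0, 1_000_000)   # arbitrary mean 3, independent of x

lhs = np.var(x*y)
rhs = (x.mean()**2)*np.var(y) + (y.mean()**2)*np.var(x) + np.var(x)*np.var(y)
print(lhs, rhs)  # the two should agree up to sampling error
```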

Problem 27.* We toss n times a biased coin whose probability of heads, denoted by
q, is the value of a random variable Q with given mean μ and positive variance σ². Let
Xi be a Bernoulli random variable that models the outcome of the ith toss (i.e., Xi = 1
if the ith toss is a head). We assume that X1, ..., Xn are conditionally independent,
given Q = q. Let X be the number of heads obtained in the n tosses.
(a) Use the law of iterated expectations to find E[Xi] and E[X].
(b) Find cov(Xi, Xj). Are X1, ..., Xn independent?
(c) Use the law of total variance to find var(X). Verify your answer using the co-
variance result of part (b).
Solution. (a) We have, from the law of iterated expectations and the fact E[Xi | Q] = Q,

    E[Xi] = E[E[Xi | Q]] = E[Q] = μ.

Since X = X1 + ··· + Xn, it follows that

    E[X] = E[X1] + ··· + E[Xn] = nμ.

(b) We have, for i ≠ j, using the conditional independence assumption,

    E[XiXj] = E[E[XiXj | Q]] = E[E[Xi | Q] E[Xj | Q]] = E[Q²],

and

    E[Xi] E[Xj] = (E[Q])² = μ².

Thus,

    cov(Xi, Xj) = E[XiXj] - E[Xi] E[Xj] = E[Q²] - μ² = σ².

Since cov(Xi, Xj) > 0, X1, ..., Xn are not independent.

Also, for i = j, using the observation that Xi² = Xi,

    var(Xi) = E[Xi²] - (E[Xi])² = E[Xi] - (E[Xi])² = μ - μ².

(c) Using the law of total variance, and the conditional independence of X1, ..., Xn,
we have

    var(X) = E[var(X | Q)] + var(E[X | Q])
           = E[var(X1 + ··· + Xn | Q)] + var(E[X1 + ··· + Xn | Q])
           = E[nQ(1 - Q)] + var(nQ)
           = n E[Q - Q²] + n² var(Q)
           = n(μ - μ² - σ²) + n² σ²
           = n(μ - μ²) + n(n - 1)σ².

To verify the result using the covariance formulas of part (b), we write

    var(X) = var(X1 + ··· + Xn)
           = Σ_{i=1}^{n} var(Xi) + Σ_{(i,j): i≠j} cov(Xi, Xj)
           = n var(X1) + n(n - 1) cov(X1, X2)
           = n(μ - μ²) + n(n - 1)σ².
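
A Monte Carlo sketch of part (c), taking Q uniform on [0, 1] as an arbitrary concrete choice (so μ = 1/2 and σ² = 1/12, and the formula gives var(X) = 10 for n = 10):

```python
import numpy as np

rng = np.random.default_rng(0)
n, runs = 10, 500_000
mu, sigma2 = 0.5, 1/12  # mean and variance of Q ~ Uniform[0, 1]

q = rng.uniform(0, 1, runs)
x = rng.binomial(n, q)  # number of heads given each sampled Q

print(np.var(x))                          # empirical var(X)
print(n*(mu - mu**2) + n*(n - 1)*sigma2)  # n(mu - mu^2) + n(n - 1)sigma^2
```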

Problem 28.* The Bivariate Normal PDF. The (zero mean) bivariate normal
PDF is of the form

    f_{X,Y}(x, y) = c e^{-q(x,y)},

where the exponent term q(x, y) is a quadratic function of x and y,

    q(x, y) = (1/(2(1 - ρ²))) (x²/σx² - 2ρxy/(σxσy) + y²/σy²),

σx and σy are positive constants, ρ is a constant that satisfies -1 < ρ < 1, and c is a
normalizing constant.
(a) By completing the square, rewrite q(x, y) in the form (αx - βy)² + γy² for some
constants α, β, and γ.
(b) Show that X and Y are zero mean normal random variables with variance σx²
and σy², respectively.
(c) Find the normalizing constant c.
(d) Show that the conditional PDF of X given that Y = y is normal, and identify
its conditional mean and variance.
(e) Show that the correlation coefficient of X and Y is equal to ρ.
(f) Show that X and Y are independent if and only if they are uncorrelated.
(g) Show that the estimation error E[X | Y] - X is normal with mean zero and
variance (1 - ρ²)σx², and is independent from Y.

Solution. (a) We can rewrite q(x, y) in the form

    q(x, y) = q1(x, y) + q2(y),

where

    q1(x, y) = (1/(2(1 - ρ²))) (x/σx - ρ y/σy)²,    and    q2(y) = y²/(2σy²).

(b) We have

    f_Y(y) = ∫_{-∞}^{∞} f_{X,Y}(x, y) dx = c e^{-q2(y)} ∫_{-∞}^{∞} e^{-q1(x,y)} dx.

Using the change of variables

    u = (1/√(1 - ρ²)) (x/σx - ρ y/σy),    dx = σx √(1 - ρ²) du,

we obtain

    ∫_{-∞}^{∞} e^{-q1(x,y)} dx = σx √(1 - ρ²) ∫_{-∞}^{∞} e^{-u²/2} du = σx √(1 - ρ²) √(2π).

Thus,

    f_Y(y) = c σx √(1 - ρ²) √(2π) e^{-y²/(2σy²)}.

We recognize this as a normal PDF with mean zero and variance σy². The result for
the random variable X follows by symmetry.
(c) The normalizing constant for the PDF of Y must be equal to 1/(√(2π) σy). It follows
that

    c σx √(1 - ρ²) √(2π) = 1/(√(2π) σy),

which implies that

    c = 1/(2π σx σy √(1 - ρ²)).

(d) Since

    f_{X,Y}(x, y) = (1/(2π σx σy √(1 - ρ²))) e^{-q1(x,y)} e^{-q2(y)},

and

    f_Y(y) = (1/(√(2π) σy)) e^{-q2(y)},

we obtain

    f_{X|Y}(x | y) = f_{X,Y}(x, y)/f_Y(y)
                   = (1/(√(2π) σx √(1 - ρ²))) exp{ -(x - ρ(σx/σy)y)² / (2σx²(1 - ρ²)) }.

For any fixed y, we recognize this as a normal PDF with mean (ρσx/σy)y, and variance
σx²(1 - ρ²). In particular, E[X | Y = y] = (ρσx/σy)y, and E[X | Y] = (ρσx/σy)Y.
(e) Using the expected value rule and the law of iterated expectations, we have

    E[XY] = E[E[XY | Y]]
          = E[Y E[X | Y]]
          = E[Y (ρσx/σy) Y]
          = ρ (σx/σy) E[Y²]
          = ρ σx σy.

Thus, the correlation coefficient ρ(X, Y) is equal to

    ρ(X, Y) = cov(X, Y)/(σx σy) = E[XY]/(σx σy) = ρ.

(f) If X and Y are uncorrelated, then ρ = 0, and the joint PDF satisfies f_{X,Y}(x, y) =
f_X(x) f_Y(y), so that X and Y are independent. Conversely, if X and Y are independent,
then they are automatically uncorrelated.
(g) From part (d), we know that conditioned on Y = y, X is normal with mean
E[X | Y = y] and variance (1 - ρ²)σx². Therefore, conditioned on Y = y, the estimation
error X̃ = E[X | Y = y] - X is normal with mean zero and variance (1 - ρ²)σx², i.e.,

    f_{X̃|Y}(x̃ | y) = (1/(√(2π) σx √(1 - ρ²))) e^{-x̃²/(2(1-ρ²)σx²)}.

Since the conditional PDF of X̃ does not depend on the value y of Y, it follows that
X̃ is independent of Y, and the above conditional PDF is also the unconditional PDF
of X̃.
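
A simulation sketch of parts (d) and (g), with arbitrary parameter choices, sampling X given Y via the conditional normal derived above:

```python
import numpy as np

rng = np.random.default_rng(0)
sx, sy, rho, n = 2.0, 1.0, 0.6, 1_000_000  # arbitrary example parameters

y = rng.normal(0, sy, n)
# Conditional law from part (d): X | Y=y ~ N(rho*(sx/sy)*y, sx^2*(1 - rho^2)).
x = rho*(sx/sy)*y + rng.normal(0, sx*np.sqrt(1 - rho**2), n)

err = rho*(sx/sy)*y - x  # estimation error E[X | Y] - X
print(err.mean(), err.var(), (1 - rho**2)*sx**2)  # mean ~ 0, variance matches
print(np.corrcoef(err, y)[0, 1])                  # ~ 0: independent of Y
print(np.corrcoef(x, y)[0, 1])                    # ~ rho
```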

SECTION 4.4. Transforms


Problem 29. Let X be a random variable that takes the values 1, 2, and 3, with the
following probabilities:

    P(X = 1) = 1/2,    P(X = 2) = 1/4,    P(X = 3) = 1/4.

Find the transform associated with X and use it to obtain the first three moments,
E[X], E[X²], E[X³].

Problem 30. Calculate E[X³] and E[X⁴] for a standard normal random variable X.

Problem 31. Find the third, fourth, and fifth moments of an exponential random
variable with parameter λ.

Problem 32. A nonnegative integer-valued random variable X has one of the follow-
ing two expressions as its transform:

    1. M(s) = e^{2(e^{e^s - 1} - 1)},
    2. M(s) = e^{2(e^{e^s} - 1)}.

(a) Explain why one of the two cannot possibly be the transform.
(b) Use the true transform to find P(X = 0).

Problem 33. Find the PDF of the continuous random variable X associated with
the transform

    M(s) = (1/3) · 2/(2 - s) + (2/3) · 3/(3 - s).

Problem 34. A soccer team has three designated players who take turns striking
penalty shots. The ith player has probability of success p_i, independent of the successes
of the other players. Let X be the number of successful penalty shots after each player
has had one turn. Use convolution to calculate the PMF of X. Confirm your answer
by first calculating the transform associated with X and then obtaining the PMF from
the transform.

Problem 35. Let X be a random variable that takes nonnegative integer values, and
is associated with a transform of the form

    M_X(s) = c · (3 + 4e^{2s} + 2e^{3s})/(3 - e^s),

where c is some scalar. Find E[X], p_X(1), and E[X | X ≠ 0].

Problem 36. Let X, Y, and Z be independent random variables, where X is Bernoulli
with parameter 1/3, Y is exponential with parameter 2, and Z is Poisson with param-
eter 3.
(a) Consider the new random variable U = XY + (1 - X)Z. Find the transform
associated with U.
(b) Find the transform associated with 2Z + 3.
(c) Find the transform associated with Y + Z.

Problem 37. A pizza parlor serves n different types of pizza, and is visited by a
number K of customers in a given period of time, where K is a nonnegative integer
random variable with a known associated transform M_K(s) = E[e^{sK}]. Each customer
orders a single pizza, with all types of pizza being equally likely, independent of the
number of other customers and the types of pizza they order. Give a formula, in terms
of M_K(·), for the expected number of different types of pizzas ordered.

Problem 38.* Let X be a discrete random variable taking nonnegative integer values.
Let M(s) be the transform associated with X.
(a) Show that

    P(X = 0) = lim_{s→-∞} M(s).

(b) Use part (a) to verify that if X is a binomial random variable with parameters
n and p, we have P(X = 0) = (1 - p)^n. Furthermore, if X is a Poisson random
variable with parameter λ, we have P(X = 0) = e^{-λ}.
(c) Suppose that X is instead known to take only integer values that are greater
than or equal to a given integer k. How can we calculate P(X = k) using the
transform associated with X?
Solution. (a) We have

    M(s) = Σ_{k=0}^{∞} P(X = k) e^{ks}.

As s → -∞, all the terms e^{ks} with k > 0 tend to 0, so we obtain lim_{s→-∞} M(s) =
P(X = 0).
(b) In the case of the binomial, we have from the transform tables

    M(s) = (1 - p + p e^s)^n,

so that lim_{s→-∞} M(s) = (1 - p)^n. In the case of the Poisson, we have

    M(s) = e^{λ(e^s - 1)},

so that lim_{s→-∞} M(s) = e^{-λ}.
(c) The random variable Y = X - k takes only nonnegative integer values and the
associated transform is M_Y(s) = e^{-sk} M(s) (cf. Example 4.25). Since P(Y = 0) =
P(X = k), we have from part (a),

    P(X = k) = lim_{s→-∞} e^{-sk} M(s).
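
Part (b) can also be checked symbolically. A sketch using sympy, with arbitrary example parameters:

```python
import sympy as sp

s = sp.symbols('s', real=True)
n, p, lam = 4, sp.Rational(1, 3), 2  # arbitrary example parameters

M_binomial = (1 - p + p*sp.exp(s))**n    # binomial transform
M_poisson = sp.exp(lam*(sp.exp(s) - 1))  # Poisson transform

print(sp.limit(M_binomial, s, -sp.oo))  # (2/3)**4, i.e., (1 - p)**n
print(sp.limit(M_poisson, s, -sp.oo))   # exp(-2), i.e., e**(-lambda)
```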

Problem 39.* Transforms associated with uniform random variables.
(a) Find the transform associated with an integer-valued random variable X that is
uniformly distributed in the range {a, a + 1, ..., b}.
(b) Find the transform associated with a continuous random variable X that is uni-
formly distributed in the range [a, b].
Solution. (a) The PMF of X is

    p_X(k) = 1/(b - a + 1),  if k = a, a + 1, ..., b,
             0,              otherwise.

The transform is

    M(s) = Σ_{k=-∞}^{∞} e^{sk} P(X = k)
         = Σ_{k=a}^{b} e^{sk}/(b - a + 1)
         = (e^{sa}/(b - a + 1)) Σ_{k=0}^{b-a} e^{sk}
         = (e^{sa}/(b - a + 1)) · (e^{s(b-a+1)} - 1)/(e^s - 1).

(b) We have

    M(s) = E[e^{sX}] = ∫_a^b e^{sx}/(b - a) dx = (e^{sb} - e^{sa})/(s(b - a)).

Problem 40.* Suppose that the transform associated with a discrete random variable
X has the form

    M(s) = A(e^s)/B(e^s),

where A(t) and B(t) are polynomials of the generic variable t. Assume that A(t) and
B(t) have no common roots and that the degree of A(t) is smaller than the degree of
B(t). Assume also that B(t) has distinct, real, and nonzero roots that have absolute
value greater than 1. Then it can be seen that M(s) can be written in the form

    M(s) = Σ_{i=1}^{m} a_i/(1 - r_i e^s),

where 1/r_1, ..., 1/r_m are the roots of B(t) and the a_i are constants that are equal to
lim_{e^s → 1/r_i} (1 - r_i e^s) M(s), i = 1, ..., m.
(a) Show that the PMF of X has the form

    P(X = k) = Σ_{i=1}^{m} a_i r_i^k,  if k = 0, 1, ...,
               0,                      otherwise.

Note: For large k, the PMF of X can be approximated by a_I r_I^k, where I is the
index corresponding to the largest |r_i| (assuming I is unique).
(b) Extend the result of part (a) to the case where M(s) = e^{bs} A(e^s)/B(e^s) and b is
an integer.
Solution. (a) We have, for all s such that |r_i| e^s < 1,

    1/(1 - r_i e^s) = Σ_{k=0}^{∞} r_i^k e^{sk}.

Therefore,

    M(s) = Σ_{k=0}^{∞} ( Σ_{i=1}^{m} a_i r_i^k ) e^{sk},

and by inverting this transform, we see that

    P(X = k) = Σ_{i=1}^{m} a_i r_i^k

for k ≥ 0, and P(X = k) = 0 for k < 0. Note that if the coefficients a_i are nonnegative,
this PMF is a mixture of geometric PMFs.
(b) In this case, M(s) corresponds to the translation by b of a random variable whose
transform is A(e^s)/B(e^s) (cf. Example 4.25), so we have

    P(X = k) = Σ_{i=1}^{m} a_i r_i^{k-b},  if k = b, b + 1, ...,
               0,                          otherwise.
SECTION 4.5. Sum of a Random Number of Independent Random
Variables
Problem 41. At a certain time, the number of people that enter an elevator is a
Poisson random variable with parameter λ. The weight of each person is independent
of every other person's weight, and is uniformly distributed between 100 and 200 lbs.
Let Xi be the fraction of 100 by which the ith person exceeds 100 lbs, e.g., if the 7th
person weighs 175 lbs., then X7 = 0.75. Let Y be the sum of the Xi.
(a) Find the transform associated with Y.
(b) Use the transform to compute the expected value of Y.
(c) Verify your answer to part (b) by using the law of iterated expectations.

Problem 42. Construct an example to show that the sum of a random number of
independent normal random variables is not normal (even though a fixed sum is).

Problem 43. A motorist goes through 4 lights, each of which is found to be red with
probability 1/2. The waiting times at each light are modeled as independent normal
random variables with mean 1 minute and standard deviation 1/2 minute. Let X be
the total waiting time at the red lights.
(a) Use the total probability theorem to find the PDF and the transform associated
with X, and the probability that X exceeds 4 minutes. Is X normal?
(b) Find the transform associated with X by viewing X as a sum of a random number
of random variables.

Problem 44. Consider the calculation of the mean and variance of a sum

    Y = X1 + ··· + XN,

where N is itself a sum of integer-valued random variables, i.e.,

    N = K1 + ··· + KM.

Here N, M, K1, K2, ..., X1, X2, ... are independent random variables, N, M, K1, K2, ...
are integer-valued and nonnegative, K1, K2, ... are identically distributed with common
mean and variance denoted E[K] and var(K), and X1, X2, ... are identically distributed
with common mean and variance denoted E[X] and var(X).

(a) Derive formulas for E[N] and var(N) in terms of E[M], var(M), E[K], var(K).

(b) Derive formulas for E[Y] and var(Y) in terms of E[M], var(M), E[K], var(K),
E[X], var(X).

(c) A crate contains M cartons, where M is geometrically distributed with parame-
ter p. The ith carton contains Ki widgets, where Ki is Poisson-distributed with
parameter μ. The weight of each widget is exponentially distributed with pa-
rameter λ. All these random variables are independent. Find the expected value
and variance of the total weight of a crate.

Problem 45.* Use transforms to show that the sum of a Poisson-distributed number
of independent, identically distributed Bernoulli random variables is Poisson.

Solution. Let N be a Poisson-distributed random variable with parameter λ. Let Xi,
i = 1, ..., N, be independent Bernoulli random variables with parameter p, and let

    L = X1 + ··· + XN

be the corresponding sum. The transform associated with L is found by starting with
the transform associated with N, which is

    M_N(s) = e^{λ(e^s - 1)},

and replacing each occurrence of e^s by the transform associated with Xi, which is

    M_X(s) = 1 - p + p e^s.

We obtain

    M_L(s) = e^{λ(1 - p + p e^s - 1)} = e^{λp(e^s - 1)}.

This is the transform associated with a Poisson random variable with parameter λp.
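
An empirical check of this "Poisson thinning" result (a sketch with arbitrary λ and p): the thinned counts should match a Poisson with parameter λp in both mean and variance.

```python
import numpy as np

rng = np.random.default_rng(0)
lam, p, runs = 5.0, 0.3, 500_000  # arbitrary example parameters

n = rng.poisson(lam, runs)
# Given N, the sum of N Bernoulli(p) variables is Binomial(N, p).
counts = rng.binomial(n, p)

# For a Poisson(lam*p) variable, mean and variance both equal lam*p = 1.5.
print(counts.mean(), counts.var(), lam*p)
```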
