PROPERTIES OF EXPECTATIONS

Let X be a r.v. and let a, b be constants.

Then

(a) E(aX + b) = a E(X) + b

(b) Var(aX + b) = a² Var(X)

If X₁, X₂, …, Xₙ are any n r.v.s,

E(X₁ + X₂ + … + Xₙ) = E(X₁) + E(X₂) + … + E(Xₙ)

But if X₁, …, Xₙ are n independent r.v.s, then

Var(X₁ + X₂ + … + Xₙ) = Var(X₁) + Var(X₂) + … + Var(Xₙ)

In particular, if X, Y are independent,

Var(X + Y) = Var(X − Y) = Var(X) + Var(Y)

Please note: whether we add Y to X or subtract Y from X, we must always add their variances.

If X, Y are two r.v.s, we define their covariance as

Cov(X, Y) = E[(X − µ₁)(Y − µ₂)]

where µ₁ = E(X), µ₂ = E(Y).

Theorem: If X, Y are independent, then E(XY) = E(X)E(Y) and Cov(X, Y) = 0.
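These properties are easy to check numerically. Below is a minimal Python sketch (not part of the original notes; the distributions and constants a, b are arbitrary choices) that estimates both sides of each identity by simulation:

```python
# Monte Carlo check of E(aX + b) = aE(X) + b, Var(aX + b) = a^2 Var(X),
# and Var(X + Y) = Var(X - Y) = Var(X) + Var(Y) for independent X, Y.
import random
from statistics import mean, pvariance

random.seed(1)
n = 200_000
X = [random.expovariate(1 / 3) for _ in range(n)]   # E(X) = 3, Var(X) = 9
Y = [random.gauss(1, 2) for _ in range(n)]          # E(Y) = 1, Var(Y) = 4
a, b = 2, 5

print(mean(a * x + b for x in X), a * mean(X) + b)              # both near 11
print(pvariance([a * x + b for x in X]), a**2 * pvariance(X))   # both near 36
print(pvariance([x + y for x, y in zip(X, Y)]),                 # near 9 + 4 = 13
      pvariance([x - y for x, y in zip(X, Y)]))                 # also near 13
```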

Sample Mean

Let X₁, X₂, …, Xₙ be n independent r.v.s, each having the same mean µ and the same variance σ².

We define

X̄ = (X₁ + X₂ + … + Xₙ)/n

X̄ is called the mean of the r.v.s X₁, …, Xₙ. Please note that X̄ is also a r.v.

Theorem

1. E(X̄) = µ

2. Var(X̄) = σ²/n.

Proof

(i) E(X̄) = (1/n)[E(X₁) + E(X₂) + … + E(Xₙ)]

= (1/n)(µ + µ + … + µ)   (n times)

= µ

(ii) Var(X̄) = (1/n²)[Var(X₁) + Var(X₂) + … + Var(Xₙ)]   (as the variables are independent)

= (1/n²)(σ² + σ² + … + σ²)   (n times)

= nσ²/n² = σ²/n
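As a sanity check, the following Python sketch (illustrative only; µ = 10, σ = 3, n = 25 are arbitrary choices) repeatedly draws samples of size n and compares the mean and variance of the resulting sample means with µ and σ²/n:

```python
# Simulation check of E(Xbar) = mu and Var(Xbar) = sigma^2 / n.
import random
from statistics import mean, pvariance

random.seed(2)
mu, sigma, n, trials = 10.0, 3.0, 25, 50_000
xbars = [mean(random.gauss(mu, sigma) for _ in range(n)) for _ in range(trials)]

print(mean(xbars))        # close to mu = 10
print(pvariance(xbars))   # close to sigma^2 / n = 9 / 25 = 0.36
```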

Sample Variance

Let X₁, …, Xₙ be n independent r.v.s, each having the same mean µ and the same variance σ². Let

X̄ = (X₁ + X₂ + … + Xₙ)/n be their sample mean. We define the sample variance as

S² = [1/(n − 1)] Σ_{i=1}^{n} (Xᵢ − X̄)²

Note that S² is also a r.v.

E(S²) = σ²

Proof. Read it on page 179.
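A quick illustrative check in Python (σ² = 4 and n = 10 are arbitrary): averaging S² over many samples lands near σ² when the n − 1 divisor is used, but not when dividing by n:

```python
# The n-1 divisor makes S^2 unbiased: E(S^2) ≈ sigma^2.
import random
from statistics import mean, variance, pvariance   # variance uses n-1, pvariance uses n

random.seed(3)
n, trials = 10, 50_000
samples = [[random.gauss(0, 2) for _ in range(n)] for _ in range(trials)]   # sigma^2 = 4

print(mean(variance(s) for s in samples))    # close to 4.0
print(mean(pvariance(s) for s in samples))   # close to (n-1)/n * 4 = 3.6 (biased low)
```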

Simulation

To simulate the values taken by a continuous r.v. X, we have to use the following
theorem.

Theorem

Let X be a continuous r.v. with density f(x) and cumulative distribution function F(x). Let U = F(X). Then U is a r.v. having the uniform distribution on (0, 1).

In other words, U is a random number. Thus, to simulate a value taken by X, we take a random number U from Table 7 (you must put a decimal point before the number) and solve for X the equation

F(X) = U
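This recipe can be written as a small Python sketch (our own illustration; the inverse_cdf argument and the example density are hypothetical): pick a random number U and return F⁻¹(U).

```python
# Generic inverse-transform sampling: solving F(X) = U means X = F^{-1}(U).
import random

def simulate(inverse_cdf, n, rng=random.Random(0)):
    """Return n simulated values of X, where inverse_cdf is F^{-1}."""
    return [inverse_cdf(rng.random()) for _ in range(n)]

# Example: F(x) = x^2 on (0, 1) has F^{-1}(u) = sqrt(u).
print(simulate(lambda u: u ** 0.5, 5))
```

The examples that follow are instances of this one-liner with a specific F⁻¹.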

Example 24

Let X have uniform density on (α, β). Simulate values of X using the 3-digit random numbers

937, 133, 753, 503, …..

Solution

Since X has uniform density on (α, β), its density is

f(x) = 1/(β − α),   α < x < β
     = 0,           elsewhere

Thus the cumulative distribution function is

F(x) = 0,                   x ≤ α
     = (x − α)/(β − α),     α < x ≤ β
     = 1,                   x > β

F(X) = U means (X − α)/(β − α) = U

∴ X = α + (β − α)U

Hence if U = .937, X = α + (β − α)(.937)
      if U = .133, X = α + (β − α)(.133)
etc.
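In code, Example 24 might look like the following Python sketch; the endpoints α = 2 and β = 6 are assumed purely for illustration and are not part of the example:

```python
# Example 24: X = alpha + (beta - alpha) * U applied to the listed table values.
alpha, beta = 2.0, 6.0                 # assumed endpoints, not from the notes
table = [937, 133, 753, 503]           # 3-digit random numbers from the text
for r in table:
    U = r / 1000                       # put a decimal point before the number
    X = alpha + (beta - alpha) * U
    print(f"U = {U:.3f}  ->  X = {X:.3f}")
```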

Now let X have exponential density (with parameter β):

f(x) = (1/β) e^(−x/β),   x > 0
     = 0,                elsewhere

Hence the cumulative distribution function is

F(x) = 0,               x ≤ 0
     = 1 − e^(−x/β),    x > 0

Thus, solving F(X) = U, i.e. 1 − e^(−X/β) = U, for X, we get

X = β ln[1/(1 − U)]

Since U is a random number, 1 − U is also a random number, so we can just as well use the formula

X = β ln(1/U) = −β ln U.

Example 25

X has exponential density with parameter 2. Simulate a few values of X.

Solution

The defining equation for X is

X = −2 ln U

Taking 3-digit random numbers from Table 7, page 595, row 21, col. 3, we get the random numbers 913, 516, 692, 007, etc.

The corresponding X values are:

−2 ln(.913), −2 ln(.516), −2 ln(.692), …
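The same computation in Python (an illustrative sketch of Example 25):

```python
# Example 25: X = -2 ln U applied to the listed table values.
import math

for r in [913, 516, 692, 7]:           # 913, 516, 692, 007 from Table 7
    U = r / 1000
    print(f"U = {U:.3f}  ->  X = {-2 * math.log(U):.3f}")
```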

Example 26

The density of a rv X is given by

f(x) = |x|,   −1 < x < 1
     = 0,     elsewhere

Simulate a few values of X.

Solution

First let us find the cumulative distribution function F(x).

Case (i) x ≤ −1. In this case F(x) = 0.

Case (ii) − 1 < x ≤ 0.


F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{−1}^{x} |t| dt

= ∫_{−1}^{x} (−t) dt = (1 − x²)/2

Case (iii) 0 < x ≤ 1

In this case F(x) = ∫_{−∞}^{x} f(t) dt

= ∫_{−∞}^{−1} 0 dt + ∫_{−1}^{0} (−t) dt + ∫_{0}^{x} t dt

= 0 + 1/2 + x²/2 = (1 + x²)/2

Case (iv) x > 1. In this case F(x) = 1.

Thus

F(x) = 0,              x ≤ −1
     = (1 − x²)/2,    −1 < x ≤ 0
     = (1 + x²)/2,     0 < x ≤ 1
     = 1,              x > 1

To simulate a value of X, we have to solve the equation F(X) = U for X.

Case (i) 0 ≤ U < 1/2

In this case we use the equation

F(X) = (1 − X²)/2 = U   (why?)

∴ X = −√(1 − 2U)   (why?)

Case (ii) 1/2 ≤ U < 1

In this case we solve for X the equation

F(X) = (1 + X²)/2 = U

∴ X = +√(2U − 1)
Thus the defining conditions are:

If 0 ≤ U < 1/2,   X = −√(1 − 2U)
and
If 1/2 ≤ U < 1,   X = +√(2U − 1)

Let us consider the 3-digit random numbers on page 594, Row 17, Col. 5:

726, 282, 272, 022, …

U = .726 ≥ 1/2.   Thus X = +√(2 × .726 − 1) = 0.672

U = .281 < 1/2.   Thus X = −√(1 − 2 × .281) = −0.662
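The piecewise rule derived above translates directly into a short Python sketch (our own illustration of Example 26):

```python
# Example 26: piecewise inverse CDF for the density f(x) = |x| on (-1, 1).
import math

def simulate_x(U):
    return -math.sqrt(1 - 2 * U) if U < 0.5 else math.sqrt(2 * U - 1)

for r in [726, 282, 272, 22]:          # table values 726, 282, 272, 022
    U = r / 1000
    print(f"U = {U:.3f}  ->  X = {simulate_x(U):+.3f}")
```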

Note: Most computers have built-in programs that generate random deviates from important distributions. In particular, we can invoke random deviates from a standard normal distribution. You may also want to study how to simulate values from a standard normal distribution by the Box-Muller-Marsaglia method given on page 190 of the textbook.

Example 27

Suppose the number of hours it takes a person to learn how to operate a certain machine is a random variable having a normal distribution with µ = 5.8 and σ = 1.2. Suppose it takes two persons to operate the machine. Simulate the time it takes four pairs of persons to learn how to operate the machine. That is, for each pair, calculate the maximum of the two learning times.

Solution

We use the Box-Muller-Marsaglia method to generate pairs of values z₁, z₂ taken by a standard normal distribution. Then we use the formulas

x₁ = µ + σ z₁
x₂ = µ + σ z₂

to simulate the times taken by a pair of persons (where µ = 5.8, σ = 1.2).

We start with the random numbers from Table 7, page 593, Row 19, Column 4:

729, 016, 672, 823, 375, 556, 424, 854

Note

z₁ = √(−2 ln u₂) cos(2π u₁)

z₂ = √(−2 ln u₂) sin(2π u₁)

The angles are expressed in radians.

U₁       U₂       Z₁        Z₂        X₁       X₂
.729     .016     −0.378    −0.991    5.346    4.611

etc.
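A Python sketch of Example 27 (illustrative only): it applies the z₁, z₂ formulas above to consecutive pairs of table values and reports max(x₁, x₂) for each pair of persons.

```python
# Box-Muller transform of table values into standard normal pairs,
# then x = mu + sigma * z; each pair contributes its maximum learning time.
import math

mu, sigma = 5.8, 1.2
table = [729, 16, 672, 823, 375, 556, 424, 854]     # Table 7 values (016 -> 16)
for i in range(0, len(table), 2):
    u1, u2 = table[i] / 1000, table[i + 1] / 1000
    r = math.sqrt(-2 * math.log(u2))                # Box-Muller radius
    z1, z2 = r * math.cos(2 * math.pi * u1), r * math.sin(2 * math.pi * u1)
    x1, x2 = mu + sigma * z1, mu + sigma * z2       # learning times of the pair
    print(f"pair learning time = {max(x1, x2):.3f} hours")
```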

Review Exercises

5.108. If the probability density of a r.v. X is given by

f(x) = k(1 − x²),   0 < x < 1
     = 0,           elsewhere

Find the value of k and the probabilities

(a) P(0.1 < X < 0.2)

(b) P(X > 0.5)

Solution

∫_{−∞}^{∞} f(x) dx = 1 gives ∫_{0}^{1} k(1 − x²) dx = 1

or k(1 − 1/3) = 1

∴ k = 3/2
The cumulative distribution function F(x) of X is:

Case (i) x ≤ 0: F(x) = 0

Case (ii) 0 < x ≤ 1:  F(x) = ∫_{0}^{x} k(1 − t²) dt = (3/2)(x − x³/3)

Case (iii) x > 1: F(x) = 1

∴ P(0.1 < X < 0.2) = F(0.2) − F(0.1)

= (3/2)[0.2 − (0.2)³/3] − (3/2)[0.1 − (0.1)³/3] = 0.1465

P(X > 0.5) = 1 − P(X ≤ 0.5)

= 1 − F(0.5) = 1 − (3/2)[0.5 − (0.5)³/3] = 0.3125
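A quick numeric check of these two answers (our own Python sketch, using the F(x) found above):

```python
# 5.108: with k = 3/2, F(x) = (3/2)(x - x^3/3) on (0, 1).
def F(x):
    return 1.5 * (x - x ** 3 / 3)

print(F(0.2) - F(0.1))   # P(0.1 < X < 0.2) = 0.1465
print(1 - F(0.5))        # P(X > 0.5) = 0.3125
```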

5.113: The burning time X of an experimental rocket is a r.v. having the normal distribution with µ = 4.76 sec and σ = 0.04 sec. What is the prob that this kind of rocket will burn
(a) less than 4.66 sec
(b) more than 4.80 sec
(c) anywhere from 4.70 to 4.82 sec?

Solution

(a) P(X < 4.66) = P[(X − µ)/σ < (4.66 − 4.76)/0.04]

= P(Z < −2.5) = 1 − P(Z < 2.5)

= 1 − F(2.5) = 1 − 0.9938 = 0.0062

(b) P(X > 4.80) = P[(X − µ)/σ > (4.80 − 4.76)/0.04]

= P(Z > 1) = 1 − F(1) = 1 − 0.8413 = 0.1587

(c) P(4.70 < X < 4.82)

= P[(4.70 − 4.76)/0.04 < (X − µ)/σ < (4.82 − 4.76)/0.04]

= P(−1.5 < Z < 1.5)

= 2F(1.5) − 1 = 2 × 0.9332 − 1 = 0.8664
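These probabilities can also be checked without the table, using the standard normal CDF written with math.erf; a short illustrative Python sketch:

```python
# 5.113: normal probabilities via the standard normal CDF (table stand-in).
import math

def Phi(z):                       # standard normal CDF
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

mu, sigma = 4.76, 0.04
print(Phi((4.66 - mu) / sigma))                               # (a) ≈ 0.0062
print(1 - Phi((4.80 - mu) / sigma))                           # (b) ≈ 0.1587
print(Phi((4.82 - mu) / sigma) - Phi((4.70 - mu) / sigma))    # (c) ≈ 0.8664
```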

5.11: The time (in milliseconds) between emissions of beta particles is a r.v. X having the exponential density

f(x) = 0.25 e^(−0.25x),   x > 0
     = 0,                 elsewhere

Find the probability that

(a) The time to observe a particle is more than 200 microseconds (= 200 × 10⁻³ milliseconds)
(b) The time to observe a particle is less than 10 microseconds

Solution

(a) P(> 200 microsec) = P(X > 200 × 10⁻³ millisec)

= ∫_{200×10⁻³}^{∞} 0.25 e^(−0.25x) dx = [−e^(−0.25x)]_{200×10⁻³}^{∞}

= e^(−50×10⁻³) = e^(−0.05) ≈ 0.9512

(b) P(X < 10 microseconds) = P(X < 10 × 10⁻³)

= ∫_{0}^{10×10⁻³} 0.25 e^(−0.25x) dx = [−e^(−0.25x)]_{0}^{10×10⁻³}

= 1 − e^(−2.5×10⁻³) ≈ 0.0025
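Numeric values of the two answers (an illustrative Python check):

```python
# 5.11: evaluate the two exponential-tail probabilities.
import math

print(math.exp(-0.25 * 200e-3))       # (a) e^{-0.05}        ≈ 0.9512
print(1 - math.exp(-0.25 * 10e-3))    # (b) 1 - e^{-0.0025}  ≈ 0.0025
```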

5.120: If n salespeople are employed in a door-to-door selling campaign, the gross sales volume in thousands of dollars may be regarded as a r.v. having the Gamma distribution with α = 100√n and β = 1/2. If the sales costs are $5,000 per salesperson, how many salespersons should be employed to maximize the profit?

Solution

For a Gamma distribution, µ = αβ = 50√n. Thus (in thousands of dollars) the "average" profit when n persons are employed is

T = 50√n − 5n   (5 × 1000 dollars is the cost per person)

This is a maximum (using calculus) when n = 25.
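A brute-force check of the maximization (illustrative Python sketch; the search range 1 to 100 is an arbitrary window):

```python
# 5.120: evaluate T(n) = 50*sqrt(n) - 5n over integer staffing levels.
import math

profit = {n: 50 * math.sqrt(n) - 5 * n for n in range(1, 101)}
best = max(profit, key=profit.get)
print(best, profit[best])   # n = 25, T = 125 (thousand dollars)
```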

5.122: Let the times to breakdown for the processors of a parallel processing machine
have joint density

f(x, y) = 0.04 e^(−0.2x − 0.2y),   x > 0, y > 0
        = 0,                       elsewhere

where X is the time for the first processor and Y is the time for the second processor. Find

(a) The marginal distributions and their means
(b) The expected value of the sum of X and Y
(c) Verify that the mean of a sum is the sum of the means.

Solution

(a) Marginal density of X

= g(x) = ∫_{y=−∞}^{∞} f(x, y) dy = ∫_{y=0}^{∞} 0.04 e^(−0.2x − 0.2y) dy

= 0.2 e^(−0.2x) ∫_{y=0}^{∞} 0.2 e^(−0.2y) dy = 0.2 e^(−0.2x),   x > 0

(and = 0 if x ≤ 0)

By symmetry, the marginal distribution of Y is

h(y) = 0.2 e^(−0.2y),   y > 0
     = 0,               elsewhere

Since X (and Y) have exponential distributions (with parameter 1/0.2 = 5), E(X) = E(Y) = 5.

Since f(x, y) = g(x) h(y), X and Y are independent.

(b) E(X + Y) = ∫_{−∞}^{∞} ∫_{−∞}^{∞} (x + y) f(x, y) dy dx

= ∫_{x=0}^{∞} ∫_{y=0}^{∞} (x + y)(0.04) e^(−0.2x − 0.2y) dy dx

= ∫_{x=0}^{∞} ∫_{y=0}^{∞} x (0.04) e^(−0.2x − 0.2y) dy dx + ∫_{x=0}^{∞} ∫_{y=0}^{∞} y (0.04) e^(−0.2x − 0.2y) dy dx

= 5 + 5 = 10   (verify!)

= E(X) + E(Y)
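Because X and Y turned out to be independent exponentials with mean 5 each, the result E(X + Y) = 10 can also be checked by simulation; a short illustrative Python sketch:

```python
# 5.122: simulate X and Y from their exponential marginals and check E(X + Y).
import random
from statistics import mean

random.seed(4)
n = 200_000
xs = [random.expovariate(0.2) for _ in range(n)]   # marginal of X: 0.2 e^{-0.2x}
ys = [random.expovariate(0.2) for _ in range(n)]   # marginal of Y: 0.2 e^{-0.2y}
print(mean(xs), mean(ys), mean(x + y for x, y in zip(xs, ys)))   # ≈ 5, 5, 10
```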

5.123: Two random variables are independent and each has a binomial distribution with success prob 0.7 and 2 trials.

(a) Find the joint prob distribution.
(b) Find the prob that the second variable is greater than the first.

Solution

Let X, Y be independent and have binomial distributions with parameters n = 2 and p = 0.7. Thus

P(X = k) = C(2, k)(0.7)^k (0.3)^(2−k),   k = 0, 1, 2

P(Y = r) = C(2, r)(0.7)^r (0.3)^(2−r),   r = 0, 1, 2

∴ P(X = k, Y = r) = P(X = k) P(Y = r)   (as X, Y are independent)

= C(2, k) C(2, r)(0.7)^(k+r) (0.3)^(4−(k+r)),   0 ≤ k, r ≤ 2

(b) P(Y > X)

= P(Y = 2, X = 0 or 1) + P(Y = 1, X = 0)

= C(2, 2)(0.7)²(0.3)⁰ [C(2, 0)(0.7)⁰(0.3)² + C(2, 1)(0.7)¹(0.3)¹] + C(2, 1)(0.7)¹(0.3)¹ · C(2, 0)(0.7)⁰(0.3)²

= 0.49 × 0.51 + 0.42 × 0.09 = 0.2877
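The same probability computed exhaustively in Python (our own check):

```python
# 5.123: build the joint table from the two binomial pmfs and sum over r > k.
from math import comb

def pmf(k, n=2, p=0.7):
    return comb(n, k) * p ** k * (1 - p) ** (n - k)

joint = {(k, r): pmf(k) * pmf(r) for k in range(3) for r in range(3)}
print(sum(prob for (k, r), prob in joint.items() if r > k))   # 0.2877
```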

5.124: If X₁ has mean −5 and variance 3 while X₂ has mean 1 and variance 4, and the two are independent, find

(a) E(3X₁ + 5X₂ + 2)

(b) Var(3X₁ + 5X₂ + 2)

Ans:

(a) 3(−5) + 5(1) + 2 = −8

(b) 9 × 3 + 25 × 4 = 127
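A two-line arithmetic check in Python (illustrative, using the formulas from the start of this section):

```python
# 5.124: E(3X1 + 5X2 + 2) and Var(3X1 + 5X2 + 2) for independent X1, X2.
print(3 * (-5) + 5 * 1 + 2)    # expectation = -8
print(9 * 3 + 25 * 4)          # variance    = 127
```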
