Lecture 12 Slides
In the discrete case, f(x, y) = P(X = x and Y = y); that is, the values f(x, y) give the probability that
outcomes x and y occur at the same time.
Definition: The function f(x, y) is a joint probability distribution or joint probability mass
function of the discrete random variables X and Y if
1. f(x, y) ≥ 0 for all (x, y),
2. ∑_x ∑_y f(x, y) = 1.
The marginal distributions of X alone and of Y alone are
g(x) = ∑_y f(x, y)   and   h(y) = ∑_x f(x, y).
Example: The joint probability distribution of X and Y is given by f(x, y) = c(x + y), x = 0, 1, 2; y = 0, 1, 2, 3. (a) Find the value of c. (b) Find P(X ≥ 1, Y ≤ 2). (c) Find the marginal distributions g(x) and h(y). (d) Determine whether X and Y are independent. The table of f(x, y), with row and column totals, is:

f(x, y)          x = 0   x = 1   x = 2   Row totals
y = 0              0       c      2c        3c
y = 1              c      2c      3c        6c
y = 2             2c      3c      4c        9c
y = 3             3c      4c      5c       12c
Column totals     6c     10c     14c       ∑_x ∑_y f(x, y) = 30c
Solution: (a) ∑_{x=0}^{2} ∑_{y=0}^{3} f(x, y) = 1 ⇒ ∑_{x=0}^{2} ∑_{y=0}^{3} c(x + y) = 1 ⇒ 30c = 1 ⇒ c = 1/30.
(b) P(X ≥ 1, Y ≤ 2) = 15/30 = 1/2.
(c) g(x) = ∑_{y=0}^{3} f(x, y) = ∑_{y=0}^{3} c(x + y) = (1/15)(2x + 3), x = 0, 1, 2,
h(y) = ∑_{x=0}^{2} f(x, y) = ∑_{x=0}^{2} c(x + y) = (1/10)(y + 1), y = 0, 1, 2, 3.
(d) f(0, 0) = 0, while g(0)h(0) = (6/30)(3/30) ≠ 0, so f(0, 0) ≠ g(0)h(0) ⇒ X and Y are not independent.
Also, independence of the two random variables X and Y can be checked as follows:
f(x, y) = (1/30)(x + y) ≠ (1/15)(2x + 3) · (1/10)(y + 1) = g(x)h(y) ⇒ X and Y are not independent.
Similarly, we define
E(Y) = ∑_x ∑_y y f(x, y) = ∑_y y h(y)
(discrete case), where h(y) is the marginal distribution of Y.
Definition: Let X and Y be random variables with joint probability distribution f(x, y). The
covariance of X and Y is
σ_XY = E[(X − μ_X)(Y − μ_Y)] = ∑_x ∑_y (x − μ_X)(y − μ_Y) f(x, y).
Theorem: The covariance of two random variables X and Y with means µ X and µY , respectively,
is given by
Cov(X, Y) = σ_XY = E(XY) − μ_X μ_Y = E(XY) − E(X)E(Y).
• Cov(X, Y) = Cov(Y, X)
• Cov(X, c) = 0
• Cov(X, X) = Var (X)
• Cov(aX, bY) = ab Cov(X, Y)
• Cov(X ± a, Y ± b) = Cov(X, Y)
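A small check of the covariance theorem on the joint pmf f(x, y) = (x + y)/30 from the example above — an illustrative sketch using exact fractions, so both sides can be compared exactly:

```python
from fractions import Fraction
from itertools import product

# Verify E[(X - mu_X)(Y - mu_Y)] == E(XY) - E(X)E(Y) on f(x, y) = (x + y)/30.
f = {(x, y): Fraction(x + y, 30) for x, y in product(range(3), range(4))}

EX = sum(x * p for (x, y), p in f.items())
EY = sum(y * p for (x, y), p in f.items())
EXY = sum(x * y * p for (x, y), p in f.items())

# Covariance by the definition, and by the shortcut formula.
cov_def = sum((x - EX) * (y - EY) * p for (x, y), p in f.items())
cov_short = EXY - EX * EY
```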
• σ²_{aX+c} = a² σ²_X
• σ²_{X+c} = σ²_X
• σ²_{aX} = a² σ²_X
If X and Y are independent random variables, then
σ²_{aX±bY} = a² σ²_X + b² σ²_Y.
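For independent X and Y this formula can be checked by tabulating the distribution of Z = aX − bY directly; the two pmfs below are arbitrary choices made only for the check:

```python
from fractions import Fraction
from itertools import product

# Two independent discrete random variables (pmfs chosen arbitrarily).
pX = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}
pY = {0: Fraction(1, 3), 3: Fraction(2, 3)}

def var(p):
    # Variance of a one-dimensional pmf {value: probability}.
    m = sum(v * q for v, q in p.items())
    return sum((v - m) ** 2 * q for v, q in p.items())

a, b = 3, 4
# Under independence the joint pmf factorizes, so tabulate Z = aX - bY.
pZ = {}
for (x, qx), (y, qy) in product(pX.items(), pY.items()):
    z = a * x - b * y
    pZ[z] = pZ.get(z, Fraction(0)) + qx * qy

lhs = var(pZ)                            # Var(aX - bY), computed directly
rhs = a ** 2 * var(pX) + b ** 2 * var(pY)  # a^2 Var(X) + b^2 Var(Y)
```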
Theorem: Let X and Y be two independent random variables. Then E(XY) = E(X)E(Y).
Definition: The correlation coefficient of X and Y is
ρ_XY = σ_XY / (σ_X σ_Y),   −1 ≤ ρ_XY ≤ 1,
where
σ_X = √(E(X²) − E²(X)),   σ_Y = √(E(Y²) − E²(Y)),   σ_XY = E(XY) − E(X)E(Y).
• ρ(X, Y) = ρ(Y, X)
• ρ(X, X) = 1
• ρ(aX, bY) = ρ(X, Y) for ab > 0 (if ab < 0, the sign of ρ flips)
• ρ(X ± a, Y ± b) = ρ(X, Y)
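These invariance properties can be verified numerically; the sketch below applies a positive scaling plus shifts to the joint pmf f(x, y) = (x + y)/30 used earlier and confirms ρ is unchanged (names are illustrative):

```python
from fractions import Fraction
from itertools import product
from math import sqrt

# Joint pmf f(x, y) = (x + y)/30 on x = 0, 1, 2 and y = 0, 1, 2, 3.
f = {(x, y): Fraction(x + y, 30) for x, y in product(range(3), range(4))}

def rho(pmf):
    # Correlation coefficient rho = cov / (sigma_X * sigma_Y) of a joint pmf.
    EX = sum(x * p for (x, y), p in pmf.items())
    EY = sum(y * p for (x, y), p in pmf.items())
    EXY = sum(x * y * p for (x, y), p in pmf.items())
    EX2 = sum(x * x * p for (x, y), p in pmf.items())
    EY2 = sum(y * y * p for (x, y), p in pmf.items())
    return (EXY - EX * EY) / sqrt((EX2 - EX ** 2) * (EY2 - EY ** 2))

# Transform (X, Y) -> (2X + 1, 5Y - 3): positive scaling plus shifts.
f2 = {(2 * x + 1, 5 * y - 3): p for (x, y), p in f.items()}

r1, r2 = rho(f), rho(f2)  # equal, by the invariance properties above
```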
Example 12.1: Two ballpoint pens are selected at random from a box that contains 3 blue pens, 2
red pens, and 3 green pens. If X is the number of blue pens selected and Y is the
number of red pens selected, find
(a) the joint probability function f(x, y),
(b) P[(X, Y) ∈ A], where A is the region {(x, y)|x + y ≤ 1}.
(c) g(x) and h(y) then check independence of the two random variables X and Y.
(d) the correlation coefficient between X and Y.
Solution: The possible pairs of values (x, y) are (0, 0), (0, 1), (1, 0), (1, 1), (0, 2), and (2, 0).
(a) The number of ways of selecting 2 pens from the 8 in the box is C(8, 2) = 28, so
f(x, y) = C(3, x) C(2, y) C(3, 2 − x − y) / 28,   x = 0, 1, 2;  y = 0, 1, 2;  0 ≤ x + y ≤ 2.
(b) P[(X, Y) ∈ A] = P(X + Y ≤ 1) = f(0, 0) + f(0, 1) + f(1, 0) = 3/28 + 6/28 + 9/28 = 9/14.
(c) The marginal distributions are

x       0      1      2           y      0      1      2
g(x)   5/14  15/28   3/28        h(y)  15/28   3/7   1/28
From the table, we find the three probabilities f(0, 1), g(0), and h(1) to be
f(0, 1) = 3/14,
g(0) = ∑_{y=0}^{2} f(0, y) = 3/28 + 3/14 + 1/28 = 5/14,
h(1) = ∑_{x=0}^{2} f(x, 1) = 3/14 + 3/14 + 0 = 3/7.
Clearly, f(0, 1) ≠ g(0)h(1), so X and Y are not independent.
(d) E(X) = μ_X = ∑_{x=0}^{2} x g(x) = (0)(5/14) + (1)(15/28) + (2)(3/28) = 3/4,
and
E(Y) = μ_Y = ∑_{y=0}^{2} y h(y) = (0)(15/28) + (1)(3/7) + (2)(1/28) = 1/2.
E(X²) = (0²)(5/14) + (1²)(15/28) + (2²)(3/28) = 27/28
and
E(Y²) = (0²)(15/28) + (1²)(3/7) + (2²)(1/28) = 4/7,
⇒ σ²_X = 27/28 − (3/4)² = 45/112   and   σ²_Y = 4/7 − (1/2)² = 9/28.
E(XY) = ∑_{x=0}^{2} ∑_{y=0}^{2} x y f(x, y) = (1)(1) f(1, 1) = 3/14,
so
σ_XY = E(XY) − μ_X μ_Y = 3/14 − (3/4)(1/2) = −9/56.
Therefore, the correlation coefficient between X and Y is
ρ_XY = σ_XY / (σ_X σ_Y) = (−9/56) / √((45/112)(9/28)) = −1/√5.
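The whole of Example 12.1 can be checked with exact fractions; the joint pmf below follows the combinatorial counting in part (a). This is a verification sketch, not part of the original slides:

```python
from fractions import Fraction
from math import comb, isclose, sqrt

# Joint pmf for Example 12.1: choose 2 pens from 3 blue, 2 red, 3 green;
# X = number of blue, Y = number of red (multivariate hypergeometric counts).
f = {}
for x in range(3):
    for y in range(3):
        if x + y <= 2:
            f[(x, y)] = Fraction(
                comb(3, x) * comb(2, y) * comb(3, 2 - x - y), comb(8, 2))

EX = sum(x * p for (x, y), p in f.items())       # mu_X = 3/4
EY = sum(y * p for (x, y), p in f.items())       # mu_Y = 1/2
EXY = sum(x * y * p for (x, y), p in f.items())  # E(XY) = 3/14
EX2 = sum(x * x * p for (x, y), p in f.items())
EY2 = sum(y * y * p for (x, y), p in f.items())

cov = EXY - EX * EY                                  # -9/56
rho = cov / sqrt((EX2 - EX ** 2) * (EY2 - EY ** 2))  # -1/sqrt(5)
```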
Example 12.14: If X and Y are random variables with variances σ²_X = 2 and σ²_Y = 4 and covariance σ_XY = −2, find the variance of the random variable Z = 3X − 4Y + 8.
Solution:
σ²_Z = σ²_{3X−4Y+8} = σ²_{3X−4Y} = 9σ²_X + 16σ²_Y − 24σ_XY = 9(2) + 16(4) − 24(−2) = 130.
Similarly, let X and Y be independent random variables with σ²_X = 2 and σ²_Y = 3, and let Z = 3X − 2Y + 5.
Solution:
σ²_Z = σ²_{3X−2Y+5} = σ²_{3X−2Y} = 9σ²_X + 4σ²_Y (the covariance term vanishes because X and Y are independent) = 9(2) + 4(3) = 30.
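Both variance computations are instances of Var(aX + bY + c) = a² Var(X) + b² Var(Y) + 2ab Cov(X, Y); a minimal helper (the name `var_linear` is hypothetical, chosen for this sketch) makes the check explicit:

```python
# Var(aX + bY + c) = a^2 Var(X) + b^2 Var(Y) + 2ab Cov(X, Y);
# the additive constant c never affects the variance.
def var_linear(a, b, var_x, var_y, cov_xy=0):
    return a * a * var_x + b * b * var_y + 2 * a * b * cov_xy

# Example 12.14: Z = 3X - 4Y + 8 with Var(X) = 2, Var(Y) = 4, Cov(X, Y) = -2.
v1 = var_linear(3, -4, 2, 4, -2)  # 9*2 + 16*4 + 24*2 = 130

# Second example: Z = 3X - 2Y + 5 with X, Y independent, Var(X) = 2, Var(Y) = 3.
v2 = var_linear(3, -2, 2, 3)      # 9*2 + 4*3 = 30
```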