
Multiple Random Variables

Statistics and Probability, MTH 281

Introduction

Multiple random variables are a fundamental concept in probability theory and statistics,
extending the idea of single random variables to scenarios involving several uncertain
quantities. They are typically written as a collection X1 , X2 , . . . , Xn , where each Xi is a
random variable. Mathematically, these variables can be discrete or continuous
and can have joint probability distributions, allowing us to describe the combined outcomes
of these variables. In this chapter, we will consider the case of two random variables.

1 Multiple discrete random variables

For two discrete random variables X and Y with possible values x1 , x2 , . . . , xm and y1 , y2 , . . . , yn ,
respectively, the joint probability mass function satisfies:

P (X = xi , Y = yj ) = pij for i = 1, 2, . . . , m and j = 1, 2, . . . , n.

1.1 Marginal Probability Mass Function

The marginal probability mass functions for X and Y are obtained by summing out the other
variable:

\[
P(X = x_i) = \sum_{j=1}^{n} P(X = x_i, Y = y_j)
\]

\[
P(Y = y_j) = \sum_{i=1}^{m} P(X = x_i, Y = y_j)
\]
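As a quick illustration, the marginals can be computed by summing the rows and columns of a joint probability table (a sketch with made-up numbers, not a table from the notes):

```python
# Sketch: marginal pmfs by summing the joint table (illustrative
# numbers of my own, not from the notes). Rows index y, columns index x.
joint = [
    [0.10, 0.20, 0.10],  # P(X = x, Y = 1) for x = 1, 2, 3
    [0.30, 0.10, 0.20],  # P(X = x, Y = 2) for x = 1, 2, 3
]

# P(X = x_i): sum over j, i.e. down each column
marginal_x = [sum(row[i] for row in joint) for i in range(len(joint[0]))]
# P(Y = y_j): sum over i, i.e. across each row
marginal_y = [sum(row) for row in joint]
```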

1.2 Joint Expected Value

For X and Y with possible values x1 , x2 , . . . , xm and y1 , y2 , . . . , yn respectively, and joint
probability mass function P (X = xi , Y = yj ) = pij , the joint expected value E(XY ) is
calculated as:

\[
E(XY) = \sum_{i=1}^{m} \sum_{j=1}^{n} x_i \, y_j \, P(X = x_i, Y = y_j)
\]

1.3 Covariance

Covariance is a statistical measure that describes the extent to which two random variables
change together, indicating the degree to which they move in relation to each other. The
covariance Cov(X, Y ) of random variables X and Y can be calculated using the formula:

Cov(X, Y ) = E(XY ) − E(X) · E(Y ).

Properties of Covariance

1. Cov(a, b) = 0 for any constants a and b.

2. Cov(aX, Y ) = a · Cov(X, Y ) for any constant a.

3. Cov(X + c, Y + d) = Cov(X, Y ) for any constants c and d.
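These properties can be checked numerically with the sample covariance, which shares the same bilinearity and shift invariance as the theoretical covariance (a sketch; the data and all names below are my own):

```python
import random

# Numerical illustration of the covariance properties (a sketch).
random.seed(0)
xs = [random.randint(1, 6) for _ in range(10_000)]
ys = [x + random.randint(0, 1) for x in xs]   # deliberately dependent on X

def cov(u, v):
    """Sample covariance of two equal-length sequences."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    return sum((a - mu) * (b - mv) for a, b in zip(u, v)) / len(u)

base  = cov(xs, ys)
prop1 = cov([5.0] * len(xs), [2.0] * len(ys))              # constants -> 0
prop2 = cov([3.0 * x for x in xs], ys)                     # = 3 * base
prop3 = cov([x + 7.0 for x in xs], [y - 4.0 for y in ys])  # = base
```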

Example 1

Let X and Y be two discrete random variables with the joint probabilities provided in the
table. Solve the following:

1. Calculate the covariance (Cov(X, Y )).

2. Compute V (X) and V (Y ).

  Y \ X        1      2      3    P(Y = yj)
  1          0.28   0.08   0.04      0.4
  2          0.08   0.20   0.32      0.6
  P(X = xi)  0.36   0.28   0.36       1

Solution

1. To calculate the covariance Cov(X, Y ) between the random variables X and Y , we use
the formula:
Cov(X, Y ) = E(XY ) − E(X) · E(Y ).

First, calculate E(X) and E(Y ):

E(X) = 1 · 0.36 + 2 · 0.28 + 3 · 0.36 = 2

E(Y ) = 1 · 0.4 + 2 · 0.6 = 1.6

Next, calculate E(XY ):

E(XY ) = 1 · 1 · 0.28 + 2 · 1 · 0.08 + 3 · 1 · 0.04 + 1 · 2 · 0.08 + 2 · 2 · 0.2 + 3 · 2 · 0.32 = 3.44

Finally, substitute into the formula:

Cov(X, Y ) = 3.44 − 2 · 1.6 = 0.24.

2. To compute the variances V (X) and V (Y ), we first find:

\[
E(X^2) = 1^2 \cdot 0.36 + 2^2 \cdot 0.28 + 3^2 \cdot 0.36 = 4.72
\]
\[
E(Y^2) = 1^2 \cdot 0.4 + 2^2 \cdot 0.6 = 2.8.
\]

Now, apply the variance formula for X and Y :

\[
V(X) = E(X^2) - [E(X)]^2 = 4.72 - 2^2 = 0.72
\]
\[
V(Y) = E(Y^2) - [E(Y)]^2 = 2.8 - 1.6^2 = 0.24.
\]
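The calculations in this example can be rechecked with a short script over the joint table (a sketch; the variable names are mine):

```python
# Quick numerical check of Example 1.
xs = [1, 2, 3]
ys = [1, 2]
# joint[j][i] = P(X = xs[i], Y = ys[j]), as in the table above.
joint = [
    [0.28, 0.08, 0.04],  # Y = 1
    [0.08, 0.20, 0.32],  # Y = 2
]

E_X  = sum(x * p for row in joint for x, p in zip(xs, row))
E_Y  = sum(y * sum(row) for y, row in zip(ys, joint))
E_XY = sum(x * y * p for row, y in zip(joint, ys) for x, p in zip(xs, row))
E_X2 = sum(x ** 2 * p for row in joint for x, p in zip(xs, row))
E_Y2 = sum(y ** 2 * sum(row) for y, row in zip(ys, joint))

cov   = E_XY - E_X * E_Y   # 0.24
var_x = E_X2 - E_X ** 2    # 0.72
var_y = E_Y2 - E_Y ** 2    # 0.24
```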

Example 2

Let X and Y be discrete random variables with the following joint probability mass function
(pmf):

\[
P(X = x, Y = y) =
\begin{cases}
kxy & \text{if } x = 1, 2, 3 \text{ and } y = 1, 2, 3 \\
0 & \text{otherwise.}
\end{cases}
\]

Determine the following:

1. The constant k for the joint probability mass function.

2. The means and variances of the random variables X and Y .

3. Calculate the covariance between X and Y (Cov(X, Y )).

Solution

1. To find the constant k, we use the property that the sum of probabilities over all possible
values must equal 1:
\[
\sum_{x=1}^{3} \sum_{y=1}^{3} P(X = x, Y = y) = 1
\]

This yields the equation:

\[
\sum_{x=1}^{3} \sum_{y=1}^{3} kxy = k \sum_{x=1}^{3} \sum_{y=1}^{3} xy = 1.
\]

Since \(\sum_{x=1}^{3} \sum_{y=1}^{3} xy = 1 + 2 + 3 + 2 + 4 + 6 + 3 + 6 + 9 = 36\), we can solve for k:

\[
36k = 1 \quad \Longrightarrow \quad k = \frac{1}{36}.
\]

2. i. Solving for E(X) and E(Y ):

\[
E(X) = \sum_{x=1}^{3} \sum_{y=1}^{3} x \cdot P(X = x, Y = y) = \frac{1}{36} \sum_{x=1}^{3} \sum_{y=1}^{3} x^2 y = \frac{1}{36} \cdot 84 \approx 2.33.
\]

Similarly, the mean of Y is given by:

\[
E(Y) = \sum_{x=1}^{3} \sum_{y=1}^{3} y \cdot P(X = x, Y = y) = \frac{1}{36} \sum_{x=1}^{3} \sum_{y=1}^{3} x y^2 = \frac{1}{36} \cdot 84 \approx 2.33.
\]

ii. Finding V (X) and V (Y ):

\[
V(X) = E(X^2) - [E(X)]^2, \qquad V(Y) = E(Y^2) - [E(Y)]^2.
\]

We should find E(X 2 ) and E(Y 2 ) as stated below:

\[
E(X^2) = \sum_{x=1}^{3} \sum_{y=1}^{3} x^2 \cdot P(X = x, Y = y) = \frac{1}{36} \sum_{x=1}^{3} \sum_{y=1}^{3} x^3 y = \frac{216}{36} = 6.
\]

Similarly,

\[
E(Y^2) = 6.
\]

Now, substituting the values,

\[
V(X) = 6 - (2.33)^2 \approx 0.57, \qquad V(Y) = 6 - (2.33)^2 \approx 0.57.
\]

3. Calculate the covariance between X and Y :

The covariance between X and Y (Cov(X, Y )) is calculated using the formula:

Cov(X, Y ) = E(XY ) − E(X) · E(Y )

To find E(XY ), we’ll use the joint probability mass function:

\[
E(XY) = \sum_{x=1}^{3} \sum_{y=1}^{3} xy \cdot P(X = x, Y = y) = \frac{1}{36} \sum_{x=1}^{3} \sum_{y=1}^{3} x^2 y^2 = \frac{196}{36} \approx 5.44.
\]

Now, substitute the values into the formula:

Cov(X, Y ) = 5.44 − (2.33)(2.33) ≈ 0.

In fact, since the pmf factorizes as kxy = (x/6)(y/6), X and Y are independent, so the
covariance is exactly 0; the small residual comes from rounding.
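These results can be confirmed with exact fractions (a sketch; the names are mine). Because the pmf factorizes in x and y, the covariance comes out exactly zero:

```python
from fractions import Fraction

# Exact check of Example 2: P(X=x, Y=y) = k*x*y on {1,2,3} x {1,2,3}.
total = sum(x * y for x in (1, 2, 3) for y in (1, 2, 3))   # 36
k = Fraction(1, total)

p = {(x, y): k * x * y for x in (1, 2, 3) for y in (1, 2, 3)}
E_X  = sum(x * pr for (x, y), pr in p.items())      # 84/36
E_Y  = sum(y * pr for (x, y), pr in p.items())      # 84/36
E_XY = sum(x * y * pr for (x, y), pr in p.items())  # 196/36
cov  = E_XY - E_X * E_Y                             # exactly 0
```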

Example 3

Consider an experiment of rolling a fair six-sided die twice. Let X be the random variable
representing the outcome of the first roll and Y be the random variable representing the
outcome of the second roll.

Determine the joint probability distribution for X and Y based on the outcomes of the die
rolls.

Solution

Distribution of X and Y: The possible values for both X and Y range from 1 to 6 (the
faces of a die). The joint probability distribution is as follows:

Since the two rolls are independent and each of the 36 ordered pairs (x, y) is equally likely,
the joint probability distribution is

\[
P(X = x, Y = y) = \frac{1}{36} \quad \text{for all } x, y \in \{1, 2, 3, 4, 5, 6\}.
\]

Homework: Continue the work of this example and calculate the covariance Cov(X, Y )
between the random variables X and Y .

2 Multiple continuous random variables

For two continuous random variables X and Y , the joint behaviour is described by a joint
probability density function fXY (x, y) = f (x, y), defined for all (x, y) in the joint domain,
which is nonnegative and integrates to 1 over that domain.

2.1 Marginal Probability Density Function

The marginal probability density functions for X and Y are obtained by integrating out the
other variable:

\[
f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dy
\]

\[
f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\, dx
\]

2.2 Joint Expected Value

For continuous random variables X and Y with joint probability density function fXY (x, y),
the joint expected value (E(XY )) is calculated as:

\[
E(XY) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x \, y \, f_{XY}(x, y)\, dy\, dx
\]

2.3 Covariance

Covariance measures the relationship between two variables and is calculated as:

Cov(X, Y ) = E(XY ) − E(X) · E(Y )

Example 1

Let X and Y be continuous random variables with the following joint probability density
function (pdf):

\[
f(x, y) =
\begin{cases}
kxy & \text{if } 0 \le x \le 2 \text{ and } 0 \le y \le 1 \\
0 & \text{otherwise.}
\end{cases}
\]

Determine the following:

1. Find the constant k to normalize the joint pdf.

2. Calculate the means and variances of the random variables X and Y .

3. Compute the covariance between X and Y .

Solution

1. Normalization Constant k:

To normalize the joint pdf, the total volume under the joint probability density function over
the entire valid domain must be equal to 1:

\[
\int_0^2 \int_0^1 kxy \, dy\, dx = 1
\]

Solving the integral:

\[
\int_0^2 \int_0^1 kxy \, dy\, dx = k \int_0^2 x \left[\frac{y^2}{2}\right]_0^1 dx = k \int_0^2 \frac{x}{2}\, dx = k \left[\frac{x^2}{4}\right]_0^2 = k \cdot \frac{4}{4} = 1
\]

Therefore, the normalization constant for the joint probability density function is k = 1.

2. Means and Variances:

\[
\mu_X = E(X) = \int_0^2 \int_0^1 x \cdot xy \, dy\, dx = \int_0^2 x^2 \left[\frac{y^2}{2}\right]_0^1 dx = \frac{1}{2} \int_0^2 x^2\, dx = \frac{1}{2} \left[\frac{x^3}{3}\right]_0^2 = \frac{1}{2} \cdot \frac{8}{3} = \frac{4}{3}.
\]

\[
\mu_Y = E(Y) = \int_0^2 \int_0^1 y \cdot xy \, dy\, dx = \int_0^2 x \left[\frac{y^3}{3}\right]_0^1 dx = \frac{1}{3} \int_0^2 x\, dx = \frac{1}{3} \left[\frac{x^2}{2}\right]_0^2 = \frac{1}{3} \cdot 2 = \frac{2}{3}.
\]

To find V (X) and V (Y ), we need to find:

\[
E(X^2) = \int_0^2 \int_0^1 x^2 \cdot xy \, dy\, dx = \int_0^2 x^3 \left[\frac{y^2}{2}\right]_0^1 dx = \frac{1}{2} \left[\frac{x^4}{4}\right]_0^2 = \frac{1}{2} \cdot 4 = 2.
\]

\[
E(Y^2) = \int_0^2 \int_0^1 y^2 \cdot xy \, dy\, dx = \int_0^2 x \left[\frac{y^4}{4}\right]_0^1 dx = \frac{1}{4} \left[\frac{x^2}{2}\right]_0^2 = \frac{1}{4} \cdot 2 = \frac{1}{2}.
\]

Now,

\[
V(X) = E(X^2) - [E(X)]^2 = 2 - \left(\frac{4}{3}\right)^2 \approx 0.22.
\]
\[
V(Y) = E(Y^2) - [E(Y)]^2 = \frac{1}{2} - \left(\frac{2}{3}\right)^2 \approx 0.05.
\]

3. Covariance:

\[
E(XY) = \int_0^2 \int_0^1 xy \cdot xy \, dy\, dx = \int_0^2 x^2 \left[\frac{y^3}{3}\right]_0^1 dx = \frac{1}{3} \int_0^2 x^2\, dx = \frac{1}{3} \left[\frac{x^3}{3}\right]_0^2 = \frac{1}{3} \cdot \frac{8}{3} = \frac{8}{9}.
\]

Given that E(X) = 4/3 and E(Y ) = 2/3, the covariance is

\[
Cov(X, Y) = \frac{8}{9} - \frac{4}{3} \cdot \frac{2}{3} = 0.
\]
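The moments in this example can also be verified numerically (a sketch using a midpoint Riemann sum with k = 1 as derived; N and the helper names are mine):

```python
# Numeric sanity check of Example 1: f(x, y) = x*y on [0,2] x [0,1].
N = 400
dx, dy = 2.0 / N, 1.0 / N

def integrate(g):
    """Midpoint-rule approximation of the double integral of
    g(x, y) * f(x, y) over the rectangle [0, 2] x [0, 1]."""
    total = 0.0
    for i in range(N):
        x = (i + 0.5) * dx
        for j in range(N):
            y = (j + 0.5) * dy
            total += g(x, y) * (x * y) * dx * dy
    return total

mass = integrate(lambda x, y: 1.0)    # total probability, ~1 (so k = 1)
E_X  = integrate(lambda x, y: x)      # ~4/3
E_Y  = integrate(lambda x, y: y)      # ~2/3
E_XY = integrate(lambda x, y: x * y)  # ~8/9
cov  = E_XY - E_X * E_Y               # ~0
```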

Example 2

Let X and Y be continuous random variables with the following joint probability density
function (pdf):

\[
f(x, y) =
\begin{cases}
kx^2 y & \text{if } 1 \le x \le 3 \text{ and } 0 \le y \le 2 \\
0 & \text{otherwise.}
\end{cases}
\]

Determine the following:

1. Find the constant k.

2. Compute the expected values E(X) and E(Y ), and find the variances V (X) and V (Y ).

3. Compute Cov(X, Y ).

Solution

1. Finding the value of k:

\[
\int_1^3 \int_0^2 kx^2 y \, dy\, dx = k \int_1^3 x^2 \left[\frac{y^2}{2}\right]_0^2 dx = k \int_1^3 2x^2\, dx = 2k \left[\frac{x^3}{3}\right]_1^3 = \frac{2k}{3} \cdot (27 - 1) = \frac{52k}{3} = 1
\]

Therefore, k = 3/52.

2. Means and Variances:

\[
\mu_X = E(X) = \frac{3}{52} \int_1^3 \int_0^2 x^3 y \, dy\, dx = \frac{3}{52} \int_1^3 x^3 \left[\frac{y^2}{2}\right]_0^2 dx = \frac{6}{52} \left[\frac{x^4}{4}\right]_1^3 = \frac{6}{52} \cdot 20 = \frac{120}{52} \approx 2.31.
\]

\[
\mu_Y = E(Y) = \frac{3}{52} \int_1^3 \int_0^2 x^2 y^2 \, dy\, dx = \frac{3}{52} \int_1^3 x^2 \left[\frac{y^3}{3}\right]_0^2 dx = \frac{3}{52} \cdot \frac{8}{3} \left[\frac{x^3}{3}\right]_1^3 = \frac{3}{52} \cdot \frac{8}{3} \cdot \frac{26}{3} = \frac{4}{3} \approx 1.33.
\]

To find V (X) and V (Y ), we need to find:

\[
E(X^2) = \frac{3}{52} \int_1^3 x^4 \left[\frac{y^2}{2}\right]_0^2 dx = \frac{6}{52} \left[\frac{x^5}{5}\right]_1^3 = \frac{6}{52} \cdot \frac{242}{5} \approx 5.6.
\]

\[
E(Y^2) = \frac{3}{52} \int_1^3 \int_0^2 x^2 y^3 \, dy\, dx = \frac{3}{52} \int_1^3 x^2 \left[\frac{y^4}{4}\right]_0^2 dx = \frac{12}{52} \left[\frac{x^3}{3}\right]_1^3 = \frac{12}{52} \cdot \frac{26}{3} = 2.
\]

Now,

\[
V(X) = E(X^2) - [\mu_X]^2 = 5.6 - (2.31)^2 \approx 0.26.
\]
\[
V(Y) = E(Y^2) - [\mu_Y]^2 = 2 - (1.33)^2 \approx 0.23.
\]

3. Covariance:

\[
E(XY) = \int_1^3 \int_0^2 xy \cdot \frac{3}{52} x^2 y \, dy\, dx = \frac{3}{52} \int_1^3 x^3 \left[\frac{y^3}{3}\right]_0^2 dx = \frac{3}{52} \cdot \frac{8}{3} \left[\frac{x^4}{4}\right]_1^3 = \frac{3}{52} \cdot \frac{8}{3} \cdot 20 = \frac{160}{52} \approx 3.08.
\]

The outcome of the covariance is

\[
Cov(X, Y) = 3.08 - 2.31 \cdot 1.33 \approx 0.
\]

In fact, because the joint pdf factorizes into a function of x times a function of y, X and Y
are independent, so Cov(X, Y ) = 0 exactly; the small residual comes from rounding.
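The constant and the moments in this example can be recomputed exactly with rational arithmetic (a sketch; `power_int` and `moment` are helper names of my own). Since f(x, y) = k x²y factorizes, every moment is a product of one-dimensional integrals:

```python
from fractions import Fraction as F

# Exact check of Example 2: f(x, y) = k * x^2 * y on [1,3] x [0,2].
def power_int(n, a, b):
    """Exact integral of t^n over [a, b]."""
    return (F(b) ** (n + 1) - F(a) ** (n + 1)) / (n + 1)

k = 1 / (power_int(2, 1, 3) * power_int(1, 0, 2))  # 3/52

def moment(a, b):
    """E[X^a Y^b] for the density k * x^2 * y."""
    return k * power_int(2 + a, 1, 3) * power_int(1 + b, 0, 2)

E_X, E_Y, E_XY = moment(1, 0), moment(0, 1), moment(1, 1)
cov = E_XY - E_X * E_Y   # exactly 0: the density factorizes in x and y
```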

Exercise

Let X and Y be continuous random variables with the following joint probability density
function (pdf):

\[
f(x, y) =
\begin{cases}
kx^2 y^3 & \text{if } 0 \le x \le 1 \text{ and } 1 \le y \le 2 \\
0 & \text{otherwise.}
\end{cases}
\]

Determine the following:

1. Find the constant k.

2. Compute Cov(X, Y ).
