Multiple Random Variables
Introduction
Multiple random variables are a fundamental concept in probability theory and statistics,
extending the idea of a single random variable to scenarios involving several uncertain
quantities. They are represented as a set of variables, typically denoted X_1, X_2, \dots, X_n,
where each X_i is a random variable. These variables can be discrete or continuous and can
have joint probability distributions, which describe the combined outcomes of the variables.
In this chapter, we consider the case of two random variables.
1 Multiple discrete random variables

1.1 Joint Probability Mass Function

For two discrete random variables X and Y with possible values x_1, x_2, \dots, x_m and y_1, y_2, \dots, y_n,
respectively, the joint probability mass function satisfies

P(X = x_i, Y = y_j) \ge 0 \quad \text{and} \quad \sum_{i=1}^{m} \sum_{j=1}^{n} P(X = x_i, Y = y_j) = 1.

The marginal probability mass functions for X and Y are obtained by summing out the other
variable:
P(X = x_i) = \sum_{j=1}^{n} P(X = x_i, Y = y_j)

P(Y = y_j) = \sum_{i=1}^{m} P(X = x_i, Y = y_j)

1.2 Joint Expected Value

For discrete random variables X and Y with joint pmf P(X = x_i, Y = y_j), the joint expected
value is

E(XY) = \sum_{i=1}^{m} \sum_{j=1}^{n} x_i \, y_j \, P(X = x_i, Y = y_j)
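The marginal sums above are mechanical enough to delegate to a few lines of code. The sketch below (a Python illustration added here, with a hypothetical 2 x 3 joint pmf that is not taken from the text) computes both marginals from a joint table stored as a dictionary:

```python
# Marginalizing a joint pmf stored as a dictionary. The table is an
# illustrative (hypothetical) example, not a table from these notes.
joint = {
    (1, 1): 0.1, (1, 2): 0.2, (1, 3): 0.1,
    (2, 1): 0.2, (2, 2): 0.1, (2, 3): 0.3,
}

def marginal(joint, axis):
    """Sum out the other variable: axis=0 gives P(X=x), axis=1 gives P(Y=y)."""
    out = {}
    for pair, p in joint.items():
        out[pair[axis]] = out.get(pair[axis], 0.0) + p
    return out

pX = marginal(joint, 0)  # P(X=1) = 0.4, P(X=2) = 0.6
pY = marginal(joint, 1)  # P(Y=1) = 0.3, P(Y=2) = 0.3, P(Y=3) = 0.4
```

Each marginal necessarily sums to 1, since it redistributes the same total probability mass.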
1.3 Covariance

Covariance is a statistical measure that describes the extent to which two random variables
change together, indicating the degree to which they move in relation to each other. The
covariance Cov(X, Y) of random variables X and Y can be calculated using the formula:

Cov(X, Y) = E[(X - E(X))(Y - E(Y))] = E(XY) - E(X) \cdot E(Y)

Properties of Covariance

- Cov(X, X) = V(X)
- Cov(X, Y) = Cov(Y, X)
- Cov(aX + b, cY + d) = ac \, Cov(X, Y) for constants a, b, c, d
- If X and Y are independent, then Cov(X, Y) = 0 (the converse does not hold in general)
Example 1

Let X and Y be two discrete random variables with the joint probabilities provided in the
table. Solve the following:

1. Calculate the covariance Cov(X, Y).
2. Compute the variances V(X) and V(Y).

   Y \ X        1       2       3     P(Y = y_j)
     1        0.28    0.08    0.04      0.4
     2        0.08    0.20    0.32      0.6
 P(X = x_i)   0.36    0.28    0.36       1
Solution

1. To calculate the covariance Cov(X, Y) between random variables X and Y, we have the
following formula:

Cov(X, Y) = E(XY) - E(X) \cdot E(Y)

From the marginals,

E(X) = 1(0.36) + 2(0.28) + 3(0.36) = 2, \qquad E(Y) = 1(0.4) + 2(0.6) = 1.6,

and from the joint table,

E(XY) = 1(0.28) + 2(0.08) + 3(0.04) + 2(0.08) + 4(0.2) + 6(0.32) = 3.44.

Therefore Cov(X, Y) = 3.44 - 2 \cdot 1.6 = 0.24.

2. To compute the variances of X and Y, V(X) and V(Y), we first need to find

E(X^2) = 1(0.36) + 4(0.28) + 9(0.36) = 4.72, \qquad E(Y^2) = 1(0.4) + 4(0.6) = 2.8,

so that

V(X) = E(X^2) - [E(X)]^2 = 4.72 - 4 = 0.72, \qquad V(Y) = E(Y^2) - [E(Y)]^2 = 2.8 - 2.56 = 0.24.
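As a quick sanity check, the table computations of Example 1 can be reproduced in a few lines of Python (a sketch added to the notes, not part of the original text):

```python
# Sanity check of Example 1: expectations from the joint pmf table.
joint = {
    (1, 1): 0.28, (2, 1): 0.08, (3, 1): 0.04,
    (1, 2): 0.08, (2, 2): 0.20, (3, 2): 0.32,
}

E_X  = sum(x * p for (x, y), p in joint.items())      # 2.0
E_Y  = sum(y * p for (x, y), p in joint.items())      # 1.6
E_XY = sum(x * y * p for (x, y), p in joint.items())  # 3.44
E_X2 = sum(x * x * p for (x, y), p in joint.items())  # 4.72
E_Y2 = sum(y * y * p for (x, y), p in joint.items())  # 2.8

cov = E_XY - E_X * E_Y   # 3.44 - 2.0 * 1.6 = 0.24
V_X = E_X2 - E_X ** 2    # 4.72 - 4.00      = 0.72
V_Y = E_Y2 - E_Y ** 2    # 2.80 - 2.56      = 0.24
```

The positive covariance (0.24) reflects the fact that larger values of X tend to occur with Y = 2 in this table.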
Example 2

Let X and Y be discrete random variables with the following joint probability mass function
(pmf):

P(X, Y) = kxy   if x = 1, 2, 3 and y = 1, 2, 3,
P(X, Y) = 0     otherwise.

1. Find the constant k.
2. Compute E(X) and E(Y).
3. Compute V(X) and V(Y).
4. Compute Cov(X, Y).
Solution

1. To find the constant k, we use the property that the sum of probabilities over all possible
values must equal 1:

\sum_{x=1}^{3} \sum_{y=1}^{3} P(X = x, Y = y) = 1
This yields the equation

\sum_{x=1}^{3} \sum_{y=1}^{3} kxy = 1.

Since the sum of xy for x = 1, 2, 3 and y = 1, 2, 3 is \sum_{x=1}^{3} \sum_{y=1}^{3} xy = 1 + 2 + 3 + 2 + 4 + 6 + 3 + 6 + 9 = 36, we can solve for k:

k \cdot 36 = 1, \qquad k = \frac{1}{36}.

2. For the expected values,

E(X) = \sum_{x=1}^{3} \sum_{y=1}^{3} x \cdot P(X = x, Y = y) = \frac{1}{36} \sum_{x=1}^{3} \sum_{y=1}^{3} x^2 y = \frac{1}{36} \cdot 84 \approx 2.33.

E(Y) = \sum_{x=1}^{3} \sum_{y=1}^{3} y \cdot P(X = x, Y = y) = \frac{1}{36} \sum_{x=1}^{3} \sum_{y=1}^{3} x y^2 = \frac{1}{36} \cdot 84 \approx 2.33.
3. For the variances, recall that

V(Y) = E(Y^2) - [E(Y)]^2.

E(X^2) = \sum_{x=1}^{3} \sum_{y=1}^{3} x^2 \cdot P(X = x, Y = y) = \frac{1}{36} \sum_{x=1}^{3} \sum_{y=1}^{3} x^3 y = \frac{216}{36} = 6.

Similarly, E(Y^2) = 6, so

V(Y) = 6 - (2.33)^2 \approx 0.57,

and by symmetry V(X) \approx 0.57 as well.

4. Finally,

E(XY) = \sum_{x=1}^{3} \sum_{y=1}^{3} xy \cdot P(X = x, Y = y) = \frac{1}{36} \sum_{x=1}^{3} \sum_{y=1}^{3} x^2 y^2 = \frac{196}{36} \approx 5.44,

so

Cov(X, Y) = \frac{196}{36} - \left( \frac{84}{36} \right)^2 = \frac{49}{9} - \frac{49}{9} = 0.

This is expected: P(X, Y) = \frac{xy}{36} = \frac{x}{6} \cdot \frac{y}{6} factorizes, so X and Y are independent.
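The whole example can be verified exactly with Python's `fractions` module (a sketch added to the notes, not part of the original text):

```python
from fractions import Fraction

# Exact verification of Example 2: P(X=x, Y=y) = k*x*y with k = 1/36.
k = Fraction(1, 36)
pmf = {(x, y): k * x * y for x in (1, 2, 3) for y in (1, 2, 3)}

total = sum(pmf.values())                           # 1 (normalization)
E_X   = sum(x * p for (x, y), p in pmf.items())     # 84/36 = 7/3
E_Y   = sum(y * p for (x, y), p in pmf.items())     # 7/3 by symmetry
E_XY  = sum(x * y * p for (x, y), p in pmf.items()) # 196/36 = 49/9
cov   = E_XY - E_X * E_Y                            # 49/9 - 49/9 = 0

print(total, E_X, cov)
```

Working in exact rational arithmetic makes the conclusion Cov(X, Y) = 0 unambiguous, with no floating-point rounding involved.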
Example 3
Consider an experiment of rolling a fair six-sided die twice. Let X be the random variable
representing the outcome of the first roll and Y be the random variable representing the
outcome of the second roll.
Determine the joint probability distribution for X and Y based on the outcomes of the die
rolls.
Solution
Distribution of X and Y: The possible values for both X and Y range from 1 to 6 (the
faces of a die). Since the two rolls are independent and each face has probability 1/6, the
joint probability distribution is

P(X = i, Y = j) = \frac{1}{6} \cdot \frac{1}{6} = \frac{1}{36} \quad \text{for all } i, j \in \{1, \dots, 6\}.

Homework. Continue the work of this example and calculate the covariance Cov(X, Y)
between the random variables X and Y.
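A short Python sketch (an addition to the notes) makes the structure of this joint distribution explicit and confirms that each marginal is the uniform distribution on {1, ..., 6}; the covariance itself is left for the homework:

```python
from fractions import Fraction

# Joint distribution for two independent fair die rolls:
# P(X=i, Y=j) = (1/6)*(1/6) = 1/36 for every pair (i, j).
p = Fraction(1, 36)
joint = {(i, j): p for i in range(1, 7) for j in range(1, 7)}

# The 36 equally likely outcomes carry total probability 1.
total = sum(joint.values())

# Each marginal is uniform: P(X=i) = sum over j of 1/36 = 1/6.
pX = {i: sum(joint[(i, j)] for j in range(1, 7)) for i in range(1, 7)}
print(total, pX[1])  # 1 1/6
```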
2 Multiple continuous random variables

2.1 Joint Probability Density Function

For two continuous random variables X and Y with possible values in their respective
intervals, the joint probability density function satisfies

f_{XY}(x, y) \ge 0 \quad \text{and} \quad \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_{XY}(x, y) \, dy \, dx = 1.

The marginal probability density functions for X and Y are obtained by integrating out the
other variable:
f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dy

f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y) \, dx
2.2 Joint Expected Value

For continuous random variables X and Y with joint probability density function f_{XY}(x, y),
the joint expected value E(XY) is calculated as:

E(XY) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x \, y \, f_{XY}(x, y) \, dy \, dx
2.3 Covariance

Covariance measures the relationship between two variables and is calculated as:

Cov(X, Y) = E(XY) - E(X) \cdot E(Y)
Example 1

Let X and Y be continuous random variables with the following joint probability density
function (pdf):

f(x, y) = kxy   if 0 \le x \le 2 and 0 \le y \le 1,
f(x, y) = 0     otherwise.

1. Find the normalization constant k.
2. Compute the means and variances of X and Y.
3. Compute Cov(X, Y).
Solution

1. Normalization constant k:

To normalize the joint pdf, the total volume under the joint probability density function over
the entire valid domain must be equal to 1:

\int_0^2 \int_0^1 kxy \, dy \, dx = k \int_0^2 x \left[ \frac{y^2}{2} \right]_0^1 dx = \frac{k}{2} \int_0^2 x \, dx = k \left[ \frac{x^2}{4} \right]_0^2 = k \cdot \frac{4}{4} = 1.

Therefore, the normalization constant for the joint probability density function is k = 1.
2. Means and Variances:

\mu_X = E(X) = \int_0^2 \int_0^1 x \cdot xy \, dy \, dx = \int_0^2 x^2 \left[ \frac{y^2}{2} \right]_0^1 dx = \frac{1}{2} \int_0^2 x^2 \, dx = \frac{1}{2} \left[ \frac{x^3}{3} \right]_0^2 = \frac{1}{2} \cdot \frac{8}{3} = \frac{4}{3}.

\mu_Y = E(Y) = \int_0^2 \int_0^1 y \cdot xy \, dy \, dx = \int_0^2 x \left[ \frac{y^3}{3} \right]_0^1 dx = \frac{1}{3} \int_0^2 x \, dx = \frac{1}{3} \left[ \frac{x^2}{2} \right]_0^2 = \frac{1}{3} \cdot \frac{4}{2} = \frac{2}{3}.

E(X^2) = \int_0^2 \int_0^1 x^2 \cdot xy \, dy \, dx = \int_0^2 x^3 \left[ \frac{y^2}{2} \right]_0^1 dx = \frac{1}{2} \left[ \frac{x^4}{4} \right]_0^2 = \frac{16}{8} = 2.

E(Y^2) = \int_0^2 \int_0^1 y^2 \cdot xy \, dy \, dx = \int_0^2 x \left[ \frac{y^4}{4} \right]_0^1 dx = \frac{1}{4} \int_0^2 x \, dx = \frac{1}{4} \left[ \frac{x^2}{2} \right]_0^2 = \frac{1}{4} \cdot \frac{4}{2} = \frac{1}{2}.

Now,

V(X) = E(X^2) - [E(X)]^2 = 2 - \left( \frac{4}{3} \right)^2 = \frac{2}{9} \approx 0.22.

V(Y) = E(Y^2) - [E(Y)]^2 = \frac{1}{2} - \left( \frac{2}{3} \right)^2 = \frac{1}{18} \approx 0.06.
3. Covariance:

E(XY) = \int_0^2 \int_0^1 xy \cdot xy \, dy \, dx = \int_0^2 x^2 \left[ \frac{y^3}{3} \right]_0^1 dx = \frac{1}{3} \int_0^2 x^2 \, dx = \frac{1}{3} \left[ \frac{x^3}{3} \right]_0^2 = \frac{1}{3} \cdot \frac{8}{3} = \frac{8}{9}.

Given that E(X) = \frac{4}{3} and E(Y) = \frac{2}{3}, the covariance is

Cov(X, Y) = \frac{8}{9} - \frac{4}{3} \cdot \frac{2}{3} = 0.

This is expected, since f(x, y) = xy factorizes into a function of x times a function of y, so X and Y are independent.
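Assuming Python, a midpoint Riemann sum gives a quick numerical cross-check of the Example 1 integrals (the grid size n is an arbitrary choice added here, not from the notes):

```python
# Midpoint Riemann-sum cross-check for Example 1.
def expect(g, n=400):
    """Approximate E[g(X, Y)] under the pdf f(x, y) = x*y on [0,2] x [0,1]."""
    hx, hy = 2.0 / n, 1.0 / n
    total = 0.0
    for i in range(n):
        x = (i + 0.5) * hx          # midpoint of the i-th x-cell
        for j in range(n):
            y = (j + 0.5) * hy      # midpoint of the j-th y-cell
            total += g(x, y) * (x * y) * hx * hy
    return total

E_X  = expect(lambda x, y: x)      # ~ 4/3
E_Y  = expect(lambda x, y: y)      # ~ 2/3
E_XY = expect(lambda x, y: x * y)  # ~ 8/9
print(E_XY - E_X * E_Y)            # ~ 0, since the pdf factorizes
```

The same helper with a different g also reproduces E(X^2) and E(Y^2), and hence the variances.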
Example 2

Let X and Y be continuous random variables with the following joint probability density
function (pdf):

f(x, y) = kx^2 y   if 1 \le x \le 3 and 0 \le y \le 2,
f(x, y) = 0        otherwise.

1. Find the normalization constant k.
2. Compute the expected values E(X) and E(Y), and also find the variances V(X) and V(Y).
3. Compute Cov(X, Y).
Solution

1. Normalization constant k:

\int_1^3 \int_0^2 kx^2 y \, dy \, dx = k \int_1^3 x^2 \left[ \frac{y^2}{2} \right]_0^2 dx = 2k \left[ \frac{x^3}{3} \right]_1^3 = \frac{2k}{3} \cdot (27 - 1) = \frac{52k}{3} = 1.

Therefore, k = \frac{3}{52}.
2. Means and Variances:

\mu_X = E(X) = \frac{3}{52} \int_1^3 \int_0^2 x^3 y \, dy \, dx = \frac{3}{52} \int_1^3 x^3 \left[ \frac{y^2}{2} \right]_0^2 dx = \frac{6}{52} \left[ \frac{x^4}{4} \right]_1^3 = \frac{6}{52} \cdot 20 = \frac{120}{52} \approx 2.31.

\mu_Y = E(Y) = \frac{3}{52} \int_1^3 \int_0^2 x^2 y^2 \, dy \, dx = \frac{3}{52} \int_1^3 x^2 \left[ \frac{y^3}{3} \right]_0^2 dx = \frac{3}{52} \cdot \frac{8}{3} \left[ \frac{x^3}{3} \right]_1^3 = \frac{3}{52} \cdot \frac{8}{3} \cdot \frac{26}{3} = \frac{4}{3} \approx 1.33.

E(X^2) = \frac{3}{52} \int_1^3 \int_0^2 x^4 y \, dy \, dx = \frac{6}{52} \left[ \frac{x^5}{5} \right]_1^3 = \frac{6}{52} \cdot \frac{242}{5} \approx 5.58.

E(Y^2) = \frac{3}{52} \int_1^3 \int_0^2 x^2 y^3 \, dy \, dx = \frac{3}{52} \int_1^3 x^2 \left[ \frac{y^4}{4} \right]_0^2 dx = \frac{12}{52} \int_1^3 x^2 \, dx = \frac{12}{52} \cdot \frac{26}{3} = 2.

Now,

V(X) = E(X^2) - [\mu_X]^2 = \frac{6}{52} \cdot \frac{242}{5} - \left( \frac{30}{13} \right)^2 \approx 5.58 - 5.33 \approx 0.26.

V(Y) = E(Y^2) - [\mu_Y]^2 = 2 - \left( \frac{4}{3} \right)^2 = \frac{2}{9} \approx 0.22.
3. Covariance:

E(XY) = \int_1^3 \int_0^2 xy \cdot \frac{3}{52} x^2 y \, dy \, dx = \frac{3}{52} \int_1^3 x^3 \left[ \frac{y^3}{3} \right]_0^2 dx = \frac{3}{52} \cdot \frac{8}{3} \left[ \frac{x^4}{4} \right]_1^3 = \frac{3}{52} \cdot \frac{8}{3} \cdot 20 = \frac{160}{52} \approx 3.08.

The outcome of the covariance is

Cov(X, Y) = \frac{160}{52} - \frac{120}{52} \cdot \frac{4}{3} = \frac{160}{52} - \frac{160}{52} = 0.

As in Example 1, this is expected: f(x, y) = \frac{3}{52} x^2 y factorizes into a function of x times a function of y, so X and Y are independent.
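The same midpoint Riemann-sum idea gives a numerical cross-check of the Example 2 integrals (a sketch added to the notes; the grid size n is an arbitrary choice):

```python
# Midpoint Riemann-sum check for Example 2: f(x, y) = (3/52) x^2 y
# on [1,3] x [0,2].
K = 3.0 / 52.0

def expect(g, n=400):
    """Approximate E[g(X, Y)] under the pdf above."""
    hx, hy = 2.0 / n, 2.0 / n
    total = 0.0
    for i in range(n):
        x = 1.0 + (i + 0.5) * hx   # midpoints of [1, 3]
        for j in range(n):
            y = (j + 0.5) * hy     # midpoints of [0, 2]
            total += g(x, y) * K * x * x * y * hx * hy
    return total

mass = expect(lambda x, y: 1.0)    # ~ 1: the pdf is normalized
E_X  = expect(lambda x, y: x)      # ~ 30/13 = 2.31
E_Y  = expect(lambda x, y: y)      # ~ 4/3  = 1.33
E_XY = expect(lambda x, y: x * y)  # ~ 40/13 = 3.08
print(E_XY - E_X * E_Y)            # ~ 0: the pdf factorizes
```

The numerical E(XY) agrees with the exact value 160/52 = 40/13, confirming that the covariance vanishes.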
Exercise

Let X and Y be continuous random variables with the following joint probability density
function (pdf):

f(x, y) = kx^2 y^3   if 0 \le x \le 1 and 1 \le y \le 2,
f(x, y) = 0          otherwise.

1. Find the normalization constant k.
2. Compute Cov(X, Y).