Bivariate Distributions
Joint Distribution function: Let X and Y be two random variables defined on the same probability space (Ω, 𝒜, P[·]). Then (X, Y) is called a two-dimensional random variable. The joint cumulative distribution function, or joint distribution function, of X and Y, denoted by F_{X,Y}(x, y), is defined as
F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y] for all (x, y) ∈ ℝ².
It may be observed that the joint distribution function is a function of two variables whose domain is the xy-plane. Sometimes we write F_{X,Y}(x, y) as F(x, y).
Remark: Any function of two variables which fails to satisfy one of the above three conditions is not a joint distribution function.
Example 1. Show that the bivariate function
F(x, y) = e^{−(x+y)} for x ≥ 0, y ≥ 0, and 0 otherwise,
is not a joint distribution function.
Marginal cumulative distribution function: If F_{X,Y}(x, y) is the joint cumulative distribution function of two random variables X and Y, then F_X(x) and F_Y(y) are called the marginal distribution functions of X and Y respectively, and are defined as
F_X(x) = P[X ≤ x] = P[X ≤ x, Y < ∞] = lim_{y→∞} F_{X,Y}(x, y) = F_{X,Y}(x, ∞),
F_Y(y) = P[Y ≤ y] = P[X < ∞, Y ≤ y] = lim_{x→∞} F_{X,Y}(x, y) = F_{X,Y}(∞, y).
Joint Discrete Density function: If X and Y are jointly discrete random variables, then the joint discrete density function of X and Y is defined as
f_{X,Y}(x, y) = P[X = x, Y = y].
The functions f_X(x) and f_Y(y), called the marginal density functions of X and Y respectively, are defined as
f_X(x) = ∑_y f_{X,Y}(x, y) and f_Y(y) = ∑_x f_{X,Y}(x, y).
X and Y are independent if and only if
f_{X,Y}(x, y) = f_X(x)·f_Y(y), i.e. P[X = x, Y = y] = P[X = x]·P[Y = y], for all values (x, y) of (X, Y).
The joint probability function of X and Y is usually represented in the form of the following table, where p_ij = f_{X,Y}(x_i, y_j):

X \ Y  |  y_1    y_2   …   y_j   …   y_m  | f_X(x)
x_1    |  p_11   p_12  …   p_1j  …   p_1m | f_X(x_1)
x_2    |  p_21   p_22  …   p_2j  …   p_2m | f_X(x_2)
⋮      |                                   | ⋮
x_n    |  p_n1   p_n2  …   p_nj  …   p_nm | f_X(x_n)
f_Y(y) | f_Y(y_1) f_Y(y_2) … f_Y(y_j) … f_Y(y_m) | 1

Here:
i. p_ij = P[X = x_i, Y = y_j] is the joint discrete density function of X and Y.
ii. f_X(x_i) = P[X = x_i] = ∑_{j=1}^{m} p_ij is the marginal discrete density function of X.
iii. f_Y(y_j) = P[Y = y_j] = ∑_{i=1}^{n} p_ij is the marginal discrete density function of Y.
iv. ∑_{j=1}^{m} f_Y(y_j) = ∑_{i=1}^{n} f_X(x_i) = ∑_{i=1}^{n} ∑_{j=1}^{m} p_ij = 1.
Expectation: Let (X, Y) be a two-dimensional discrete random variable with joint discrete density f_{X,Y}(x, y). The expectation or expected value of a function g(X, Y), denoted as E[g(X, Y)], is defined as
E[g(X, Y)] = ∑_x ∑_y g(x, y)·f_{X,Y}(x, y).
In particular, E[XY] = ∑_x ∑_y xy·f_{X,Y}(x, y), E[X] = ∑_x x·f_X(x) and E[Y] = ∑_y y·f_Y(y).
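As a quick illustration, these sums can be computed directly from a joint pmf table. The pmf below is a made-up example, not one from the text; this is a small Python sketch using exact rational arithmetic to avoid float round-off.

```python
from fractions import Fraction as F

# Hypothetical joint pmf of (X, Y), stored as {(x, y): probability}.
pmf = {(0, 0): F(1, 5), (0, 1): F(3, 10), (1, 0): F(1, 10), (1, 1): F(2, 5)}
assert sum(pmf.values()) == 1  # a valid pmf sums to 1

# E[g(X, Y)] = sum_x sum_y g(x, y) * f_{X,Y}(x, y)
E = lambda g: sum(g(x, y) * p for (x, y), p in pmf.items())
E_X, E_Y, E_XY = E(lambda x, y: x), E(lambda x, y: y), E(lambda x, y: x * y)
print(E_X, E_Y, E_XY)  # 1/2 7/10 2/5
```

The same `E` helper evaluates any g(X, Y), which is exactly what the definition above promises.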
Covariance: Let X and Y be two random variables defined on the same probability space. The covariance of X and Y, denoted by cov(X, Y), is defined as
cov(X, Y) = E[(X − E[X])(Y − E[Y])] = E[XY] − E[X]·E[Y].
Correlation coefficient: The correlation coefficient of two random variables X and Y, denoted by ρ(X, Y), is defined as
ρ(X, Y) = cov(X, Y) / √(var[X]·var[Y]) = cov(X, Y) / (σ_X·σ_Y), where σ_X > 0 and σ_Y > 0.
X and Y are said to be uncorrelated if ρ(X, Y) = 0, i.e. cov(X, Y) = 0.
Conditional Expectation: Let (X, Y) be a two-dimensional discrete random variable with joint discrete density f_{X,Y}(x, y). Then the conditional expectation of g(X, Y) given X = x, denoted by E[g(X, Y)|X = x], is defined as
E[g(X, Y)|X = x] = ∑_y g(x, y)·f_{Y|X}(y|x).
f_{X|Y}(x | Y = 2) = P[X = x ∩ Y = 2] / P[Y = 2].
Therefore, f_{X|Y}(−1 | Y = 2) = P[X = −1 ∩ Y = 2] / P[Y = 2] = 2a/5a = 2/5,
f_{X|Y}(0 | Y = 2) = P[X = 0 ∩ Y = 2] / P[Y = 2] = a/5a = 1/5,
f_{X|Y}(1 | Y = 2) = P[X = 1 ∩ Y = 2] / P[Y = 2] = 2a/5a = 2/5.
Example 2. Two tetrahedra (four-sided dice) with sides numbered 1 to 4 are tossed. Let X denote the number on the downturned face of the first tetrahedron and Y the larger of the two downturned numbers. Find the joint density function of X and Y, the marginal densities, the conditional distributions of Y given X = 2 and X = 3, and the correlation coefficient of X and Y.
Solution: (i) From the two tetrahedra, the possible outcomes (first face number, second face number) are as follows:
(1, 1), (1, 2), (1, 3), (1, 4), (2, 1), (2, 2), (2, 3), (2, 4), (3, 1), (3, 2), (3, 3), (3, 4), (4, 1), (4, 2), (4, 3), (4, 4).
It is given that X denotes the number on the downturned face on the first tetrahedron and Y the larger of
the numbers on the two downturned faces. Then (X, Y) takes the values:
(1, 1), (1, 2), (1, 3), (1, 4), (2, 2), (2, 3), (2, 4), (3, 3), (3, 4), (4, 4).
(x, y):         (1,1)  (1,2)  (1,3)  (1,4)  (2,2)  (2,3)  (2,4)  (3,3)  (3,4)  (4,4)
f_{X,Y}(x, y):  1/16   1/16   1/16   1/16   2/16   1/16   1/16   3/16   1/16   4/16
The joint density and the marginals in tabular form:

X \ Y  |  1     2     3     4    | f_X(x)
  1    | 1/16  1/16  1/16  1/16  |  4/16
  2    |  0    2/16  1/16  1/16  |  4/16
  3    |  0     0    3/16  1/16  |  4/16
  4    |  0     0     0    4/16  |  4/16
f_Y(y) | 1/16  3/16  5/16  7/16  |   1

Thus f_Y(1) = 1/16, f_Y(2) = 3/16, f_Y(3) = 5/16 and f_Y(4) = 7/16.
Now P[Y = 1 | X = 2] = P[Y = 1 ∩ X = 2] / P[X = 2] = 0 / (4/16) = 0,
P[Y = 2 | X = 2] = P[Y = 2 ∩ X = 2] / P[X = 2] = (2/16) / (4/16) = 1/2,
P[Y = 3 | X = 2] = (1/16) / (4/16) = 1/4,
P[Y = 4 | X = 2] = (1/16) / (4/16) = 1/4.
Similarly, P[Y = y | X = 3] = P[Y = y ∩ X = 3] / P[X = 3].
Now P[Y = 1 | X = 3] = 0 / (4/16) = 0,
P[Y = 2 | X = 3] = 0 / (4/16) = 0,
P[Y = 3 | X = 3] = (3/16) / (4/16) = 3/4,
P[Y = 4 | X = 3] = (1/16) / (4/16) = 1/4.
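These conditional probabilities can be checked by brute-force enumeration of the 16 equally likely tosses; a minimal Python sketch with exact fractions:

```python
from fractions import Fraction
from itertools import product

# All 16 equally likely (first face, second face) outcomes.
outcomes = list(product(range(1, 5), repeat=2))

def cond_Y_given_X(x0):
    """P[Y = y | X = x0], where X is the first face and Y = max of the two faces."""
    match = [(a, b) for (a, b) in outcomes if a == x0]   # event X = x0
    return {y: Fraction(sum(1 for (a, b) in match if max(a, b) == y), len(match))
            for y in range(1, 5)}

# Agrees with the hand computation above.
assert cond_Y_given_X(2) == {1: 0, 2: Fraction(1, 2), 3: Fraction(1, 4), 4: Fraction(1, 4)}
assert cond_Y_given_X(3)[3] == Fraction(3, 4)
```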
(vii) Now we find ρ(X, Y), the correlation coefficient of X and Y. From the table, we have
E[X] = ∑_x x·f_X(x) = 1×4/16 + 2×4/16 + 3×4/16 + 4×4/16 = 5/2.
E[X²] = ∑_x x²·f_X(x) = 1×4/16 + 4×4/16 + 9×4/16 + 16×4/16 = 15/2.
E[Y] = ∑_y y·f_Y(y) = 1×1/16 + 2×3/16 + 3×5/16 + 4×7/16 = 25/8.
E[Y²] = ∑_y y²·f_Y(y) = 1×1/16 + 4×3/16 + 9×5/16 + 16×7/16 = 85/8.
E[XY] = ∑_x ∑_y xy·f_{X,Y}(x, y)
= (1×1/16 + 2×1/16 + 3×1/16 + 4×1/16) + (2×0 + 4×2/16 + 6×1/16 + 8×1/16)
+ (3×0 + 6×0 + 9×3/16 + 12×1/16) + (4×0 + 8×0 + 12×0 + 16×4/16)
= 135/16.
var[X] = E[X²] − (E[X])² = 15/2 − (5/2)² = 5/4.
var[Y] = E[Y²] − (E[Y])² = 85/8 − (25/8)² = 55/64.
cov(X, Y) = E[XY] − E[X]·E[Y] = 135/16 − (5/2)·(25/8) = 5/8.
Hence ρ(X, Y) = cov(X, Y) / √(var[X]·var[Y]) = (5/8) / √((5/4)·(55/64)) = 2/√11.
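The whole computation can be verified by enumerating the sample space; a short Python sketch with exact rational arithmetic (ρ² is kept rational to avoid the square root):

```python
from fractions import Fraction
from itertools import product

# (X, Y) for each of the 16 equally likely tosses: X = first face, Y = larger face.
pairs = [(a, max(a, b)) for a, b in product(range(1, 5), repeat=2)]
p = Fraction(1, 16)

E = lambda g: sum(g(x, y) * p for x, y in pairs)
EX, EY, EXY = E(lambda x, y: x), E(lambda x, y: y), E(lambda x, y: x * y)
var_X = E(lambda x, y: x * x) - EX ** 2
var_Y = E(lambda x, y: y * y) - EY ** 2
cov = EXY - EX * EY
rho_sq = cov ** 2 / (var_X * var_Y)   # rho^2; rho itself is 2/sqrt(11)

print(EX, EY, cov, rho_sq)  # 5/2 25/8 5/8 4/11
```

The printed values match E[X] = 5/2, E[Y] = 25/8, cov(X, Y) = 5/8 and ρ² = 4/11 obtained above.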
Exercises:
2. Consider two tetrahedra with sides numbered 1 to 4. Let X denote the smaller of the two downturned numbers and Y the larger. (i) Find the joint density function of X and Y, (ii) find P[X ≥ 2, Y ≥ 2], (iii) find the mean and variance of X and Y, (iv) find the conditional distribution of Y given X for each of the possible values of X, and (v) find the correlation coefficient of X and Y.
Ans. (i) Hint: X ≤ Y always. (ii) 9/16. (iii) var[X] = var[Y] = 55/64. (iv) 0, 1/5, 2/5, 2/5; 0, 0, 0, 1; and similarly for the remaining values of X. (v) 5/11.
3. Three fair coins are tossed. Let X denote the number of heads on the first two coins, and let Y denote the number of tails on the last two coins. (i) Find the joint distribution of X and Y. (ii) Find the conditional distribution of Y given X = 1. (iii) Find cov(X, Y), and (iv) find the correlation coefficient of X and Y.
Joint Continuous Density function: A two-dimensional random variable (X, Y) is said to be continuous if and only if there exists a function f_{X,Y}(x, y) ≥ 0 (or simply f(x, y) ≥ 0) such that
F_{X,Y}(x, y) = ∫_{−∞}^{x} ∫_{−∞}^{y} f_{X,Y}(u, v) du dv for all (x, y) ∈ ℝ².
The function F_{X,Y}(x, y) = P[X ≤ x, Y ≤ y] is the joint distribution function of X and Y. The function f_{X,Y}(x, y) is the joint probability density function of X and Y, which satisfies f_{X,Y}(x, y) ≥ 0 and ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_{X,Y}(x, y) dx dy = 1.
f_X(x) and f_Y(y), called the marginal density functions of X and Y respectively, are defined as
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy and f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx.
Conditional density functions: The conditional densities of Y given X = x and of X given Y = y are
f_{Y|X}(y|x) = f_{X,Y}(x, y) / f_X(x), if f_X(x) > 0,
f_{X|Y}(x|y) = f_{X,Y}(x, y) / f_Y(y), if f_Y(y) > 0.
Expectation: Let (X, Y) be a two-dimensional continuous random variable with joint probability density function f_{X,Y}(x, y). The expectation or expected value of a function g(X, Y), denoted as E[g(X, Y)], is defined as
E[g(X, Y)] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} g(x, y)·f_{X,Y}(x, y) dx dy.
In particular, E[XY] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} xy·f_{X,Y}(x, y) dx dy, E[X] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x·f_{X,Y}(x, y) dx dy and E[Y] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} y·f_{X,Y}(x, y) dx dy.
Conditional Expectation: Let (X, Y) be a two-dimensional continuous random variable with joint continuous density function f_{X,Y}(x, y). Then the conditional expectation of g(X, Y) given X = x, denoted by E[g(X, Y)|X = x], is defined as
E[g(X, Y)|X = x] = ∫_{−∞}^{∞} g(x, y)·f_{Y|X}(y|x) dy.
In particular, E[Y|X = x] = ∫_{−∞}^{∞} y·f_{Y|X}(y|x) dy.
Regression curve: The regression curve of Y on X is the curve y = E[Y|X = x]; it gives the mean value of Y for each fixed value x of X.
Example 1. Find k so that f(x, y) = kxy, 1 ≤ x ≤ y ≤ 2, will be a joint probability density function.
Solution: We have 1 = ∫_{−∞}^{∞} ∫_{−∞}^{∞} f(x, y) dx dy = ∫_1^2 ∫_x^2 kxy dy dx = ∫_1^2 kx·(1/2)(4 − x²) dx = 9k/8,
which gives k = 8/9.
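A quick numeric cross-check: a midpoint-rule double sum over the triangle 1 ≤ x ≤ y ≤ 2 (pure Python; the grid size n is an arbitrary illustrative choice) confirms the density integrates to about 1.

```python
# Midpoint-rule approximation of the double integral of f(x, y) = (8/9) x y
# over the triangle 1 <= x <= y <= 2; should come out close to 1.
n = 400                      # grid resolution (illustrative choice)
h = 1.0 / n
total = 0.0
for i in range(n):
    x = 1 + (i + 0.5) * h
    for j in range(n):
        y = 1 + (j + 0.5) * h
        if x <= y:           # density vanishes outside the triangle
            total += (8 / 9) * x * y * h * h
print(round(total, 2))  # prints 1.0 (small grid bias along the diagonal)
```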
Example 2. Let the joint p.d.f. of X and Y be
f(x, y) = x + y for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
Find (i) P[0 < X < 1/2, 0 < Y < 1/4], (ii) E[X], E[Y], E[XY] and E[X + Y], (iii) ρ(X, Y).
Solution: (i) We have P[0 < X < 1/2, 0 < Y < 1/4] = ∫_0^{1/2} [∫_0^{1/4} (x + y) dy] dx = ∫_0^{1/2} (x/4 + 1/32) dx = 3/64.
(ii) Now E[X] = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x·f(x, y) dx dy = ∫_0^1 ∫_0^1 x(x + y) dx dy = ∫_0^1 (x² + x/2) dx = 7/12.
Similarly, E[Y] = ∫_0^1 ∫_0^1 y(x + y) dx dy = 7/12.
E[XY] = ∫_0^1 ∫_0^1 xy(x + y) dx dy = 1/3.
E[X + Y] = ∫_0^1 ∫_0^1 (x + y)(x + y) dx dy = 7/6.
(iii) We have cov(X, Y) = E[XY] − E[X]·E[Y] = 1/3 − (7/12)² = −1/144.
E[X²] = ∫_0^1 ∫_0^1 x²(x + y) dx dy = 5/12.
E[Y²] = ∫_0^1 ∫_0^1 y²(x + y) dx dy = 5/12.
Now var[X] = E[X²] − (E[X])² = 5/12 − (7/12)² = 11/144.
var[Y] = E[Y²] − (E[Y])² = 11/144.
Hence ρ(X, Y) = cov(X, Y) / √(var[X]·var[Y]) = (−1/144) / √((11/144)·(11/144)) = −1/11.
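These moments are easy to sanity-check numerically; a midpoint-rule sketch over the unit square, using the symmetry of f in x and y (so E[Y] = E[X] and var[Y] = var[X]):

```python
# Midpoint-rule check of E[X] = 7/12, E[XY] = 1/3, cov = -1/144 and rho = -1/11
# for f(x, y) = x + y on the unit square.
n = 200
h = 1.0 / n
EX = EXY = EX2 = 0.0
for i in range(n):
    x = (i + 0.5) * h
    for j in range(n):
        y = (j + 0.5) * h
        w = (x + y) * h * h          # f(x, y) dA
        EX += x * w
        EXY += x * y * w
        EX2 += x * x * w
cov = EXY - EX * EX                  # E[Y] = E[X] by symmetry
rho = cov / (EX2 - EX ** 2)          # var[Y] = var[X] by symmetry
print(round(rho, 3))  # prints -0.091, i.e. about -1/11
```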
Example 3. Let the joint p.d.f. of X and Y be f_{X,Y}(x, y) = 8xy, 0 < x < y < 1.
Find (i) E[Y|X = x], (ii) E[XY|X = x], (iii) var[Y|X = x] and (iv) the regression curve of Y on X.
Solution: The marginal densities are
f_X(x) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dy = ∫_x^1 8xy dy = 4x(1 − x²), 0 < x < 1,
and f_Y(y) = ∫_{−∞}^{∞} f_{X,Y}(x, y) dx = ∫_0^y 8xy dx = 4y³, 0 < y < 1.
We note that f(x, y) ≠ f_X(x)·f_Y(y), so X and Y are not independent. The conditional density is
f_{Y|X}(y|x) = f_{X,Y}(x, y)/f_X(x) = 8xy/(4x(1 − x²)) = 2y/(1 − x²), for x < y < 1.
Therefore, E[Y|X = x] = ∫_x^1 y·f_{Y|X}(y|x) dy = ∫_x^1 y·2y/(1 − x²) dy = 2(1 − x³)/(3(1 − x²)) = 2(1 + x + x²)/(3(1 + x)), for 0 < x < 1.
E[Y²|X = x] = ∫_x^1 y²·f_{Y|X}(y|x) dy = ∫_x^1 y²·2y/(1 − x²) dy = (1 + x²)/2, for 0 < x < 1.
Further, E[XY|X = x] = ∫_x^1 xy·f_{Y|X}(y|x) dy = x ∫_x^1 y·f_{Y|X}(y|x) dy = x·E[Y|X = x] = 2x(1 + x + x²)/(3(1 + x)), for 0 < x < 1.
Also, var[Y|X = x] = E[Y²|X = x] − (E[Y|X = x])² = (1 + x²)/2 − 4(1 + x + x²)²/(9(1 + x)²), for 0 < x < 1.
The regression curve of Y on X is y = E[Y|X = x], i.e. y = 2(1 + x + x²)/(3(1 + x)), for 0 < x < 1.
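At a sample point, say x = 0.5 (an arbitrary illustrative choice), the closed form 2(1 + x + x²)/(3(1 + x)) = 7/9 agrees with direct numeric integration of y·f_{Y|X}(y|x):

```python
# Numeric check of E[Y | X = x] for f_{Y|X}(y|x) = 2y/(1 - x^2), x < y < 1.
x = 0.5
n = 100_000
h = (1 - x) / n
# Midpoint-rule sum of y * 2y/(1 - x^2) over y in (x, 1).
num = sum((x + (k + 0.5) * h) ** 2 * 2 / (1 - x * x) * h for k in range(n))
closed = 2 * (1 + x + x * x) / (3 * (1 + x))   # = 7/9 at x = 0.5
print(round(num, 4), round(closed, 4))  # prints 0.7778 0.7778
```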
Exercises:
1. Find 𝑘 so that 𝑓(𝑥, 𝑦) = 𝑘(𝑥 + 𝑦), 0 < 𝑥 < 1, 0 < 𝑦 < 1 is a joint probability density function.
Ans. 1
2. Let the joint p.d.f. of X and Y be
f(x, y) = x + y, where 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.
3. Let the two-dimensional random variable (X, Y) have the joint density
f(x, y) = (1/8)(6 − x − y)·I_(0,2)(x)·I_(2,4)(y).
Find (i) E[Y|X = x], (ii) E[Y²|X = x], (iii) var[Y|X = x], (iv) show that E[Y] = E[E[Y|X]].
Ans. (i) (26 − 9x)/(3(3 − x)), 0 < x < 2.