Module-2 (Random Variables)
Definition: A random variable X is a real-valued function defined on the sample space of a random experiment; it assigns a real number to each outcome of the experiment.
Example 9.1
Consider the random experiment of rolling a die and let X denote the number that turns up. Hence the values taken by the random variable X are 1, 2, 3, 4, 5, 6. These values are also called the realizations of the random variable X.
Example 9.2
A coin is tossed twice and X denotes the number of heads obtained. The value of X at each sample point is

Sample point ω    HH   HT   TH   TT
X(ω)               2    1    1    0
Example 9.3
Assigning rule: Let X denote the sum of the numbers on the faces of two dice; then X(i, j) = i + j, where i denotes the face number on the first die and j denotes the face number on the second die.
Example 9.5
A coin is tossed repeatedly until a head appears. Let the random variable X denote the number of trials needed to get a head. The values taken by X are 1, 2, 3, …
Example 9.6
If X is defined as the height of students in a school, ranging between 120 cm and 180 cm, then X = {x : 120 cm < x < 180 cm} is a continuous random variable.
Example 9.7
Let the maximum life of an electric bulb be 1500 hours. The lifetime of the bulb is a continuous random variable, written as X = {x : 0 ≤ x ≤ 1500}.
If a discrete random variable X takes the values x1, x2, …, xn with probabilities p(x1), p(x2), …, p(xn) such that
(i) p(xi) ≥ 0 for all i (non-negativity) and (ii) ∑_{i=1}^{n} p(xi) = 1,
then p(x) is known as the probability mass function (p.m.f.) of X.
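The two conditions can be checked mechanically. Below is a minimal Python sketch (not part of the original text) that verifies them for an assumed probability assignment; the values used are only illustrative.

# A minimal sketch: checking the two p.m.f. conditions
# for an assumed assignment p(x) on a finite set of values.
from fractions import Fraction

pmf = {0: Fraction(1, 4), 1: Fraction(1, 2), 2: Fraction(1, 4)}  # illustrative values

non_negative = all(p >= 0 for p in pmf.values())   # condition (i): p(x_i) >= 0
sums_to_one = sum(pmf.values()) == 1                # condition (ii): sum of p(x_i) = 1

print(non_negative and sums_to_one)  # True, so p(x) is a valid p.m.f.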
Example 9.8
A coin is tossed two times. If X is the number of heads, find the probability mass
function of X.
Solution:
Since the coin is tossed two times, the sample space is S={HH, HT, TH, TT}
If X denotes the number of heads, the possible values of X are 0, 1, 2 with the following probabilities:

P(X = 0) = P(getting no head) = 1/4
P(X = 1) = P(getting one head) = 2/4 = 1/2
P(X = 2) = P(getting two heads) = 1/4

NOTE: Probabilities are non-negative and the total is 1.

The probability distribution of X is

X          0     1     2
P(X = x)   1/4   1/2   1/4
Example 9.9
In example 9.3 the probability mass function of X is given in the following table
X      2     3     4     5     6     7     8     9     10    11    12
P(x)   1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
The above table may be called the probability distribution of X.
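For readers who want to reproduce this table, the following short Python sketch (an illustration, not part of the original text) enumerates the 36 equally likely outcomes of the two dice and tallies the probability of each value of X.

# Enumerate the 36 equally likely outcomes of two dice and
# build the probability distribution of X = sum of the faces.
from fractions import Fraction
from collections import Counter

counts = Counter(i + j for i in range(1, 7) for j in range(1, 7))
pmf = {x: Fraction(c, 36) for x, c in sorted(counts.items())}

for x, p in pmf.items():
    print(x, p)           # e.g. 2 -> 1/36, 7 -> 6/36 = 1/6, 12 -> 1/36
print(sum(pmf.values()))  # 1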
Remark:
(iii) If X is a discrete random variable, then for any real x, P(X = x) need not be zero. However, in the case of a continuous random variable, P(X = x) = 0 always holds, since
P(X = a) = ∫_a^a f(x) dx = 0.
Example 9.10
Solution:
Example 9.11
Show that f(x) = 2x/9, 0 < x < 3 (and 0 elsewhere), is a probability density function.
Solution:
∫_{-∞}^{∞} f(x) dx = ∫_0^3 (2x/9) dx = (2/9) [x²/2]_0^3 = (2/9)(9/2) = 1
Since f(x) ≥ 0 and the total integral is 1, f(x) is a probability density function.
Fig. 9.5
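As a quick numerical cross-check of the reconstructed density f(x) = 2x/9 on (0, 3) (an assumption based on the worked integral above), the sketch below integrates it with SciPy and confirms the total is 1; it is illustrative only.

# Numerically verify that f(x) = 2x/9 integrates to 1 over (0, 3).
from scipy.integrate import quad

f = lambda x: 2 * x / 9
total, _ = quad(f, 0, 3)
print(round(total, 10))  # 1.0, so f is a valid probability density function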
(iv) F(−∞) = lim_{x→−∞} F(x) = 0
(v) F(∞) = lim_{x→∞} F(x) = 1
P(a < X ≤ b) = P(a ≤ X ≤ b) is obtained from the pdf by integrating the pdf over the given range:
P(a < X ≤ b) = ∫_a^b f(x) dx
Properties
(i) F(x) is a non-decreasing function of x.
(ii) 0 ≤ F(x) ≤ 1.
(iii) F(−∞) = 0.
(iv) F(∞) = 1.
(v) For any real constants a and b such that a < b, P(a < X ≤ b) = F(b) − F(a).
Example 9.12
A discrete random variable X has the following probability distribution:

X          0   1    2    3    4    5     6
P(X = x)   a   3a   5a   7a   9a   11a   13a

(i) Find the value of a. (ii) Find the cumulative distribution function of X. (iii) Evaluate: (a) P(X ≥ 4), (b) P(X < 5), (c) P(3 ≤ X ≤ 6).
Solution:
(i) Since ∑ p(x) = 1,
a + 3a + 5a + 7a + 9a + 11a + 13a = 1
49a = 1  ⟹  a = 1/49

(ii) The probability distribution and cumulative distribution function of X are

X      0      1      2      3       4       5       6
P(x)   1/49   3/49   5/49   7/49    9/49    11/49   13/49
F(x)   1/49   4/49   9/49   16/49   25/49   36/49   49/49 = 1

(iii) (a) P(X ≥ 4) = 9a + 11a + 13a = 33a = 33 × 1/49 = 33/49

(b) P(X < 5) = 1 − P(X ≥ 5) = 1 − [P(X = 5) + P(X = 6)]
            = 1 − [11a + 13a] = 1 − 24a = 1 − 24/49 = 25/49

(c) P(3 ≤ X ≤ 6) = 7a + 9a + 11a + 13a = 40a = 40/49
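The whole computation can be reproduced with exact fractions; the sketch below is an illustration only, with the coefficients 1, 3, 5, …, 13 taken from the table above.

# Reproduce Example 9.12 with exact fractions.
from fractions import Fraction

coeffs = {x: 2 * x + 1 for x in range(7)}    # p(x) = (2x+1)a for x = 0, ..., 6
a = Fraction(1, sum(coeffs.values()))         # 49a = 1  =>  a = 1/49
pmf = {x: c * a for x, c in coeffs.items()}

p_ge_4 = sum(pmf[x] for x in range(4, 7))     # P(X >= 4)      = 33/49
p_lt_5 = sum(pmf[x] for x in range(0, 5))     # P(X < 5)       = 25/49
p_3_to_6 = sum(pmf[x] for x in range(3, 7))   # P(3 <= X <= 6) = 40/49
print(a, p_ge_4, p_lt_5, p_3_to_6)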
Example 9.13
The probability density function of a continuous random variable X is

f(x) = x/2,  0 < x < 2
     = 0,    otherwise

(i) Find the c.d.f. of X. (ii) Compute P(1/2 < X ≤ 1). (iii) Compute P(X = 1.5).
Solution:
(i) F(x) = ∫_0^x f(t) dt = ∫_0^x (t/2) dt = x²/4 for 0 < x < 2; F(x) = 0 for x ≤ 0 and F(x) = 1 for x ≥ 2.

(ii) P(1/2 < X ≤ 1) = ∫_{1/2}^1 (x/2) dx = (1/2) [x²/2]_{1/2}^1 = (1/2)(1/2 − 1/8) = 3/16

(iii) P(X = 1.5) = 0, since X is a continuous random variable.
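A small numerical sketch (assuming SciPy is available, and illustrative only) confirms the value 3/16 by integrating the density directly.

# Check Example 9.13 numerically: c.d.f. of f(x) = x/2 on (0, 2)
# and the probability P(1/2 < X <= 1).
from scipy.integrate import quad

f = lambda x: x / 2

def F(x):                     # c.d.f.: integral of f from 0 up to x (for 0 <= x <= 2)
    value, _ = quad(f, 0, x)
    return value

print(F(1) - F(0.5))          # 0.1875 = 3/16
print(F(2))                   # 1.0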
A function p(x, y) is the joint probability mass function of the discrete random variables (X, Y) if
(i) p(x, y) ≥ 0 for all x, y and (ii) ∑_x ∑_y p(x, y) = 1.

NOTE: p(x, y) = P(X = x, Y = y).
Example 9.14
There are 10 tickets in a bag which are numbered 1, 2, 3, ...10. Two tickets are
drawn at random one after the other with replacement.
Here, random variable X denotes the number on the first ticket and random variable
Y denotes the number on the second ticket.
Definition:
Let (X, Y) be a bivariate continuous random variable. The function f(x, y) is called a bivariate probability density function if the following conditions are satisfied:
(i) f(x, y) ≥ 0 for all x, y
(ii) ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1
Example 9.15
Prove that the bivariate function given by

f(x, y) = (x + y)/8,  0 < x, y ≤ 2
        = 0,          otherwise

is a probability density function.
Proof:
Clearly f(x, y) ≥ 0, and f is a probability density function if ∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = 1.
∫_{-∞}^{∞} ∫_{-∞}^{∞} f(x, y) dx dy = (1/8) ∫_0^2 ∫_0^2 (x + y) dx dy
= (1/8) (∫_0^2 ∫_0^2 x dx dy + ∫_0^2 ∫_0^2 y dx dy)
= (1/8) [(4 − 0) + (4 − 0)]
= (1/8) × 8 = 1
Hence f(x, y) is a probability density function.
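The double integral can also be checked numerically with SciPy's dblquad, as in the sketch below (illustrative only).

# Numerical check of Example 9.15: f(x, y) = (x + y)/8 on [0, 2] x [0, 2]
# should integrate to 1.
from scipy.integrate import dblquad

f = lambda y, x: (x + y) / 8        # dblquad integrates over y first, then x
total, _ = dblquad(f, 0, 2, 0, 2)   # x from 0 to 2, y from 0 to 2
print(round(total, 10))             # 1.0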
Example 9.16
The joint p.d.f. of X and Y is

f(x, y) = x²y,  0 ≤ x ≤ 1, 0 ≤ y ≤ 2
        = 0,    elsewhere

Find the marginal density functions of X and Y.
Solution:
f(x) = ∫_{-∞}^{∞} f(x, y) dy = ∫_0^2 x²y dy = x² [y²/2]_0^2 = x² × 2 = 2x²

f(y) = ∫_{-∞}^{∞} f(x, y) dx = ∫_0^1 x²y dx = y [x³/3]_0^1 = y/3

Marginal density function of X:
f(x) = 2x²,  0 ≤ x ≤ 1
     = 0,    elsewhere

Marginal density function of Y:
f(y) = y/3,  0 ≤ y ≤ 2
     = 0,    elsewhere
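The marginal densities can also be obtained symbolically; the following SymPy sketch (an illustration, not from the text) integrates the joint density over the stated ranges.

# Marginal densities of Example 9.16 via symbolic integration.
import sympy as sp

x, y = sp.symbols("x y", nonnegative=True)
f = x**2 * y

f_X = sp.integrate(f, (y, 0, 2))   # marginal of X: 2*x**2
f_Y = sp.integrate(f, (x, 0, 1))   # marginal of Y: y/3
print(f_X, f_Y)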
Example 9.17
4 x - 2 y , 0 # x # 3, 0 # y # 2
Joint p.d.f. of X,Y is f(x, y) = ) Find the marginal
0 , elsewhere
density function of X and Y
Solution:
3
f (x) = # f ^ x, y h dy
-3
2
f (x) = # ^4x - 2yhdy
1
2
2y 2
= <4xy - F
2 1
= 64x ]2 - 1g - ]4 - 1g@ = 4x - 3
3
f (y) = # f ^ x, y h dx
-3
3
f (y) = # ^4x - 2yhdx
1
= b 42x - 2xy l
2 3
= _2x2 - 2xy i1
3
= 72 ]9 - 1g - 2y ]3 - 1gA
= 2 # 8 - 6y + 2y
= 16 - 4y
4x - 3 , 1 # x # 3
f(x) = )
0 , elsewhere
16 - 4y , 1 # y # 2
f(x, y) = )
0 , elsewhere
Let X be a discrete random variable taking the values x1, x2, …, xn with corresponding probabilities p1, p2, …, pn. The mathematical expectation (or expected value) of X is defined as
E(X) = x1 p1 + x2 p2 + … + xn pn = ∑_{i=1}^{n} xi pi, where ∑ pi = 1.
Sometimes E(X) is known as the mean of the random variable X.
Result:
If g(X) is a function of the random variable X, then E[g(X)] = ∑ g(x) P(X = x).
Properties:
(i) E(c) = c where c is a constant
Proof :
E(X) = ∑ xi pi
E(c) = ∑ c pi = c ∑ pi = c × 1 = c

(ii) E(cX) = c E(X), where c is a constant
Proof:
E(X) = ∑ xi pi
E(cX) = ∑ c xi pi = c ∑ xi pi = c E(X).
Variance: Var(X) = E[X − E(X)]² = E(X²) − [E(X)]²
Example 9.18
When a die is thrown, X denotes the number that turns up. Find E(X), E(X²) and Var(X).
Solution:
X      1    2    3    4    5    6
P(x)   1/6  1/6  1/6  1/6  1/6  1/6

E(X) = ∑ xi pi = x1 p1 + x2 p2 + … + x6 p6
     = 1 × 1/6 + 2 × 1/6 + … + 6 × 1/6
     = (1/6)(1 + 2 + 3 + 4 + 5 + 6) = 7/2

E(X²) = ∑ xi² pi
      = (1/6)(1 + 4 + 9 + 16 + 25 + 36)
      = (1/6)(91) = 91/6

Var(X) = E(X²) − [E(X)]² = 91/6 − (7/2)²
       = 91/6 − 49/4
Var(X) = 35/12
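The same values follow from a few lines of Python with exact fractions; the sketch below is only a cross-check of the arithmetic above.

# Mean and variance of a fair die, with exact fractions.
from fractions import Fraction

values = range(1, 7)
p = Fraction(1, 6)

E_X = sum(x * p for x in values)        # 7/2
E_X2 = sum(x**2 * p for x in values)    # 91/6
Var_X = E_X2 - E_X**2                   # 91/6 - 49/4 = 35/12
print(E_X, E_X2, Var_X)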
Example 9.19
The mean and standard deviation of a random variable X are 5 and 4 respectively. Find E(X²).
Solution:
∴ Var(X) = 4² = 16
Var(X) = E(X²) − [E(X)]²
16 = E(X²) − (5)²
E(X²) = 25 + 16 = 41
Example 9.20
A player tosses two coins. If two heads appear he wins ₹4, if one head appears he wins ₹2, but if two tails appear he loses ₹3. Find the expected sum of money he wins.
Solution:
The probability distribution of the amount won, X, is

X          4     2     −3
P(X = x)   1/4   1/2   1/4

E(X) = ∑ xi pi = x1 p1 + x2 p2 + x3 p3
     = 4 × 1/4 + 2 × 1/2 − 3 × 1/4 = 1.25

Hence the expected amount won is ₹1.25.
Example 9.21
A discrete random variable X has the following probability distribution:

X          −3    6     9
P(X = x)   1/6   1/2   1/3

Find the mean and variance.
Solution:
E(X) = ∑ xi pi = x1 p1 + x2 p2 + x3 p3
     = −3 × 1/6 + 6 × 1/2 + 9 × 1/3
     = 11/2

E(X²) = (−3)² × 1/6 + 6² × 1/2 + 9² × 1/3
      = 3/2 + 18 + 27
      = 93/2

Var(X) = E(X²) − [E(X)]² = 93/2 − (11/2)²
       = 93/2 − 121/4 = (186 − 121)/4 = 65/4

Mean = 11/2, Var(X) = 65/4
Let X be a continuous random variable with probability density function f(x). Then the mathematical expectation of X is defined as
E(X) = ∫_{-∞}^{∞} x f(x) dx, provided the integral exists.

NOTE: If g(X) is a function of a random variable X and E[g(X)] exists, then
E[g(X)] = ∫_{-∞}^{∞} g(x) f(x) dx.

Results:
(1) E(c) = c, where c is a constant
Proof:
By definition, E(X) = ∫_{-∞}^{∞} x f(x) dx
E(c) = ∫_{-∞}^{∞} c f(x) dx = c ∫_{-∞}^{∞} f(x) dx
     = c × 1 = c    (since ∫_{-∞}^{∞} f(x) dx = 1)
(2) E(aX) = a E(X)
Proof:
E(aX) = ∫_{-∞}^{∞} a x f(x) dx = a ∫_{-∞}^{∞} x f(x) dx = a E(X)
Example 9.22
For the probability density function f(x) = x/2, 0 < x < 2 (and 0 otherwise), find E(X) and E(X²).
Solution:
E(X) = ∫_{-∞}^{∞} x f(x) dx = ∫_0^2 x (x/2) dx = (1/2) [x³/3]_0^2
     = (1/2)(8/3 − 0) = 4/3

E(X²) = ∫_{-∞}^{∞} x² f(x) dx = ∫_0^2 x² (x/2) dx = (1/2) [x⁴/4]_0^2
      = (1/8)(16 − 0) = 2

NOTE: The following results are true in both the discrete and continuous cases.
(i) E(1/X) ≠ 1 / E(X)
(ii) E[log(X)] ≠ log E(X)
(iii) E(X²) ≠ [E(X)]²
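A numerical sketch of the same computation, for the density f(x) = x/2 on (0, 2) inferred from the integrals above (assuming SciPy is available; illustrative only):

# E(X), E(X^2) and the variance for f(x) = x/2 on (0, 2).
from scipy.integrate import quad

f = lambda x: x / 2
E_X, _ = quad(lambda x: x * f(x), 0, 2)       # 4/3
E_X2, _ = quad(lambda x: x**2 * f(x), 0, 2)   # 2
print(E_X, E_X2, E_X2 - E_X**2)               # variance = 2 - 16/9 = 2/9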
So far we have studied how to find the mean and variance of a single random variable. In some cases, however, we need the expectation of a linear combination of random variables such as aX + bY, of a product such as cX × dY, or of expressions involving more random variables. The following theorems are useful in such situations.
Addition theorem on expectation: E(X + Y) = E(X) + E(Y).
Proof (discrete case):
Let the random variable X assume the values x1, x2, …, xn with corresponding probabilities p1, p2, …, pn, and let the random variable Y assume the values y1, y2, …, ym with corresponding probabilities p1, p2, …, pm.
By definition,
E(X) = ∑_{i=1}^{n} xi pi, ∑ pi = 1, and E(Y) = ∑_{j=1}^{m} yj pj, ∑ pj = 1
Then E(X + Y) = ∑_{i=1}^{n} xi pi + ∑_{j=1}^{m} yj pj
             = E(X) + E(Y)
Proof (continuous case):
We know that
E(X) = ∫_{-∞}^{∞} x f(x) dx and E(Y) = ∫_{-∞}^{∞} y f(y) dy
E(X + Y) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} (x + y) f(x, y) dx dy
         = ∫_{-∞}^{∞} x (∫_{-∞}^{∞} f(x, y) dy) dx + ∫_{-∞}^{∞} y (∫_{-∞}^{∞} f(x, y) dx) dy
         = ∫_{-∞}^{∞} x f(x) dx + ∫_{-∞}^{∞} y f(y) dy
         = E(X) + E(Y)
Result: E(aX ± b) = a E(X) ± b
Proof:
E(aX ± b) = ∫_{-∞}^{∞} (ax ± b) f(x) dx
          = a ∫_{-∞}^{∞} x f(x) dx ± b ∫_{-∞}^{∞} f(x) dx
          = a E(X) ± b × 1    (since ∫_{-∞}^{∞} f(x) dx = 1)
          = a E(X) ± b
Remarks:
1. Statement: E(aX + b) = aE(X) + b, where a and b are constants.
   Proof: E(aX + b) = E(aX) + E(b) = aE(X) + b, by property 2.
3. Statement: E(X − X̄) = 0.
   Proof: E(X − X̄) = E(X) − E(X̄) = X̄ − X̄ = 0, since X̄ = E(X) is a constant.
Example 9.23
Find the expectation of the sum of the numbers obtained on throwing two dice.
Solution:
Let X and Y denote the numbers obtained on the first and second die respectively. Each of them is a random variable taking the values 1, 2, 3, 4, 5, 6 with equal probability 1/6.
E(X) = ∑ xi pi = 1 × 1/6 + 2 × 1/6 + … + 6 × 1/6 = (1 + 2 + 3 + 4 + 5 + 6)/6 = 21/6 = 7/2
Similarly, E(Y) = 7/2

The distribution of X + Y is

X + Y   2     3     4     5     6     7     8     9     10    11    12
P(x)    1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36

E(X + Y) = 2 × 1/36 + 3 × 2/36 + … + 12 × 1/36
         = (2 + 6 + 12 + 20 + 30 + 42 + 40 + 36 + 30 + 22 + 12)/36 = 252/36 = 7
E(X) + E(Y) = 7/2 + 7/2 = 7
Hence E(X + Y) = E(X) + E(Y).
Example 9.24
X and Y are independent continuous random variables with probability density functions
f(x) = 2x, 0 ≤ x ≤ 1 (and 0 otherwise) and g(y) = 2y, 0 ≤ y ≤ 1 (and 0 otherwise),
so that the joint density is f(x, y) = 4xy on the unit square. Prove that E(X + Y) = E(X) + E(Y).
Solution:
E(X + Y) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} (x + y) f(x, y) dx dy
         = ∫_0^1 ∫_0^1 (x + y) 4xy dx dy
         = 4 [∫_0^1 ∫_0^1 x²y dx dy + ∫_0^1 ∫_0^1 xy² dx dy]
         = 4 [(∫_0^1 x² dx)(∫_0^1 y dy) + (∫_0^1 y² dy)(∫_0^1 x dx)]
         = 4 [(1/3) ∫_0^1 y dy + (1/3) ∫_0^1 x dx]
         = (4/3) [∫_0^1 y dy + ∫_0^1 x dx]
         = (4/3) [1/2 + 1/2] = 4/3    … (1)

E(X) = ∫_{-∞}^{∞} x f(x) dx = ∫_0^1 x (2x) dx = 2 ∫_0^1 x² dx = 2 [x³/3]_0^1 = 2/3

E(Y) = ∫_{-∞}^{∞} y g(y) dy = ∫_0^1 y (2y) dy = 2 ∫_0^1 y² dy = 2 [y³/3]_0^1 = 2/3

E(X) + E(Y) = 2/3 + 2/3 = 4/3    … (2)

From (1) and (2), E(X + Y) = E(X) + E(Y).
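The result can be confirmed numerically with SciPy's dblquad; the sketch below is illustrative only.

# Numerical check of Example 9.24 with the joint density f(x, y) = 4xy on [0, 1] x [0, 1].
from scipy.integrate import dblquad

joint = lambda y, x: 4 * x * y
E_sum, _ = dblquad(lambda y, x: (x + y) * joint(y, x), 0, 1, 0, 1)
E_X, _ = dblquad(lambda y, x: x * joint(y, x), 0, 1, 0, 1)
E_Y, _ = dblquad(lambda y, x: y * joint(y, x), 0, 1, 0, 1)
print(round(E_sum, 6), round(E_X + E_Y, 6))   # both 1.333333 = 4/3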
Multiplication theorem on expectation: If X and Y are independent random variables, then E(XY) = E(X) E(Y).
Proof (discrete case):
E(XY) = ∑_{i=1}^{n} ∑_{j=1}^{m} xi yj pij = ∑_{i=1}^{n} ∑_{j=1}^{m} xi yj pi pj    (since pij = pi pj for independent X and Y)
      = (∑_{i=1}^{n} xi pi)(∑_{j=1}^{m} yj pj)
      = E(X) E(Y)
Proof (continuous case):
E(XY) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} x y f(x, y) dx dy = ∫_{-∞}^{∞} x f(x) dx ∫_{-∞}^{∞} y f(y) dy    (since f(x, y) = f(x) f(y) for independent X and Y)
      = E(X) E(Y)
Example 9.25
Two coins are tossed one after another. Let X denote the outcome of the first toss and Y the outcome of the second toss, each taking the value 1 for a head and 0 for a tail. The joint probability distribution is given by

          X = 1   X = 0   Total
Y = 1     0.25    0.25    0.5
Y = 0     0.25    0.25    0.5
Total     0.5     0.5     1

Show that E(XY) = E(X) E(Y).
Solution:
E(XY) = ∑∑ x y p(x, y) = 1 × 1 × 0.25 = 0.25
E(X) = ∑ x pi = 1 × 0.5 + 0 × 0.5 = 0.5
E(Y) = ∑ y pj = 1 × 0.5 + 0 × 0.5 = 0.5
E(X) E(Y) = 0.5 × 0.5 = 0.25 = E(XY)
Hence E(XY) = E(X) E(Y).
Example 9.26
X and Y are independent continuous random variables with probability density functions
f(x) = 4ax, 0 ≤ x ≤ 1 (and 0 otherwise) and f(y) = 4by, 0 ≤ y ≤ 1 (and 0 otherwise).
Prove that E(XY) = E(X) E(Y).
Solution:
Since X and Y are independent, f(x, y) = f(x) f(y) = 4ax × 4by, i.e.

f(x, y) = 16abxy,  0 ≤ x, y ≤ 1
        = 0,       otherwise

E(XY) = ∫_{-∞}^{∞} ∫_{-∞}^{∞} x y f(x, y) dx dy
      = ∫_0^1 ∫_0^1 x y × 16abxy dx dy
      = 16ab ∫_0^1 (∫_0^1 x² dx) y² dy
      = 16ab × (1/3) ∫_0^1 y² dy
      = 16ab × (1/3) × (1/3) = 16ab/9    … (1)

E(X) = ∫_{-∞}^{∞} x f(x) dx = ∫_0^1 x × 4ax dx = 4a ∫_0^1 x² dx = 4a (1/3) = 4a/3

E(Y) = ∫_{-∞}^{∞} y f(y) dy = ∫_0^1 y × 4by dy = 4b ∫_0^1 y² dy = 4b (1/3) = 4b/3

E(X) E(Y) = (4a/3) × (4b/3) = 16ab/9    … (2)

From (1) and (2), E(XY) = E(X) E(Y).
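Keeping a and b symbolic, SymPy reproduces both sides of the identity; the sketch below is an illustration, not part of the original text.

# Symbolic check of Example 9.26 with general constants a and b.
import sympy as sp

x, y, a, b = sp.symbols("x y a b", positive=True)
f_x, f_y = 4*a*x, 4*b*y                       # marginal densities on [0, 1]

E_XY = sp.integrate(x*y * f_x*f_y, (x, 0, 1), (y, 0, 1))   # 16ab/9
E_X = sp.integrate(x * f_x, (x, 0, 1))                     # 4a/3
E_Y = sp.integrate(y * f_y, (y, 0, 1))                     # 4b/3
print(sp.simplify(E_XY - E_X*E_Y))                          # 0, so E(XY) = E(X)E(Y)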
9.8 Moments
Another approach helpful in finding summary measures for a probability distribution is based on 'moments'. We will discuss two types of moments.
(i) Moments about the origin (the origin may be zero or any other constant, say A). These are also called raw moments.
The r-th raw moment is μ′r = E(X^r). Put r = 1; then
μ′₁ = E(X) = ∑ xi pi = X̄
This is called the mean of the random variable X. Hence the first-order raw moment is the mean.
Put r = 2; then μ′₂ = E(X²), the second-order raw moment.
(ii) Moments about the mean, also called central moments: the r-th central moment is μr = E(X − X̄)^r. Put r = 1; then
μ₁ = E(X − X̄) = 0 (always)
Remark:
The algebraic sum of the deviations about the arithmetic mean is always 0
Put r = 2; then
μ₂ = E(X − X̄)²
   = E(X² − 2X X̄ + X̄²)
   = E(X²) − 2X̄ E(X) + X̄²    (since X̄ is a constant)
   = E(X²) − 2X̄ X̄ + X̄²
   = E(X²) − X̄²
μ₂ = μ′₂ − (μ′₁)²
This is called the second central moment (about the mean) and is known as the variance of the random variable X.
For a random variable X, the moments about the origin can be obtained from the moment generating function M_X(t):
E(X^r) = μ′r = [d^r M_X(t) / dt^r] evaluated at t = 0
We know that
M_X(t) = E(e^{tX})
       = E[1 + tX/1! + (tX)²/2! + (tX)³/3! + … + (tX)^r/r! + …]
From the series on the right-hand side, μ′r is the coefficient of t^r/r! in M_X(t).
Since M_X(t) generates the moments of the distribution, it is known as the moment generating function.
Using this function, we can find the mean and variance from the first two raw moments:
Mean = μ′₁ and Variance = μ₂ = μ′₂ − (μ′₁)²
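As an illustration of how the m.g.f. yields moments, the SymPy sketch below differentiates the m.g.f. of a fair die (the die example used earlier) and recovers the mean 7/2 and variance 35/12; it is not part of the original text.

# Moments of a fair die from its moment generating function M_X(t) = E(e^{tX}).
import sympy as sp

t = sp.symbols("t")
M = sum(sp.Rational(1, 6) * sp.exp(t * k) for k in range(1, 7))

mu1 = sp.diff(M, t, 1).subs(t, 0)   # first raw moment  = 7/2
mu2 = sp.diff(M, t, 2).subs(t, 0)   # second raw moment = 91/6
print(mu1, mu2, mu2 - mu1**2)       # variance = 35/12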
Definition:
The characteristic function of a random variable X, denoted by φ_X(t), is defined as φ_X(t) = E(e^{itX}). Then
φ_X(t) = E(e^{itX}) = ∫_{-∞}^{∞} e^{itx} f(x) dx, for a continuous random variable
φ_X(t) = ∑ e^{itx} p(x), for a discrete random variable
where i = √(−1).
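For a concrete illustration (not from the text), the characteristic function of a fair die can be evaluated directly in NumPy:

# Characteristic function of a fair die: phi_X(t) = (1/6) * sum over k of e^{itk}.
import numpy as np

def phi(t):
    return np.mean(np.exp(1j * t * np.arange(1, 7)))

print(phi(0.0))        # (1+0j): phi_X(0) is always 1
print(abs(phi(0.5)))   # the magnitude never exceeds 1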