chapter4slide(1)
STAT2001
2021 Term I
Outline
1. Continuous random variables
2. Expectations
3. The Uniform distribution and Exponential distribution
4. The Gamma distribution and Chi-square distribution
5. The Normal distribution
(Textbook chapters: 3.1 - 3.3)
1. Continuous random variables: c.d.f. and p.d.f.

The cumulative distribution function (c.d.f.) of a random variable X is:

    F(x) = P(X ≤ x).

A random variable X is continuous if there is a nonnegative function f, called the probability density function (p.d.f.) of X, such that

    F(x) = ∫_{−∞}^{x} f(t) dt.

That is, F(x) is the area under f up to x, and f(x) is the slope of F at x wherever the derivative exists.
c.d.f. and p.d.f.

Properties of a p.d.f. f(x):
1. f(x) ≥ 0 for all x.
2. P(a ≤ X ≤ b) = ∫_{a}^{b} f(x) dx.
3. The area bounded by the function and the x-axis equals 1, i.e.

    ∫_{−∞}^{∞} f(x) dx = 1.
Remarks:
1. The p.d.f. f(x) has no direct probability interpretation (in particular, f(x) can be greater than 1). However, integrating f(x) over an interval [a, b] gives the probability that X falls in that interval.
2. The support of a continuous random variable X is a set that contains all points x such that f(x) > 0.
3. By the Fundamental Theorem of Calculus, for x values at which the derivative F′(x) exists, we have

    f(x) = F′(x).
Example 1

A point is chosen at random inside a circle of radius 1, and X is its distance from the centre. For 0 ≤ x ≤ 1, the probability that the point falls within distance x of the centre is the ratio of the areas, so

    P(X ≤ x) = πx²/π = x².

Hence, the c.d.f. of X is:

    F(x) = 0,    x < 0,
           x²,   0 ≤ x < 1,
           1,    1 ≤ x.

The p.d.f. of X is

    f(x) = 2x,   0 ≤ x < 1,
           0,    otherwise.
Example 2: For a continuous random variable X with the following p.d.f.,

    f(x) = 10/x²,  x > 10;   f(x) = 0 otherwise.

We can check that ∫_{−∞}^{∞} f(x) dx = ∫_{10}^{∞} 10/x² dx = [−10/x]_{10}^{∞} = 1.
Percentiles

For 0 < p < 1, the (100p)th percentile π_p of X is a value such that F(π_p) = p.

In particular, we call π_0.5 the Median, π_0.25 the First Quartile and π_0.75 the Third Quartile.

Example: If the c.d.f. of a random variable is F(x) = 1 − exp(−x), x > 0, then

    F(π_0.5) = 1 − exp(−π_0.5) = 0.5
    ⟹ Median = π_0.5 = −ln(0.5) = ln 2 ≈ 0.6931.
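As a quick numerical illustration (not part of the original slides), a percentile can also be found by inverting the c.d.f. with bisection; the helper `percentile` below is written for this sketch, not a library function.

```python
import math

def exp_cdf(x: float) -> float:
    """c.d.f. of the unit-rate exponential: F(x) = 1 - exp(-x)."""
    return 1.0 - math.exp(-x)

def percentile(cdf, p: float, lo: float = 0.0, hi: float = 100.0) -> float:
    """Invert an increasing c.d.f. by bisection to find pi_p with F(pi_p) = p."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if cdf(mid) < p:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

median = percentile(exp_cdf, 0.5)
print(median, math.log(2))  # both ≈ 0.6931
```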
2. Expectations

For a continuous random variable X with p.d.f. f(x):

1. Expected value or Mean of X:

    µ = E(X) = ∫_{−∞}^{∞} x f(x) dx.

2. More generally, for a function u, the expectation of u(X) is

    E[u(X)] = ∫_{−∞}^{∞} u(x) f(x) dx.

3. Variance of X:

    σ² = Var(X) = E[(X − µ)²] = ∫_{−∞}^{∞} (x − µ)² f(x) dx.

4. Standard deviation of X is simply σ = √σ².

5. Moment generating function:

    M(t) = ∫_{−∞}^{∞} exp(tx) f(x) dx,

if there exists some h > 0 such that the above integral is finite for −h < t < h.
2. Expectations

Example 4

If the p.d.f. of a random variable is f(x) = λ exp(−λx), x > 0, λ > 0, then

    F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{0}^{x} λ exp(−λt) dt = 1 − exp(−λx),  x > 0.

    µ = ∫_{−∞}^{∞} t f(t) dt = ∫_{0}^{∞} t λ exp(−λt) dt
      = [−t exp(−λt)]_{0}^{∞} + ∫_{0}^{∞} exp(−λt) dt   (using integration by parts)
      = 0 + 1/λ = 1/λ,

where the bracketed term vanishes at ∞ (by L'Hôpital's rule).
2. Expectations

Another way to find µ is to use the moment generating function. For t < λ,

    M(t) = ∫_{0}^{∞} exp(tx) λ exp(−λx) dx = λ ∫_{0}^{∞} exp(−(λ − t)x) dx = λ/(λ − t).

    µ = E(X) = M′(0) = λ/(λ − 0)² = 1/λ.

We can also calculate the variance:

    E(X²) = M″(0) = 2λ/(λ − 0)³ = 2/λ²

    σ² = E(X²) − µ² = 2/λ² − 1/λ² = 1/λ².
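The mean and variance of Example 4 can be checked by direct numerical integration; this sketch uses an arbitrary rate λ = 2 and a simple midpoint rule written here for illustration.

```python
import math

lam = 2.0  # an arbitrary rate chosen for illustration

def integrate(g, a, b, n=100_000):
    """Midpoint-rule numerical integration of g on [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

pdf = lambda x: lam * math.exp(-lam * x)

mean = integrate(lambda x: x * pdf(x), 0.0, 50.0)
second_moment = integrate(lambda x: x * x * pdf(x), 0.0, 50.0)
var = second_moment - mean**2

print(mean, 1 / lam)    # ≈ 0.5
print(var, 1 / lam**2)  # ≈ 0.25
```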
2. Expectations

Example: Consider the double exponential p.d.f.

    f(x) = (1/2) exp(x) for x < 0,   f(x) = (1/2) exp(−x) for x ≥ 0.

    F(x) = ∫_{−∞}^{x} f(t) dt.

For x < 0,

    F(x) = ∫_{−∞}^{x} (1/2) exp(t) dt = (1/2) exp(x).

For x ≥ 0,

    F(x) = ∫_{−∞}^{0} (1/2) exp(t) dt + ∫_{0}^{x} (1/2) exp(−t) dt = 1/2 − (1/2)[exp(−x) − 1] = 1 − (1/2) exp(−x).

For |t| < 1,

    M(t) = (1/2) ∫_{−∞}^{0} exp(tx) exp(x) dx + (1/2) ∫_{0}^{∞} exp(tx) exp(−x) dx
         = (1/2) [exp((t + 1)x)/(t + 1)]_{−∞}^{0} + (1/2) [exp((t − 1)x)/(t − 1)]_{0}^{∞}
         = 1/(2(t + 1)) − 1/(2(t − 1)) = 1/(1 − t²).

So, µ = M′(0) = 0 and σ² = E(X²) − µ² = M″(0) − 0 = 2.
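The closed form M(t) = 1/(1 − t²) can be verified numerically at a sample point; t = 0.3 below is an arbitrary choice inside |t| < 1, and `integrate` is a helper written for this sketch.

```python
import math

def integrate(g, a, b, n=100_000):
    """Midpoint-rule numerical integration of g on [a, b]."""
    h = (b - a) / n
    return sum(g(a + (i + 0.5) * h) for i in range(n)) * h

def laplace_pdf(x):
    # f(x) = (1/2) e^x for x < 0 and (1/2) e^{-x} for x >= 0, i.e. (1/2) e^{-|x|}
    return 0.5 * math.exp(-abs(x))

t = 0.3  # any |t| < 1 works; 0.3 is an arbitrary test point
mgf = integrate(lambda x: math.exp(t * x) * laplace_pdf(x), -60.0, 60.0)
print(mgf, 1 / (1 - t**2))  # both ≈ 1.0989
```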
3. Uniform distribution and Exponential distribution

Uniform distribution

X follows a Uniform distribution if the probabilities of X falling in subintervals of equal length, within an interval [a, b], are the same.

X ~ U(a, b) with a < b if

    f(x) = 1/(b − a) for a ≤ x ≤ b,   f(x) = 0 otherwise.

Using the p.d.f., we can find

    F(x) = (x − a)/(b − a) for a ≤ x ≤ b;   F(x) = 0 for x < a;   F(x) = 1 for x > b.

    µ = (a + b)/2

    σ² = (b − a)²/12

    M(t) = (exp(tb) − exp(ta))/(t(b − a)) for t ≠ 0, and M(0) = 1.
Derivation of µ and σ²:

    µ = ∫_{a}^{b} x/(b − a) dx = (b² − a²)/(2(b − a)) = (a + b)/2.

    E(X²) = ∫_{a}^{b} x²/(b − a) dx = (b³ − a³)/(3(b − a)) = (b² + ab + a²)/3.

    σ² = E(X²) − µ² = (b² + ab + a²)/3 − (a + b)²/4 = (b − a)²/12.
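A simulation check of the Uniform mean and variance (not in the original slides); the endpoints a = 2, b = 7 are arbitrary choices for illustration.

```python
import random
import statistics

random.seed(0)
a, b = 2.0, 7.0  # arbitrary endpoints chosen for illustration
samples = [random.uniform(a, b) for _ in range(200_000)]

mean_hat = statistics.fmean(samples)
var_hat = statistics.pvariance(samples)
print(mean_hat, (a + b) / 2)      # ≈ 4.5
print(var_hat, (b - a)**2 / 12)   # ≈ 2.0833
```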
Uniform distribution

Example: If buses arrive at a bus stop every 15 minutes starting at 7:00 a.m., and a passenger arrives at the stop at a time that is uniformly distributed between 7:00 a.m. and 7:30 a.m., find the probability that he has to wait more than 10 minutes for a bus.

Answer: The passenger waits more than 10 minutes if he arrives during (7:00, 7:05) or (7:15, 7:20). Denote by X the number of minutes past 7:00 that the passenger arrives; then X ~ U(0, 30) and

    P(wait > 10) = P(0 < X < 5) + P(15 < X < 20) = 5/30 + 5/30 = 1/3.
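The bus-stop answer of 1/3 can be checked by simulation; the following sketch draws arrival times uniformly on (0, 30) and uses the fact that, with buses at 0, 15 and 30 minutes, the wait exceeds 10 minutes exactly when the arrival time modulo 15 is below 5.

```python
import random

random.seed(1)
n = 300_000
# Arrival minutes past 7:00; wait > 10 minutes exactly when x mod 15 < 5.
frac = sum(random.uniform(0.0, 30.0) % 15 < 5 for _ in range(n)) / n
print(frac)  # ≈ 1/3
```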
Exponential distribution

X follows an Exponential distribution with parameter θ > 0 if

    f(x) = (1/θ) exp(−x/θ),  0 ≤ x < ∞.

Using the p.d.f., we can find

    F(x) = ∫_{−∞}^{x} f(t) dt = ∫_{0}^{x} (1/θ) exp(−t/θ) dt
         = 1 − exp(−x/θ), for 0 ≤ x < ∞,

    F(x) = 0, for −∞ < x < 0.

    µ = θ.

    σ² = θ².

    M(t) = 1/(1 − θt),  t < 1/θ.

Remark: we omit the details of derivation here because it is the same as Example 4 above (by changing θ to 1/λ).
Exponential distribution

The Exponential distribution arises as the waiting time until the first occurrence of a Poisson process. Let W be the waiting time until the first occurrence of a Poisson process with rate λ, and let N_w be the number of occurrences in [0, w], so that N_w ~ Poisson(λw). For w > 0,

    F_W(w) = P(W ≤ w) = 1 − P(W > w) = 1 − P(N_w = 0) = 1 − exp(−λw).

This is the c.d.f. of the Exponential distribution with θ = 1/λ.

Later on, we will have a more general result connecting the waiting time of the kth occurrence of a Poisson process with another continuous random variable.
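The identity P(W > w) = exp(−λw) can be illustrated by sampling W via the inverse-transform method; the rate λ = 1.5 and the point w = 0.8 below are arbitrary choices for this sketch.

```python
import math
import random

random.seed(2)
lam = 1.5  # assumed Poisson rate, for illustration
w = 0.8
n = 200_000

# Sample W ~ Exp(theta = 1/lam) by inverse transform: W = -ln(1 - U) / lam,
# then compare the empirical tail P(W > w) with exp(-lam * w) = P(N_w = 0).
tail = sum(-math.log(1.0 - random.random()) / lam > w for _ in range(n)) / n
print(tail, math.exp(-lam * w))  # both ≈ 0.3012
```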
Exponential distribution

Example 7:
Suppose that people immigrate into a country at a Poisson rate λ = 1 per day.
(a) What is the probability that the first immigrant arrives after two days? Ans: Let W be the waiting time,

    P(W > 2) = exp(−1 × 2) = exp(−2) = 0.135.

(b) What is the expected waiting time for the first immigrant? Ans: W follows an exponential distribution with parameter 1. So the expected waiting time = E(W) = 1 day.
(c) If we have already waited for 3 days, what is the probability that the first immigrant will arrive after the fifth day?
Ans:

    P(W > 5 | W > 3) = P(W > 5, W > 3)/P(W > 3) = P(W > 5)/P(W > 3) = exp(−5)/exp(−3) = exp(−2) = 0.135.

Note that we got the same number from (a) and (c). This is the "no memory" or "memoryless" property of the exponential distribution. The Geometric distribution also has this property.
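A simulation sketch of the memoryless property in Example 7: both the unconditional tail P(W > 2) and the conditional tail P(W > 5 | W > 3) should be close to exp(−2).

```python
import math
import random

random.seed(3)
n = 300_000
samples = [-math.log(1.0 - random.random()) for _ in range(n)]  # Exp(1) draws

p_gt2 = sum(w > 2 for w in samples) / n
past3 = [w for w in samples if w > 3]
p_gt5_given_gt3 = sum(w > 5 for w in past3) / len(past3)

print(p_gt2, p_gt5_given_gt3, math.exp(-2))  # all ≈ 0.135
```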
4. Gamma distribution

For a Poisson process with rate λ, if we are now interested in the waiting time W until the αth occurrence (α a positive integer), then for w > 0,

    F(w) = P(W ≤ w) = 1 − P(fewer than α occurrences in [0, w])
         = 1 − Σ_{k=0}^{α−1} exp(−λw)(λw)^k / k!.

Differentiating,

    f(w) = F′(w) = λ exp(−λw) + Σ_{k=1}^{α−1} [ λ exp(−λw)(λw)^k/k! − λ exp(−λw)(λw)^{k−1}/(k − 1)! ].

The sum telescopes, leaving

    f(w) = λ exp(−λw) + λ exp(−λw)(λw)^{α−1}/(α − 1)! − λ exp(−λw)
         = λ(λw)^{α−1} exp(−λw)/(α − 1)!.
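The telescoping result can be checked numerically by comparing the closed-form density against a central-difference derivative of F; the rate λ = 2, order α = 3 and point w = 1.3 below are arbitrary choices for this sketch.

```python
import math

lam, alpha = 2.0, 3  # assumed rate and order, chosen for illustration

def F(w):
    """P(W <= w) = 1 - P(fewer than alpha arrivals in [0, w])."""
    return 1.0 - sum(math.exp(-lam * w) * (lam * w)**k / math.factorial(k)
                     for k in range(alpha))

def f(w):
    """The closed-form p.d.f. obtained after the telescoping sum."""
    return lam * (lam * w)**(alpha - 1) * math.exp(-lam * w) / math.factorial(alpha - 1)

w, h = 1.3, 1e-6
numeric = (F(w + h) - F(w - h)) / (2 * h)  # central-difference derivative of F
print(numeric, f(w))  # the two values agree to several decimals
```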
Gamma distribution

Writing θ = 1/λ, the p.d.f. derived above becomes

    f(w) = w^{α−1} exp(−w/θ) / (θ^α (α − 1)!),  0 ≤ w < ∞.

We have:

    M(t) = ∫_{0}^{∞} exp(wt) w^{α−1} exp(−w/θ)/(θ^α (α − 1)!) dw
         = ∫_{0}^{∞} w^{α−1} exp(−w(1/θ − t))/(θ^α (α − 1)!) dw
         = (1/θ − t)^{−α}/θ^α × ∫_{0}^{∞} (1/θ − t)^α w^{α−1} exp(−w(1/θ − t))/(α − 1)! dw
         = 1/(1 − θt)^α,   for t < 1/θ,

since the last integrand is itself a Gamma p.d.f. and integrates to 1. Hence,

    µ = M′(0) = αθ

    σ² = M″(0) − (M′(0))² = (α + 1)αθ² − α²θ² = αθ².
Gamma distribution

Example 8
Phone calls arrive at a mean rate of λ = 2 per minute according to a Poisson Process. Let W be the waiting time in minutes until the fourth call; what is the distribution of W? What is the expected time you need to wait for the arrival of the fourth call?
Answer: W ~ Gamma(α = 4, θ = 1/2), so the expected waiting time is E(W) = αθ = 4 × (1/2) = 2 minutes.

Another example: Phone calls arrive at a mean rate of λ = 4 per minute according to a Poisson Process. Let W be the waiting time in minutes until the second call; what is the distribution of W? Find P(W > 1).
Answer:
Clearly W ~ Gamma(2, 1/4).
Method 1:
Let X = number of arrivals in [0, 1]. Then X ~ Poisson(4) and

    P(W > 1) = P(X = 0) + P(X = 1) = e^{−4} + 4e^{−4}/1! = 5e^{−4} = 0.09158.
Method 2:

    f_W(w) = w^{2−1} e^{−4w} / ((1/4)² (2 − 1)!) = 16 w e^{−4w}

    P(W > 1) = ∫_{1}^{∞} f(w) dw = 16 ∫_{1}^{∞} w e^{−4w} dw
             = 16 ( [−(1/4) w e^{−4w}]_{1}^{∞} + (1/4) ∫_{1}^{∞} e^{−4w} dw )   (integration by parts)
             = 4 e^{−4} + e^{−4} = 5 e^{−4} = 0.09158.
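The answer 5e^{−4} follows the Poisson-count identity of Method 1 and can be reproduced directly:

```python
import math

# W ~ Gamma(alpha = 2, theta = 1/4); P(W > 1) equals the probability that a
# Poisson(lam * w) count has fewer than alpha arrivals (Method 1 above).
alpha, lam, w = 2, 4.0, 1.0
p_tail = sum(math.exp(-lam * w) * (lam * w)**k / math.factorial(k)
             for k in range(alpha))
print(p_tail, 5 * math.exp(-4))  # both 0.0915781...
```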
Example (using the χ² table): Suppose X ~ χ²(4) and that 15 independent observations of X are taken. Find the probability that at most 3 of the 15 observations exceed 7.779.
Answer:
X ~ χ²(4), and

    P(X > 7.779) = 0.1 (from Table IV of the textbook).

Let Y be the number of observations exceeding 7.779 of the 15 observations. Then Y ~ b(15, 0.1) and

    P(Y ≤ 3) = Σ_{k=0}^{3} C(15, k) (0.1)^k (0.9)^{15−k} = 0.9444.
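The binomial sum can be evaluated directly:

```python
import math

# Y ~ b(15, 0.1): probability that at most 3 of 15 observations exceed 7.779.
p = 0.1
prob = sum(math.comb(15, k) * p**k * (1 - p)**(15 - k) for k in range(4))
print(round(prob, 4))  # 0.9444
```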
Gamma function

The Gamma function is defined as

    Γ(t) = ∫_{0}^{∞} y^{t−1} exp(−y) dy,  for t > 0.

For t > 1, integration by parts gives

    Γ(t) = ∫_{0}^{∞} y^{t−1} exp(−y) dy
         = [−y^{t−1} exp(−y)]_{0}^{∞} + (t − 1) ∫_{0}^{∞} y^{t−2} exp(−y) dy
         = (t − 1) Γ(t − 1).

It is easy to check that Γ(1) = 1. Thus, for t = n where n is an integer greater than 1, we have

    Γ(n) = (n − 1)Γ(n − 1) = (n − 1)(n − 2)Γ(n − 2) = ⋯ = (n − 1)!.
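Python's standard library exposes the Gamma function as `math.gamma`, so both the recursion and the factorial identity can be checked in a couple of lines:

```python
import math

# math.gamma implements the Gamma function; check the recursion
# Γ(t) = (t - 1)Γ(t - 1) and Γ(n) = (n - 1)! for integer n.
for n in range(2, 8):
    assert math.isclose(math.gamma(n), (n - 1) * math.gamma(n - 1))
    assert math.isclose(math.gamma(n), math.factorial(n - 1))
print(math.gamma(5))  # 24.0
```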
Gamma distribution (allowing non-integer α)

Now, a general definition: a Gamma random variable with parameters α > 0, θ > 0 has p.d.f.

    f(w) = w^{α−1} exp(−w/θ) / (θ^α Γ(α)),  w ≥ 0,

with

    µ = αθ,   σ² = αθ²,   M(t) = 1/(1 − θt)^α  for t < 1/θ.
Chi-squared distribution

The Chi-squared distribution with r degrees of freedom, χ²(r) (r a positive integer), is the Gamma distribution with α = r/2 and θ = 2. It is closely related to the Normal distribution (the subject of the next section) and is very useful in statistical inference.

For X ~ χ²(r),

    f(x) = x^{r/2 − 1} exp(−x/2) / (2^{r/2} Γ(r/2)),  for x ≥ 0,

    µ = (r/2) × 2 = r,

    σ² = (r/2) × 2² = 2r,

    M(t) = 1/(1 − 2t)^{r/2},  t < 1/2.

Table IV in Appendix B of the Textbook contains the c.d.f. values for χ² distributions.
5. Normal distribution

The Normal distribution is widely used in statistics because of the central limit theorem, which says that under very weak assumptions, the sum of a large number of i.i.d. random variables is approximately Normally distributed, regardless of the distributions of the individual random variables. We will study this theorem in chapter 6.

X ~ N(µ, σ²) if

    f(x) = 1/(σ√(2π)) exp(−(x − µ)²/(2σ²))  for −∞ < x < ∞.

We use µ, σ² as the symbols of our parameters because they correspond to the mean and variance of the Normal distribution.
Moment Generating Function

    M(t) = ∫_{−∞}^{∞} exp(tx) 1/(σ√(2π)) exp(−(x − µ)²/(2σ²)) dx.

Completing the square in the exponent,

    M(t) = exp( ((µ + σ²t)² − µ²)/(2σ²) ) ∫_{−∞}^{∞} 1/(σ√(2π)) exp(−(x − (µ + σ²t))²/(2σ²)) dx
         = exp( ((µ + σ²t)² − µ²)/(2σ²) )   (the integrand is the p.d.f. of N(µ + σ²t, σ²))
         = exp( µt + σ²t²/2 ).
Normal distribution

Therefore

    E(X) = M′(0) = exp(µ(0) + σ²(0)²/2) × (µ + σ²(0)) = µ

    E(X²) = M″(0) = d/dt [ exp(µt + σ²t²/2)(µ + σ²t) ] |_{t=0} = µ² + σ²

    Var(X) = E(X²) − µ² = σ².

Remarks:
1. The c.d.f. of N(µ, σ²) cannot be obtained in closed form. We have to transform the Normal distribution into the standard Normal distribution (to be discussed in the next subsection) in order to find the cumulative probabilities using a table (Tables Va, Vb of Appendix B of the textbook).
2. N(µ, σ²) is symmetric about µ, unimodal and bell-shaped.
N(0, 1)

N(0, 1) is called the standard Normal distribution.

Theorem:
If X ~ N(µ, σ²), then Z = (X − µ)/σ ~ N(0, 1).

Proof:

    P(Z < z) = P((X − µ)/σ < z) = P(X < µ + σz) = ∫_{−∞}^{µ+σz} 1/(σ√(2π)) exp(−(x − µ)²/(2σ²)) dx.

By the change of variable x = µ + σw,

    P(Z < z) = ∫_{−∞}^{z} 1/√(2π) exp(−w²/2) dw.

Thus Z ~ N(0, 1).
Standardization

The consequence is that we do not need a table for each Normal distribution; we only need one table for the Standard Normal distribution.

Example:
Suppose that X ~ N(3, 4). Find P(X < 5) and P(0 < X < 5).

    P(X < 5) = P((X − 3)/2 < (5 − 3)/2) = P(Z < 1).

According to the Normal table,

    P(X < 5) = P(Z < 1) = 0.8413.

    P(0 < X < 5) = P((0 − 3)/2 < Z < 1) = P(−1.5 < Z < 1)
                 = 0.8413 − (1 − 0.9332) = 0.7745

(by the table and also the symmetry of Z about 0, P(Z < −1.5) = 1 − P(Z < 1.5)).
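The table lookups above can be reproduced without a table, since the standard Normal c.d.f. can be written with the error function; `phi` below is a helper written for this sketch.

```python
import math

def phi(z: float) -> float:
    """Standard Normal c.d.f. via the error function: Φ(z) = (1 + erf(z/√2))/2."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

mu, sigma = 3.0, 2.0  # X ~ N(3, 4), so the standard deviation is 2
p1 = phi((5 - mu) / sigma)       # P(X < 5) = P(Z < 1)
p2 = p1 - phi((0 - mu) / sigma)  # P(0 < X < 5) = P(-1.5 < Z < 1)
print(round(p1, 4), round(p2, 4))  # 0.8413 0.7745
```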
Relationship with χ²(1)

We now investigate the relationship with χ²(1). We have the result:

    X ~ N(0, 1) ⟹ Y = X² ~ χ²(1).

Proof:

    P(Y < y) = P(X² < y) = P(−√y < X < √y)
             = ∫_{−√y}^{√y} 1/√(2π) exp(−x²/2) dx
             = 2 ∫_{0}^{√y} 1/√(2π) exp(−x²/2) dx.

By the change of variable x = √w, which implies dx = 1/(2√w) dw,

    P(Y < y) = ∫_{0}^{y} 1/√(2πw) exp(−w/2) dw.

Thus the p.d.f. of Y is

    f(y) = 1/√(2π) y^{1/2 − 1} exp(−y/2).

This is the p.d.f. of χ²(1) (as a result, we also get Γ(1/2) = √π).
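Both conclusions can be sanity-checked: `math.gamma` gives Γ(1/2) = √π exactly, and squaring standard Normal draws should give sample mean close to 1, the mean of χ²(1).

```python
import math
import random

# Check Γ(1/2) = √π, and that squares of standard Normal draws
# average to 1, the mean of χ²(1).
g_half = math.gamma(0.5)
print(g_half, math.sqrt(math.pi))  # both ≈ 1.77245

random.seed(4)
n = 200_000
mean_sq = sum(random.gauss(0.0, 1.0)**2 for _ in range(n)) / n
print(mean_sq)  # ≈ 1.0
```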
Appendix: Fundamental Theorem of Calculus 28
Appendix: Integration by parts 29