STAT422 - Slides
Fastel Chipepa
February 7, 2024
1 CHAPTER 1
1.1: Sample Space and Events
1.2: Probabilities Defined on Events
1.3: Conditional Probability
1.4: Independent Events
1.5: Bayes’ Formula
2 CHAPTER 2
2.1: Random Variables
2.2: Discrete Random Variables
2.3: Continuous Random Variables
2.4: Expectation of a Random Variable
2.5: Jointly Distributed Random Variables
The union $\bigcup_{n=1}^{\infty} E_n$ is defined to be the event that consists of all outcomes that are in $E_n$ for at least one value of $n = 1, 2, 3, \ldots$.
The intersection $\bigcap_{n=1}^{\infty} E_n$ is defined to be the event consisting of those outcomes that are in all of the events $E_n$, $n = 1, 2, 3, \ldots$.
For an event $E$, $E^c$ (the complement of $E$) consists of all outcomes in the sample space $S$ that are not in $E$.
Given a sample space S, for each event E, we assume that a number P(E)
is defined and satisfies the following 3 conditions
(i) $0 \le P(E) \le 1$
(ii) $P(S) = 1$
(iii) For any sequence of events $E_1, E_2, \ldots$ that are mutually exclusive,
$$P\left(\bigcup_{n=1}^{\infty} E_n\right) = \sum_{n=1}^{\infty} P(E_n),$$
where $P(E)$ is the probability of event $E$.
Since the events $E$ and $E^c$ are always mutually exclusive and since $E \cup E^c = S$, we have by (ii) and (iii) that
$$1 = P(S) = P(E \cup E^c) = P(E) + P(E^c), \quad\text{or}\quad P(E^c) = 1 - P(E).$$
For events $E$ and $F$,
$$P(E) + P(F) = P(E \cup F) + P(EF), \quad\text{or}\quad P(E \cup F) = P(E) + P(F) - P(EF),$$
where $EF = E \cap F$. In particular, $P(E \cup F) = P(E) + P(F)$ if $E$ and $F$ are mutually exclusive.
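As a quick numerical illustration (not part of the slides), the sketch below checks the inclusion-exclusion identity $P(E \cup F) = P(E) + P(F) - P(EF)$ for one roll of a fair die, taking $E$ = "even outcome" and $F$ = "outcome greater than 3"; the choice of events is arbitrary.

```python
from fractions import Fraction

# Sample space for one roll of a fair die; each outcome has probability 1/6.
S = {1, 2, 3, 4, 5, 6}
prob = lambda event: Fraction(len(event), len(S))

E = {2, 4, 6}   # even outcomes
F = {4, 5, 6}   # outcomes greater than 3

lhs = prob(E | F)                       # P(E U F) computed directly from the union
rhs = prob(E) + prob(F) - prob(E & F)   # inclusion-exclusion
print(lhs, rhs)                         # both equal 2/3
```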
Homework 1
Conditional Probability
Homework 2
1. Suppose an urn contains seven black balls and five white balls. We draw
two balls from the urn without replacement. Assuming that each ball in
the urn is equally likely to be drawn, what is the probability that both
drawn balls are black?
2. Suppose that each of three men at a party throws his hat into the center
of the room. The hats are first mixed up and then each man randomly
selects a hat. What is the probability that none of the three men selects
his own hat?
Independent Events
Bayes’ Formula
$$
\begin{aligned}
P(E) &= P(E \cap F) + P(E \cap F^c) \\
&= P(E \mid F)P(F) + P(E \mid F^c)P(F^c) \\
&= P(E \mid F)P(F) + P(E \mid F^c)\big(1 - P(F)\big).
\end{aligned}
$$
Example 1.3: Consider two urns. The first contains two white and seven
black balls, and the second contains five white and six black balls. We flip
a fair coin and then draw a ball from the first urn or the second urn
depending on whether the outcome was heads or tails. What is the
conditional probability that the outcome of the toss was heads given that
a white ball was selected?
Solution: Let W be the event that a white ball is drawn, and let H be the
event that the coin comes up heads. The desired probability $P(H \mid W)$ may be calculated as follows:
$$
\begin{aligned}
P(H \mid W) &= \frac{P(H \cap W)}{P(W)} \\
&= \frac{P(W \mid H)P(H)}{P(W)} \\
&= \frac{P(W \mid H)P(H)}{P(W \mid H)P(H) + P(W \mid H^c)P(H^c)} \\
&= \frac{\frac{2}{9}\cdot\frac{1}{2}}{\frac{2}{9}\cdot\frac{1}{2} + \frac{5}{11}\cdot\frac{1}{2}} = \frac{22}{67}.
\end{aligned}
$$
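A small Monte Carlo sketch (not from the slides) to check the answer $22/67 \approx 0.328$; the urn compositions are those of Example 1.3, and the sample size is an arbitrary choice.

```python
import random

def trial():
    """Flip a fair coin, then draw from urn 1 (heads) or urn 2 (tails)."""
    heads = random.random() < 0.5
    urn = ["W"] * 2 + ["B"] * 7 if heads else ["W"] * 5 + ["B"] * 6
    return heads, random.choice(urn)

N = 200_000
draws = [trial() for _ in range(N)]
heads_given_white = [h for h, colour in draws if colour == "W"]
print(sum(heads_given_white) / len(heads_given_white))   # close to 22/67 ≈ 0.3284
```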
Suppose that $F_1, F_2, \ldots, F_n$ are mutually exclusive events such that $\bigcup_{i=1}^{n} F_i = S$. In other words, exactly one of the events $F_1, F_2, \ldots, F_n$ will occur. By writing $E = \bigcup_{i=1}^{n} EF_i$ and using the fact that the events $EF_i$, $i = 1, 2, \ldots, n$, are mutually exclusive, we obtain that
$$P(E) = \sum_{i=1}^{n} P(EF_i) = \sum_{i=1}^{n} P(E \mid F_i)P(F_i).$$
Hence, for each $j$,
$$P(F_j \mid E) = \frac{P(EF_j)}{P(E)} = \frac{P(E \mid F_j)P(F_j)}{\sum_{i=1}^{n} P(E \mid F_i)P(F_i)}.$$
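The formula above is straightforward to mechanize. Below is a minimal sketch, assuming a hypothetical helper name `posterior` and made-up three-hypothesis numbers, that turns priors $P(F_j)$ and likelihoods $P(E \mid F_j)$ into posteriors $P(F_j \mid E)$.

```python
def posterior(priors, likelihoods):
    """Bayes' formula: P(F_j | E) = P(E | F_j) P(F_j) / sum_i P(E | F_i) P(F_i)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]  # P(E | F_j) P(F_j)
    total = sum(joint)                                    # P(E), by the total probability formula
    return [j / total for j in joint]

# Hypothetical example: three equally likely urns with different chances of drawing white.
print(posterior([1/3, 1/3, 1/3], [2/9, 5/11, 1/2]))       # posterior probabilities, summing to 1
```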
Homework 3
Random Variables
Homework 4
1. Suppose that an airplane engine will fail, when in flight, with probability
$1 - p$, independently from engine to engine; suppose that the airplane will
make a successful flight if at least 50 percent of its engines remain
operative. For what values of p is a four-engine plane preferable to a
two-engine plane?
2. Consider an experiment that consists of counting the number of
α-particles given off in a one-second interval by one gram of radioactive
material. If we know from past experience that, on the average, 3.2 such
α-particles are given off, what is a good approximation to the probability
that no more than two α-particles will appear?
The function $f(x)$ is called the probability density function of the random variable $X$. For example, $X$ is said to be uniformly distributed over $(0, 1)$ if its pdf is
$$f(x) = \begin{cases} 1, & 0 < x < 1 \\ 0, & \text{otherwise.} \end{cases}$$
In general, we say that $X$ is a uniform random variable on the interval $(\alpha, \beta)$ if its pdf is given by
$$f(x) = \begin{cases} \dfrac{1}{\beta - \alpha}, & \alpha < x < \beta \\ 0, & \text{otherwise.} \end{cases}$$
Example 1.4: Calculate the cumulative distribution function of a random
variable uniformly distributed over pα, β q.
Solution:
$$F(a) = \begin{cases} 0, & a \le \alpha \\ \dfrac{a - \alpha}{\beta - \alpha}, & \alpha < a < \beta \\ 1, & a \ge \beta. \end{cases}$$
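A quick sketch (not in the slides; $\alpha = 2$, $\beta = 5$ and the evaluation points are arbitrary) comparing the empirical cdf of uniform draws with the formula from Example 1.4.

```python
import numpy as np

alpha, beta = 2.0, 5.0
rng = np.random.default_rng(0)
x = rng.uniform(alpha, beta, size=100_000)

for a in [2.5, 3.5, 4.5]:
    empirical = np.mean(x <= a)             # estimate of P{X <= a}
    exact = (a - alpha) / (beta - alpha)    # F(a) from Example 1.4
    print(a, round(empirical, 3), round(exact, 3))
```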
Proof: Suppose that $X$ is normally distributed with parameters $\mu$ and $\sigma^2$ and let $Y = \alpha X + \beta$. Suppose that $\alpha > 0$ and note that $F_Y(\cdot)$, the cdf of the random variable $Y$, is given by
$$
\begin{aligned}
F_Y(a) &= P\{Y \le a\} \\
&= P\{\alpha X + \beta \le a\} \\
&= P\left\{X \le \frac{a - \beta}{\alpha}\right\} \\
&= F_X\left(\frac{a - \beta}{\alpha}\right) \\
&= \int_{-\infty}^{(a-\beta)/\alpha} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(x-\mu)^2/2\sigma^2}\, dx \\
&= \int_{-\infty}^{a} \frac{1}{\sqrt{2\pi}\,\alpha\sigma} \exp\left\{ \frac{-(v - (\alpha\mu + \beta))^2}{2\alpha^2\sigma^2} \right\} dv,
\end{aligned}
$$
where the last equality is obtained by the change of variables $v = \alpha x + \beta$. However, since $F_Y(a) = \int_{-\infty}^{a} f_Y(v)\, dv$, it follows that the pdf $f_Y(\cdot)$ is given by
$$f_Y(v) = \frac{1}{\sqrt{2\pi}\,\alpha\sigma} \exp\left\{ \frac{-(v - (\alpha\mu + \beta))^2}{2\alpha^2\sigma^2} \right\}, \quad -\infty < v < \infty.$$
Hence, $Y$ is normally distributed with parameters $\alpha\mu + \beta$ and $\alpha^2\sigma^2$.
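A short simulation sketch (not from the slides; all parameter values are arbitrary) checking that $\alpha X + \beta$ has mean $\alpha\mu + \beta$ and standard deviation $\alpha\sigma$.

```python
import numpy as np

mu, sigma = 1.5, 2.0      # parameters of X
alpha, beta = 3.0, -4.0   # Y = alpha * X + beta

rng = np.random.default_rng(1)
x = rng.normal(mu, sigma, size=500_000)
y = alpha * x + beta

print(y.mean(), alpha * mu + beta)   # both close to 0.5
print(y.std(), alpha * sigma)        # both close to 6.0
```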
Expectation of a Random Variable
If $X$ is a binomial random variable with parameters $n$ and $p$, then
$$
\begin{aligned}
E[X] &= \sum_{i=0}^{n} i\, P(i) \\
&= \sum_{i=0}^{n} i \binom{n}{i} p^i (1-p)^{n-i} \\
&= \sum_{i=1}^{n} \frac{i\, n!}{(n-i)!\, i!}\, p^i (1-p)^{n-i} \\
&= np \sum_{i=1}^{n} \frac{(n-1)!}{(n-i)!\,(i-1)!}\, p^{i-1} (1-p)^{n-i} \\
&= np \sum_{k=0}^{n-1} \binom{n-1}{k} p^{k} (1-p)^{n-1-k} \\
&= np\,[p + (1-p)]^{n-1} \\
&= np,
\end{aligned}
$$
where the second from the last equality follows by letting $k = i - 1$.
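An exact numerical check of the result (the particular $n$ and $p$ are arbitrary): summing $i\binom{n}{i}p^i(1-p)^{n-i}$ reproduces $np$.

```python
from math import comb

n, p = 12, 0.3
mean = sum(i * comb(n, i) * p**i * (1 - p)**(n - i) for i in range(n + 1))
print(mean, n * p)   # both 3.6, up to floating-point rounding
```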
If $X$ is a geometric random variable with parameter $p$, then
$$
\begin{aligned}
E[X] &= \sum_{n=1}^{\infty} n p (1-p)^{n-1} \\
&= p \sum_{n=1}^{\infty} n q^{n-1} \quad (\text{where } q = 1 - p) \\
&= p \sum_{n=1}^{\infty} \frac{d}{dq}\left(q^n\right) \\
&= p \frac{d}{dq}\left( \sum_{n=1}^{\infty} q^n \right) \\
&= p \frac{d}{dq}\left( \frac{q}{1-q} \right) \\
&= \frac{p}{(1-q)^2} \\
&= \frac{1}{p}.
\end{aligned}
$$
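A quick check (not from the slides; $p = 0.25$ and the truncation point are arbitrary): the partial sums of $\sum_{n\ge 1} n p(1-p)^{n-1}$ approach $1/p$.

```python
p = 0.25
partial = sum(n * p * (1 - p) ** (n - 1) for n in range(1, 200))   # truncate the infinite series
print(partial, 1 / p)   # both essentially 4.0
```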
If $X$ is a Poisson random variable with parameter $\lambda$, then
$$
\begin{aligned}
E[X] &= \sum_{i=0}^{\infty} \frac{i \lambda^i e^{-\lambda}}{i!} \\
&= \sum_{i=1}^{\infty} \frac{\lambda^i e^{-\lambda}}{(i-1)!} \\
&= \lambda e^{-\lambda} \sum_{i=1}^{\infty} \frac{\lambda^{i-1}}{(i-1)!} \\
&= \lambda e^{-\lambda} \sum_{k=0}^{\infty} \frac{\lambda^k}{k!} \\
&= \lambda e^{-\lambda} e^{\lambda} \\
&= \lambda,
\end{aligned}
$$
where we use the identity $\sum_{k=0}^{\infty} \frac{\lambda^k}{k!} = e^{\lambda}$.
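A partial-sum check of the Poisson mean (not in the slides; $\lambda = 3.2$ echoes the homework example but is otherwise arbitrary).

```python
from math import exp, factorial

lam = 3.2
partial = sum(i * lam**i * exp(-lam) / factorial(i) for i in range(60))   # truncated series
print(partial, lam)   # both 3.2 to machine precision
```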
If $X$ is uniformly distributed over $(\alpha, \beta)$, then
$$E[X] = \int_{\alpha}^{\beta} \frac{x}{\beta - \alpha}\, dx = \frac{\beta^2 - \alpha^2}{2(\beta - \alpha)} = \frac{\beta + \alpha}{2}.$$
If $X$ is exponentially distributed with parameter $\lambda$, then, integrating by parts,
$$E[X] = \int_{0}^{\infty} x \lambda e^{-\lambda x}\, dx = \left[-x e^{-\lambda x}\right]_0^{\infty} + \int_0^{\infty} e^{-\lambda x}\, dx = \frac{1}{\lambda}.$$
If $X$ is normally distributed with parameters $\mu$ and $\sigma^2$, then
$$E[X] = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} x\, e^{-(x-\mu)^2/2\sigma^2}\, dx.$$
Writing $x$ as $(x - \mu) + \mu$ yields
$$E[X] = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} (x - \mu)\, e^{-(x-\mu)^2/2\sigma^2}\, dx + \mu \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} e^{-(x-\mu)^2/2\sigma^2}\, dx.$$
Letting $y = x - \mu$ leads to
$$E[X] = \frac{1}{\sqrt{2\pi}\,\sigma} \int_{-\infty}^{\infty} y\, e^{-y^2/2\sigma^2}\, dy + \mu \int_{-\infty}^{\infty} f(x)\, dx,$$
where $f(x)$ is the normal density. By symmetry, the first integral must be 0, and so
$$E[X] = \mu \int_{-\infty}^{\infty} f(x)\, dx = \mu.$$
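A numerical cross-check of the three means above (not part of the slides; the parameter values are arbitrary), integrating $x f(x)$ directly with a quadrature routine.

```python
import numpy as np
from scipy.integrate import quad

alpha, beta = 1.0, 5.0   # uniform(alpha, beta): mean should be (alpha + beta) / 2 = 3
lam = 2.0                # exponential(lam):     mean should be 1 / lam = 0.5
mu, sigma = -1.0, 1.5    # normal(mu, sigma^2):  mean should be mu = -1

uniform_mean = quad(lambda x: x / (beta - alpha), alpha, beta)[0]
expon_mean = quad(lambda x: x * lam * np.exp(-lam * x), 0, np.inf)[0]
normal_mean = quad(lambda x: x * np.exp(-(x - mu) ** 2 / (2 * sigma**2))
                   / (np.sqrt(2 * np.pi) * sigma), -np.inf, np.inf)[0]

print(uniform_mean, expon_mean, normal_mean)   # approximately 3.0, 0.5, -1.0
```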
For constants $a$ and $b$, $E[aX + b] = aE[X] + b$.
Homework 5
Jointly Distributed Random Variables
Given two discrete random variables $X$ and $Y$, the joint probability mass function of $X$ and $Y$ is given by
$$p(x, y) = P\{X = x, Y = y\}.$$
The pmf of $X$ may be obtained from $p(x, y)$ by
$$p_X(x) = \sum_{y:\, p(x,y) > 0} p(x, y).$$
Similarly,
$$p_Y(y) = \sum_{x:\, p(x,y) > 0} p(x, y).$$
We say $X$ and $Y$ are jointly continuous if there exists a function $f(x, y)$, defined for all real $x$ and $y$, having the property that for all sets $A$ and $B$ of real numbers
$$P\{X \in A, Y \in B\} = \int_B \int_A f(x, y)\, dx\, dy.$$
The marginal pdfs of $X$ and $Y$ are given by
$$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy \quad\text{and}\quad f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx.$$
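A tiny sketch (the joint pmf values below are made up for illustration) showing how a marginal pmf is obtained by summing the joint pmf over the other variable.

```python
# Hypothetical joint pmf p(x, y); the four probabilities sum to 1.
p = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

p_X, p_Y = {}, {}
for (x, y), prob in p.items():
    p_X[x] = p_X.get(x, 0) + prob   # sum over y for fixed x
    p_Y[y] = p_Y.get(y, 0) + prob   # sum over x for fixed y

print(p_X)   # {0: 0.3, 1: 0.7}, up to floating-point rounding
print(p_Y)   # {0: 0.4, 1: 0.6}
```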
Example: Suppose that $X$ and $Y$ have joint density
$$f(x, y) = \frac{1}{y}\, e^{-(y + x/y)}, \quad 0 < x, y < \infty.$$
Also,
$$
\begin{aligned}
E[XY] &= \int_0^{\infty} \int_0^{\infty} x y\, f(x, y)\, dx\, dy \\
&= \int_0^{\infty} e^{-y} \int_0^{\infty} x\, e^{-x/y}\, dx\, dy \\
&= \int_0^{\infty} y^2 e^{-y}\, dy \\
&= 2,
\end{aligned}
$$
where the inner integral $\int_0^{\infty} x e^{-x/y}\, dx = y^2$ (substitute $u = x/y$).
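A simulation sketch (not in the slides) that checks $E[XY] = 2$: the density factors as $e^{-y}\cdot\frac{1}{y}e^{-x/y}$, so $Y$ is exponential with rate 1 and, given $Y = y$, $X$ is exponential with mean $y$.

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.exponential(scale=1.0, size=1_000_000)   # Y ~ exponential, rate 1
x = rng.exponential(scale=y)                     # X | Y = y ~ exponential, mean y

print(np.mean(x * y))   # close to 2
```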
Properties of Covariance
We can obtain a useful expression for the variance of the sum of random
variables as follows:
$$
\begin{aligned}
\operatorname{Var}\left(\sum_{i=1}^{n} X_i\right) &= \operatorname{Cov}\left(\sum_{i=1}^{n} X_i, \sum_{j=1}^{n} X_j\right) \\
&= \sum_{i=1}^{n} \sum_{j=1}^{n} \operatorname{Cov}(X_i, X_j) \\
&= \sum_{i=1}^{n} \operatorname{Cov}(X_i, X_i) + \sum_{i=1}^{n} \sum_{j \ne i} \operatorname{Cov}(X_i, X_j) \\
&= \sum_{i=1}^{n} \operatorname{Var}(X_i) + 2 \sum_{i=1}^{n} \sum_{j < i} \operatorname{Cov}(X_i, X_j).
\end{aligned}
$$
In particular, if $X_1, \ldots, X_n$ are independent with common variance $\sigma^2$, the covariance terms are $0$ and $\operatorname{Var}\left(\sum_{i=1}^{n} X_i\right) = n\sigma^2 + 2 \cdot 0 = n\sigma^2$.
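A simulation sketch (not in the slides; the distribution and $n$ are arbitrary) checking that for independent $X_i$ with common variance $\sigma^2$ the covariance terms vanish and the variance of the sum is $n\sigma^2$.

```python
import numpy as np

rng = np.random.default_rng(3)
n, sigma = 5, 2.0

samples = rng.normal(0.0, sigma, size=(200_000, n))   # each row holds independent X_1, ..., X_n
sums = samples.sum(axis=1)

print(sums.var(), n * sigma**2)   # both close to 20.0
```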
Suppose that $Y_1 = g_1(X_1, X_2)$ and $Y_2 = g_2(X_1, X_2)$, where the equations $y_1 = g_1(x_1, x_2)$ and $y_2 = g_2(x_1, x_2)$ can be uniquely solved for $x_1$ and $x_2$, and $g_1$ and $g_2$ have continuous partial derivatives with Jacobian $J(x_1, x_2) \ne 0$. Under these two conditions it can be shown that the random variables $Y_1$ and $Y_2$ are jointly continuous with joint density function given by
$$f_{Y_1, Y_2}(y_1, y_2) = f_{X_1, X_2}(x_1, x_2)\, |J(x_1, x_2)|^{-1}, \qquad (2)$$
where $x_1$ and $x_2$ are the solutions of the equations above.
For example, if $X$ and $Y$ are independent gamma random variables with parameters $(\alpha, \lambda)$ and $(\beta, \lambda)$, respectively, their joint density is
$$f_{X,Y}(x, y) = \frac{\lambda^{\alpha+\beta}\, e^{-\lambda(x+y)}\, x^{\alpha-1} y^{\beta-1}}{\Gamma(\alpha)\Gamma(\beta)}, \quad 0 < x, y < \infty,$$
and applying (2) to the transformation $U = X + Y$, $V = X/(X + Y)$ shows that $V$ has the beta density
$$f_V(v) = \frac{\Gamma(\alpha + \beta)}{\Gamma(\alpha)\Gamma(\beta)}\, v^{\alpha-1}(1 - v)^{\beta-1}, \quad 0 < v < 1. \qquad (3)$$
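A simulation sketch (not in the slides; $\alpha$, $\beta$, $\lambda$ are arbitrary, and $V = X/(X+Y)$ is the transformation leading to (3)) comparing the sample moments of $X/(X+Y)$ with those of a beta$(\alpha, \beta)$ random variable.

```python
import numpy as np

a, b, lam = 2.0, 3.0, 1.5
rng = np.random.default_rng(4)

x = rng.gamma(shape=a, scale=1 / lam, size=500_000)   # gamma(a, lam)
y = rng.gamma(shape=b, scale=1 / lam, size=500_000)   # gamma(b, lam), independent of x
v = x / (x + y)

print(v.mean(), a / (a + b))                            # both close to 0.4
print(v.var(), a * b / ((a + b) ** 2 * (a + b + 1)))    # both close to 0.04
```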