Introduction to Probability Theory
Rong Jin
Outline
Definition of Probability
Axiom 1: Pr(A) ≥ 0
Axiom 2: Pr(S) = 1
Axiom 3: For every sequence of disjoint events A_1, A_2, ...:
Pr(∪_i A_i) = Σ_i Pr(A_i)
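The axioms can be checked mechanically on a finite sample space. A minimal Python sketch (not from the slides), using a fair die as the space:

```python
from fractions import Fraction

# Hypothetical finite probability space: a fair six-sided die.
S = {1, 2, 3, 4, 5, 6}                      # sample space
p = {s: Fraction(1, 6) for s in S}          # equal weight on each outcome

def Pr(A):
    """Probability of an event A (a subset of S)."""
    return sum(p[s] for s in A)

# Axiom 1: Pr(A) >= 0 for every event
assert all(Pr({s}) >= 0 for s in S)
# Axiom 2: Pr(S) = 1
assert Pr(S) == 1
# Axiom 3 (finite case): additivity over disjoint events
A, B = {1, 2}, {5, 6}                       # disjoint events
assert Pr(A | B) == Pr(A) + Pr(B)
```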
Joint Probability
The joint probability Pr(AB) (also written Pr(A, B) or Pr(A ∩ B)) is the probability that both A and B occur.
Independence
Events A_1, ..., A_n are independent if
Pr(∩_i A_i) = Π_i Pr(A_i)
Example I:
A = {A patient is a woman}
B = {Drug fails}

          Women   Men
Success     200   1800
Failure    1800    200

Is A independent of B?
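A quick check with the counts above (a sketch; the dictionary keys are shorthand for the table's rows and columns):

```python
# Counts from the table: rows = drug outcome, columns = patient sex.
counts = {("success", "women"): 200,  ("success", "men"): 1800,
          ("failure", "women"): 1800, ("failure", "men"): 200}
n = sum(counts.values())                                             # 4000 patients

Pr_A  = sum(v for (o, s), v in counts.items() if s == "women") / n   # 0.5
Pr_B  = sum(v for (o, s), v in counts.items() if o == "failure") / n # 0.5
Pr_AB = counts[("failure", "women")] / n                             # 0.45

print(Pr_AB == Pr_A * Pr_B)   # False: 0.45 != 0.25, so A and B are dependent
```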
Independence
Example II: Flip a fair coin twice.
A = {HT}, B = {TH}
Is event A independent of event B?

Disjoint vs. Independence
A and B are disjoint, so Pr(AB) = 0, while Pr(A)Pr(B) = 1/16 > 0. Disjoint events with positive probability are never independent.
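The same check for Example II, enumerating the four equally likely outcomes of two flips:

```python
from fractions import Fraction

# Two fair coin flips: four equally likely outcomes.
S = {"HH", "HT", "TH", "TT"}

def Pr(A):
    return Fraction(len(A), len(S))

A, B = {"HT"}, {"TH"}
print(Pr(A & B))        # 0    -- A and B are disjoint
print(Pr(A) * Pr(B))    # 1/16 -- independence would require Pr(AB) = 1/16
```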
Conditioning
Pr(B|A) = Pr(AB) / Pr(A)

A = {Patient is a woman}
B = {Drug fails}

          Women   Men
Success     200   1800
Failure    1800    200

Pr(B|A) = ?
Pr(A|B) = ?
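Working the two questions out from the table (a sketch using the counts above):

```python
# Counts from the table above.
n_total = 200 + 1800 + 1800 + 200      # 4000 patients
Pr_A  = (200 + 1800) / n_total         # Pr(woman)           = 0.5
Pr_B  = (1800 + 200) / n_total         # Pr(drug fails)      = 0.5
Pr_AB = 1800 / n_total                 # Pr(woman AND fails) = 0.45

print(Pr_AB / Pr_A)    # Pr(B|A) = 0.9: the drug fails for 90% of women
print(Pr_AB / Pr_B)    # Pr(A|B) = 0.9: 90% of the failures are women
```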
A = {Using Drug I}
B = {Using Drug II}
C = {Drug succeeds}

          Drug I   Drug II
Success      219      1010
Failure     1801      1190

Pr(C|A) ~ 10%
Pr(C|B) ~ 50%
Female Patient
A = {Using Drug I}, B = {Using Drug II}, C = {Drug succeeds}
Pr(C|A) ~ 20%
Pr(C|B) ~ 5%

Male Patient
A = {Using Drug I}, B = {Using Drug II}, C = {Drug succeeds}
Pr(C|A) ~ 100%
Pr(C|B) ~ 50%

For both female and male patients, Drug I is better than Drug II, even though Drug II looks better on the pooled table above (Simpson's paradox).
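The per-sex counts behind these percentages are not given above, so the sketch below uses hypothetical counts chosen only to reproduce the same reversal:

```python
# Hypothetical counts reproducing the reversal pattern on the slides;
# the slides' exact per-sex counts are not recoverable from the tables shown.
groups = {
    # sex: {drug: (successes, total)}
    "female": {"I": (200, 1000), "II": (50, 1000)},   # 20%  vs 5%  -> Drug I better
    "male":   {"I": (100, 100),  "II": (500, 1000)},  # 100% vs 50% -> Drug I better
}

for sex, data in groups.items():
    for drug, (s, t) in data.items():
        print(f"{sex}, Drug {drug}: Pr(C) = {s / t:.0%}")

# Pooled over both sexes the comparison flips (Simpson's paradox):
for drug in ("I", "II"):
    s = sum(groups[sex][drug][0] for sex in groups)
    t = sum(groups[sex][drug][1] for sex in groups)
    print(f"pooled, Drug {drug}: Pr(C) = {s / t:.1%}")
# Drug I: 300/1100 ~ 27.3%;  Drug II: 550/2000 = 27.5% -> II looks better pooled
```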
Conditional Independence
Events A and B are conditionally independent given C if Pr(AB|C) = Pr(A|C) Pr(B|C).
Outline
Bayes Rule
Given two events A and B, and suppose that Pr(A) > 0. Then
Pr(B|A) = Pr(A|B) Pr(B) / Pr(A)

Example:
R: It is a rainy day
W: The grass is wet
Pr(R) = 0.8

         R     ¬R
  W     0.7   0.4
  ¬W    0.3   0.6
(each column is the conditional distribution of W given that value of R)

Pr(R|W) = ?
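Working the question out with Bayes rule and the numbers above:

```python
# Numbers from the slide.
Pr_R = 0.8                                 # prior: it is a rainy day
Pr_W_given_R, Pr_W_given_notR = 0.7, 0.4

# Law of total probability for the evidence Pr(W):
Pr_W = Pr_W_given_R * Pr_R + Pr_W_given_notR * (1 - Pr_R)   # 0.56 + 0.08 = 0.64

# Bayes rule:
Pr_R_given_W = Pr_W_given_R * Pr_R / Pr_W
print(Pr_R_given_W)                        # 0.875
```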
Bayes Rule
R: It rains
W: The grass is wet
Information: Pr(W|R), from cause R to effect W
Inference: Pr(R|W), from observed effect W back to cause R
Bayes Rule
Hypothesis H, Evidence E
Information: the likelihood Pr(E|H)
Inference: the posterior Pr(H|E)

Pr(H|E) = Pr(E|H) Pr(H) / Pr(E)
(posterior = likelihood × prior / evidence)
Bayes Rule (general form)
Let B_1, ..., B_k be a partition of the sample space S: B_i ∩ B_j = ∅ for i ≠ j, and ∪_i B_i = S. Then

Pr(B_i|A) = Pr(A|B_i) Pr(B_i) / Pr(A)
          = Pr(A|B_i) Pr(B_i) / Σ_{j=1}^k Pr(AB_j)
          = Pr(A|B_i) Pr(B_i) / Σ_{j=1}^k Pr(B_j) Pr(A|B_j)
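A generic sketch of this general form, with prior and likelihood arrays standing in for Pr(B_j) and Pr(A|B_j) (the function name and arguments are mine, not the slides'):

```python
def bayes_over_partition(prior, likelihood, i):
    """prior[j] = Pr(B_j); likelihood[j] = Pr(A | B_j); returns Pr(B_i | A)."""
    evidence = sum(p * l for p, l in zip(prior, likelihood))   # Pr(A)
    return likelihood[i] * prior[i] / evidence

# Example with the rain numbers: partition {R, not-R}, A = W.
print(bayes_over_partition(prior=[0.8, 0.2], likelihood=[0.7, 0.4], i=0))  # 0.875
```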
Conditional Independence: Example
R: It rains
W: The grass is wet
Pr(R) = 0.8
U and W are conditionally independent given R (and given ¬R):
Pr(UW|R) = Pr(U|R) Pr(W|R)
Pr(UW|¬R) = Pr(U|¬R) Pr(W|¬R)

         R     ¬R             R     ¬R
  W     0.7   0.4      U     0.9   0.2
  ¬W    0.3   0.6      ¬U    0.1   0.8

Pr(U|W) = ?
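Putting conditional independence and Bayes rule together to answer the question (numbers from the tables above):

```python
# Numbers from the slide; U and W are conditionally independent given R.
Pr_R = 0.8
Pr_W_R, Pr_W_nR = 0.7, 0.4       # Pr(W|R), Pr(W|not-R)
Pr_U_R, Pr_U_nR = 0.9, 0.2       # Pr(U|R), Pr(U|not-R)

# Marginalize out R, using Pr(UW|R) = Pr(U|R) Pr(W|R) in each branch:
Pr_UW = Pr_R * Pr_U_R * Pr_W_R + (1 - Pr_R) * Pr_U_nR * Pr_W_nR   # 0.504 + 0.016
Pr_W  = Pr_R * Pr_W_R + (1 - Pr_R) * Pr_W_nR                      # 0.64

print(Pr_UW / Pr_W)              # Pr(U|W) = 0.52 / 0.64 = 0.8125
```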
Outline
Probability distribution of a random variable X:
Discrete case: Pr(X = x) = p(x)
Continuous case: Pr(a ≤ X ≤ b) = ∫_a^b p(x) dx
Expectation
Discrete case (N equally likely values x_1, ..., x_N): E[X] = (1/N) Σ_{i=1}^N x_i
Continuous case: E[X] = ∫ x p(x) dx
Expectation: Example
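As an illustrative stand-in (a fair die; assumed here, not necessarily the slide's own example), both the exact formula and its sampling estimate:

```python
import random

# Expectation of a fair die roll, computed two ways.
outcomes = [1, 2, 3, 4, 5, 6]

exact = sum(outcomes) / len(outcomes)                 # (1/N) sum_i x_i = 3.5
sample = [random.choice(outcomes) for _ in range(100_000)]
estimate = sum(sample) / len(sample)                  # law of large numbers: ~3.5

print(exact, estimate)
```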
Variance
Var(X) = E[(X − E[X])²] = E[X²] − (E[X])²
Bernoulli Distribution
Pr(X = 1) = p, Pr(X = 0) = 1 − p
E[X] = p, Var(X) = p(1 − p)
Binomial Distribution
Pr(X = x) = p(x) = C(n, x) p^x (1 − p)^(n−x) for x = 0, 1, 2, ..., n; 0 otherwise
(where C(n, x) = n! / (x! (n − x)!))
E[X] = np, Var(X) = np(1 − p)
Poisson Distribution
Pr(X = x) = p(x) = (λ^x / x!) e^(−λ) for x = 0, 1, 2, ...; 0 otherwise
E[X] = λ, Var(X) = λ
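A sampling check of the stated means and variances, with arbitrary parameter choices (n = 20, p = 0.3, λ = 4 are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, lam = 20, 0.3, 4.0

# Empirical check of the stated means and variances.
for name, draws, mean, var in [
    ("Bernoulli", rng.binomial(1, p, 100_000), p,     p * (1 - p)),
    ("Binomial",  rng.binomial(n, p, 100_000), n * p, n * p * (1 - p)),
    ("Poisson",   rng.poisson(lam, 100_000),   lam,   lam),
]:
    print(f"{name}: mean {draws.mean():.3f} (expect {mean}), "
          f"var {draws.var():.3f} (expect {var})")
```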
Normal (Gaussian) Distribution
X ~ N(μ, σ²), with density

p(x) = (1 / √(2πσ²)) exp(−(x − μ)² / (2σ²))

Pr(a ≤ X ≤ b) = ∫_a^b p(x) dx = ∫_a^b (1 / √(2πσ²)) exp(−(x − μ)² / (2σ²)) dx

E[X] = μ, Var(X) = σ²

If X_1 ~ N(μ_1, σ_1²) and X_2 ~ N(μ_2, σ_2²), what is the distribution of X = X_1 + X_2?
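For independent X_1 and X_2 the answer is X ~ N(μ_1 + μ_2, σ_1² + σ_2²): the variances add, not the standard deviations. A quick empirical check (independence assumed, parameters chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(0)
mu1, s1, mu2, s2 = 1.0, 2.0, -3.0, 0.5

x1 = rng.normal(mu1, s1, 1_000_000)
x2 = rng.normal(mu2, s2, 1_000_000)   # drawn independently of x1
x = x1 + x2

# For independent normals: X ~ N(mu1 + mu2, s1^2 + s2^2)
print(x.mean(), mu1 + mu2)            # ~ -2.0
print(x.var(),  s1**2 + s2**2)        # ~ 4.25
```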