Lec 2
Conditional Probability
Motivation
If A and B are events with P(B) > 0, then the conditional probability of A given B is defined as

P(A|B) = P(A ∩ B) / P(B)
▶ Here A is the event whose uncertainty we want to update, and B is the evidence we observe (or want to treat as given). We call P(A) the prior probability of A and P(A|B) the posterior probability of A (“prior” means before updating based on the evidence, and “posterior” means after updating based on the evidence).
▶ When we write P(A|B), it does not mean that A|B is an event and we’re taking its probability; A|B is not an event. Rather, P(·|B) is a probability function which assigns probabilities in accordance with the knowledge that B has occurred, and P(·) is a different probability function which assigns probabilities without regard for whether B has occurred. When we take an event A and plug it into the P(·) function, we get a number, P(A); when we plug it into the P(·|B) function, we get another number, P(A|B), which incorporates the information (if any) provided by knowing that B occurred.
▶ For any event A, P(A|A) = P(A ∩ A)/P(A) = 1.
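As a small illustration (an example added here, assuming the naive definition of probability on a fair die): let A be “the roll is a 6” and B be “the roll is even”; then P(A|B) = (1/6)/(1/2) = 1/3. A quick check in Python:

    from fractions import Fraction

    S = set(range(1, 7))                    # sample space: one fair die roll
    P = lambda E: Fraction(len(E), len(S))  # naive probability: |E| / |S|

    A = {6}                                 # the roll is a 6
    B = {2, 4, 6}                           # the roll is even

    print(P(A & B) / P(B))                  # P(A|B) = P(A ∩ B)/P(B) = 1/3

Conditioning on B shrinks the sample space to the even outcomes, which is why P(A) is updated from 1/6 to 1/3.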
Definition and Intuition (contd.)
Probability of the intersection of n events (the chain rule):

P(A1, ..., An) = P(A1)P(A2|A1)P(A3|A1, A2) · · · P(An|A1, ..., An−1)
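For instance (a standard illustration, added here), the chain rule gives the probability that the first three cards dealt from a well-shuffled 52-card deck are all aces: P(A1)P(A2|A1)P(A3|A1, A2) = (4/52)(3/51)(2/50) = 1/5525. Verified exactly:

    from fractions import Fraction

    # Chain rule: successive conditional probabilities of drawing an ace
    # when dealing without replacement from a 52-card deck.
    p = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50)
    print(p)  # 1/5525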
Theorem 3.
Bayes’ rule:

P(A|B) = P(B|A)P(A) / P(B)
[Figure 2: the sample space S partitioned into slices A1, ..., An, with the event B cutting across each slice]
The law of total probability tells us that to get the unconditional probability of B, we can divide the sample space into disjoint slices Ai, find the conditional probability of B within each of the slices, then take a weighted sum of the conditional probabilities, where the weights are the probabilities P(Ai).
Law of total probability (contd.)
Theorem 4.
Law of total probability (LOTP): Let A1, ..., An be a partition of the sample space S (i.e., the Ai are disjoint events and their union is S), with P(Ai) > 0 for all i. Then:

P(B) = ∑_{i=1}^{n} P(B|Ai)P(Ai)
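To see Bayes’ rule and LOTP working together, here is a sketch with illustrative numbers for a diagnostic test (the prevalence and accuracy values below are assumptions made up for this example): a disease D has prevalence 1%, and the test is 95% accurate for both diseased and healthy patients. LOTP over the partition {D, not D} gives P(positive), and Bayes’ rule then gives P(D | positive):

    from fractions import Fraction

    prior = Fraction(1, 100)    # assumed P(D): prevalence of the disease
    sens  = Fraction(95, 100)   # assumed P(+|D): test sensitivity
    spec  = Fraction(95, 100)   # assumed P(-|not D): test specificity

    # LOTP with the partition {D, not D}: P(+) = P(+|D)P(D) + P(+|not D)P(not D)
    p_pos = sens * prior + (1 - spec) * (1 - prior)

    # Bayes' rule: P(D|+) = P(+|D)P(D) / P(+)
    posterior = sens * prior / p_pos
    print(posterior, float(posterior))  # 19/118 ≈ 0.161

Even after a positive result the disease is still unlikely, because the prior P(D) is so small; the LOTP denominator is dominated by false positives from the healthy slice of the partition.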
Theorem 5.
Bayes’ rule with extra conditioning: Provided that P(A ∩ E) > 0 and P(B ∩ E) > 0, we have

P(A|B, E) = P(B|A, E)P(A|E) / P(B|E)
Theorem 6.
LOTP with extra conditioning: Let A1, ..., An be a partition of S. Provided that P(Ai ∩ E) > 0 for all i, we have

P(B|E) = ∑_{i=1}^{n} P(B|Ai, E)P(Ai|E)
Conditional probabilities are probabilities (contd.)
▶ You have one fair coin, and one biased coin which lands Heads with probability 3/4. You pick one of the coins at random and flip it three times. Suppose that we have now seen our chosen coin land Heads three times. If we toss the coin a fourth time, what is the probability that it will land Heads once more?
Solved in class
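For reference, one way to carry out the computation: let F be the event that the fair coin was chosen and Hi the event that the i-th flip lands Heads. By LOTP, P(H1 ∩ H2 ∩ H3) = (1/2)³(1/2) + (3/4)³(1/2) = 35/128. By Bayes’ rule, P(F | H1 ∩ H2 ∩ H3) = ((1/2)³(1/2)) / (35/128) = 8/35, so the biased coin has posterior probability 27/35. Since the flips are conditionally independent given the choice of coin, LOTP with extra conditioning gives P(H4 | H1 ∩ H2 ∩ H3) = (1/2)(8/35) + (3/4)(27/35) = 97/140 ≈ 0.69. The same computation in Python:

    from fractions import Fraction

    fair, biased = Fraction(1, 2), Fraction(3, 4)   # P(Heads) for each coin
    half = Fraction(1, 2)                           # P(picking either coin)

    # LOTP: P(HHH) = P(HHH|fair)P(fair) + P(HHH|biased)P(biased)
    p_hhh = fair**3 * half + biased**3 * half       # 35/128

    # Bayes' rule: posterior probability of each coin given HHH
    p_fair   = fair**3 * half / p_hhh               # 8/35
    p_biased = biased**3 * half / p_hhh             # 27/35

    # LOTP with extra conditioning (flips independent given the coin):
    p_h4 = fair * p_fair + biased * p_biased
    print(p_h4, float(p_h4))                        # 97/140 ≈ 0.693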
Conditional probabilities are probabilities (contd.)
Conditioning on two events B and C is the same as conditioning on the single event B ∩ C, so

P(A|B, C) = P(A, B, C) / P(B, C)

and Bayes’ rule with extra conditioning (Theorem 5, with E = C) becomes

P(A|B, C) = P(B|A, C)P(A|C) / P(B|C)
Events A and B are independent if

P(A ∩ B) = P(A)P(B).

If P(B) > 0, this is equivalent to

P(A|B) = P(A),

i.e., learning that B occurred gives no information about whether A occurred. Events A, B, and C are independent if all four of the following conditions hold:

P(A ∩ B) = P(A)P(B),
P(A ∩ C) = P(A)P(C),
P(B ∩ C) = P(B)P(C),
P(A ∩ B ∩ C) = P(A)P(B)P(C).
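The fourth condition is not implied by the first three. A standard counterexample (added here for illustration): toss two fair coins, and let A = “first toss lands Heads”, B = “second toss lands Heads”, C = “the two tosses match”. Each pair is independent, but P(A ∩ B ∩ C) = 1/4 while P(A)P(B)P(C) = 1/8:

    from fractions import Fraction
    from itertools import product

    S = set(product("HT", repeat=2))          # two fair coin tosses, 4 equally likely outcomes
    P = lambda E: Fraction(len(E), len(S))

    A = {s for s in S if s[0] == "H"}         # first toss Heads
    B = {s for s in S if s[1] == "H"}         # second toss Heads
    C = {s for s in S if s[0] == s[1]}        # the two tosses match

    print(P(A & B) == P(A) * P(B))            # True: pairwise independent
    print(P(A & C) == P(A) * P(C))            # True
    print(P(B & C) == P(B) * P(C))            # True
    print(P(A & B & C) == P(A) * P(B) * P(C)) # False: 1/4 != 1/8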