Lecture 02
Multiplication rule: P(A ∩ B) = P(A) × P(B|A) = P(B) × P(A|B)
Problem 1
Q. Two events A and B are such that P(A)=0.7, P(B)=0.4 and
P(A|B)=0.3. Determine the probability that neither A nor B
occurs.
Soln:
P(A ∩ B) = P(B) × P(A|B) = 0.4 × 0.3 = 0.12
P(A ∪ B) = P(A) + P(B) − P(A ∩ B) = 0.7 + 0.4 − 0.12 = 0.98
P(neither A nor B) = 1 − P(A ∪ B) = 1 − 0.98 = 0.02
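A quick numeric check of this solution (a minimal Python sketch; the function name is my own):

def p_neither(p_a, p_b, p_a_given_b):
    # Multiplication rule: P(A and B) = P(B) * P(A|B)
    p_ab = p_b * p_a_given_b
    # Inclusion-exclusion: P(A or B) = P(A) + P(B) - P(A and B)
    p_a_or_b = p_a + p_b - p_ab
    # Complement rule: P(neither) = 1 - P(A or B)
    return 1.0 - p_a_or_b

print(p_neither(0.7, 0.4, 0.3))  # ≈ 0.02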
If A and B are independent, then
P(A|B) = P(A) and P(B|A) = P(B)
Total probability theorem: if the events B_1, B_2, …, B_m partition the sample space S, then
P(A) = Σ_{i=1}^{m} P(B_i) × P(A|B_i)
Hence, if S = B ∪ B^c, then
P(A) = P(B) × P(A|B) + P(B^c) × P(A|B^c)
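A minimal sketch of the two-event case of the theorem (the numbers are made up for illustration):

def total_probability(p_b, p_a_given_b, p_a_given_not_b):
    # P(A) = P(B) * P(A|B) + P(B^c) * P(A|B^c)
    return p_b * p_a_given_b + (1.0 - p_b) * p_a_given_not_b

print(total_probability(0.4, 0.3, 0.5))  # 0.4*0.3 + 0.6*0.5 ≈ 0.42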
Bayesian Statistics
• Bayes' theorem has led to the development of an approach to
statistics known as Bayesian statistics.
• Judgment may be combined with empirical evidence.
• Consider a random experiment with two main events, E and E^c
(not E), whose probabilities are assigned by judgment. The
following conditional probabilities are available from past
experience with similar random experiments: P(R|E) and P(R|E^c),
where R is a particular test result.
• The conditional probability for the main event E can be
calculated using the following:
P(E|R) = P(E) × P(R|E) / [P(E) × P(R|E) + P(E^c) × P(R|E^c)]
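A minimal Python sketch of this update rule (the function and argument names are my own, not from the lecture):

def bayes_update(prior_e, p_r_given_e, p_r_given_not_e):
    # Posterior P(E|R) from prior P(E) and likelihoods P(R|E), P(R|E^c)
    numerator = prior_e * p_r_given_e
    denominator = numerator + (1.0 - prior_e) * p_r_given_not_e
    return numerator / denominator

print(bayes_update(0.01, 0.9, 0.1))  # illustrative numbers: ≈ 0.083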
Bayesian Statistics Contd…
• The sources of probabilities employed are usually partly
judgmental and partly empirical. Bayes' theorem shows how
to combine subjective and objective probabilities
meaningfully. Its general form may be written as follows:
P(B_j|A) = P(B_j) × P(A|B_j) / Σ_{i=1}^{m} P(B_i) × P(A|B_i) = P(A ∩ B_j) / P(A)
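The general form can be sketched the same way over a partition B_1, …, B_m (the list-based interface is my own choice):

def bayes_posterior(priors, likelihoods):
    # priors[i] = P(B_i); likelihoods[i] = P(A|B_i)
    joints = [p * l for p, l in zip(priors, likelihoods)]  # P(A and B_i)
    p_a = sum(joints)                                      # total probability P(A)
    return [j / p_a for j in joints]                       # posteriors P(B_i|A)

# Two-event partition with illustrative numbers:
print(bayes_posterior([0.3, 0.7], [0.9, 0.2]))  # ≈ [0.659, 0.341]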
P(E|A) = P(E) × P(A|E) / [P(E) × P(A|E) + P(E^c) × P(A|E^c)]
       = (10^-5 × 0.1) / (10^-5 × 0.1 + (1 − 10^-5) × 0.001)
       ≈ 0.001
Contd…..
(b) Also determine the probability of a major earthquake, given
that both premonitory events A and B are observed at the
same time.
Assume that the premonitory events A and B are conditionally independent given E (and given E^c).
P(A ∩ B | E) = P(A|E) × P(B|E) = 0.1 × 0.1 = 0.01
P(A ∩ B | E^c) = P(A|E^c) × P(B|E^c) = 0.001 × 0.001 = 10^-6
Applying Bayes' Theorem
P(E | A ∩ B) = P(E) × P(A ∩ B | E) / [P(E) × P(A ∩ B | E) + P(E^c) × P(A ∩ B | E^c)]
             = (10^-5 × 0.01) / (10^-5 × 0.01 + (1 − 10^-5) × 10^-6)
             ≈ 0.09
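Reusing the bayes_update sketch from above, both answers can be checked numerically (values taken from the example):

p_e = 1e-5  # prior probability of a major earthquake, P(E)
print(bayes_update(p_e, 0.1, 0.001))                # part (a): ≈ 0.001
print(bayes_update(p_e, 0.1 * 0.1, 0.001 * 0.001))  # part (b): ≈ 0.09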
Random Variable
=> The value of a quantity (parameter) to be measured (observed)
in a random experiment is unknown in advance; it is therefore
treated as a variable, known as a random variable.
Observed Value
=>Result of the measurement (observation)
Probability Distributions
• Show how the total probability of a random variable is
allocated among its possible values, using the:
– Probability mass function, PMF, for discrete Random
Variables, or
– Probability density function, PDF, for continuous Random
Variables.
p_X(x) = P(X = x)
[Figure: example pmf p_X(x) plotted over x = 0, 1, 2, 3; probability-axis tick at 1/8.]
• Cumulative Distribution Function
F(x) = P(X ≤ x) = Σ_{all x_i ≤ x} p(x_i)
• Expected Value
E[X] = Σ_x x × p(x)
[Figure: side-by-side plots of the pmf p(x) and the cdf F(x).]
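A small sketch tying the pmf, cdf, and expected value together; the number of heads in three fair coin tosses is used as the example variable (an assumption suggested by the x = 0, 1, 2, 3 axis and the 1/8 tick above):

from itertools import accumulate

# pmf of X = number of heads in three fair coin tosses
pmf = {0: 1/8, 1: 3/8, 2: 3/8, 3: 1/8}

# cdf: F(x) = P(X <= x), the running sum of the pmf
cdf = dict(zip(pmf, accumulate(pmf.values())))

# expected value: E[X] = sum of x * p(x)
expected = sum(x * p for x, p in pmf.items())

print(cdf)       # {0: 0.125, 1: 0.5, 2: 0.875, 3: 1.0}
print(expected)  # 1.5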
Continuous Probability Distribution
• A continuous random variable has an uncountably infinite set of
possible values in a given interval
• Distributions of continuous data are characterized by probability density
functions f(x)
• For RVs that map to the integers or the real numbers, the
cumulative distribution function (CDF) is a useful alternative representation
∫_{−∞}^{∞} f(x) dx = 1
[Figure: density curve f(x); the total area under the curve equals 1.]
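A numeric sanity check of the normalization condition, using an exponential density f(x) = λe^(−λx) as a stand-in (any valid density would do):

import math

lam = 2.0
f = lambda x: lam * math.exp(-lam * x)  # exponential pdf, supported on [0, inf)

# Midpoint Riemann sum over [0, 20]; the tail beyond 20 is negligible here
n, a, b = 200_000, 0.0, 20.0
dx = (b - a) / n
area = sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx
print(area)  # ≈ 1.0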
Jointly Distributed Random Variable
Direct applications of probability may involve two or more
random variables.
=> Example: waiting time in a queue may be considered as a
composite of 'arrival time' and 'service time'.
P(X = x_i) = Σ_j P(X = x_i, Y = y_j) = Σ_j p(x_i, y_j)
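A brief sketch of this marginalization for a small joint pmf (the table values are made up):

# Joint pmf p(x, y) stored as a dict keyed by (x, y) pairs
joint = {(0, 0): 0.125, (0, 1): 0.375,
         (1, 0): 0.25, (1, 1): 0.25}

# Marginal of X: P(X = x_i) = sum over j of p(x_i, y_j)
marginal_x = {}
for (x, y), p in joint.items():
    marginal_x[x] = marginal_x.get(x, 0.0) + p

print(marginal_x)  # {0: 0.5, 1: 0.5}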