Lecture 02

The document covers various concepts in probability theory, including conditional probability, independent events, joint probability, and Bayes' theorem. It provides examples and problems to illustrate how to calculate probabilities in different scenarios, such as detecting diseases and predicting events based on prior knowledge. Additionally, it discusses random variables and probability distributions, both discrete and continuous, along with their associated functions.

CE 207

Applied Mathematics for Engineers


Dr. Md. Hadiuzzaman
Professor, Dept. of CE, BUET
CE Building, Room No.: 544
e-mail: mhadiuzzaman@ce.buet.ac.bd
Conditional Probability
The probability that event B occurs, given the information that event A occurs, is expressed as P(B|A) and is given by:

P A  B  Probability that both events A and B occur


P B | A  
Also, P  A Probability of event A

P  A  B   P  A  P B | A  P B  A  P B   P  A | B 
Problem 1
Q. Two events A and B are such that P(A)=0.7, P(B)=0.4 and
P(A|B)=0.3. Determine the probability that neither A nor B
occurs.
Soln:
P(A ∩ B) = P(B) × P(A|B) = 0.4 × 0.3 = 0.12

P(A ∪ B) = P(A) + P(B) - P(A ∩ B) = 0.7 + 0.4 - 0.12 = 0.98

P(A ∪ B) is the probability that A occurs, B occurs, or both occur.

Hence, the probability that neither A nor B occurs is
= 1 - P(A ∪ B) = 1 - 0.98 = 0.02
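A quick numeric check of this solution (a minimal Python sketch; the variable names are mine):

```python
# Given quantities from the problem statement
p_a, p_b, p_a_given_b = 0.7, 0.4, 0.3

p_a_and_b = p_b * p_a_given_b        # P(A ∩ B) = P(B) · P(A|B)
p_a_or_b = p_a + p_b - p_a_and_b     # inclusion-exclusion for P(A ∪ B)
p_neither = 1 - p_a_or_b             # complement of "A or B occurs"

print(p_a_and_b, p_a_or_b, p_neither)  # ≈ 0.12  0.98  0.02 (up to float rounding)
```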
Independent Events
If A and B are two statistically independent events with non-
zero probabilities then:

P(A|B) = P(A)
P(B|A) = P(B)

P(A ∩ B) = P(A) × P(B)

(Note: Physically independent events are always statistically independent.)
Joint Probability
P(E and F) = P(E ∩ F) = P(E) × P(F)
This equation gives the joint probability that the outcome of the first experiment is contained in E and the outcome of the second experiment is contained in F, provided that E and F are independent. By independent events, we mean that the outcome of one event does not influence the outcome of the other event.
P(E ∩ F) means the probability that both E and F occur.
The above multiplication law for independent events extends to any number of independent events, as the sketch below illustrates.
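Since the law is just a product over the individual event probabilities, it is one line of code. A minimal sketch (the three probabilities are made up for illustration):

```python
from math import prod

# Hypothetical probabilities of three mutually independent events
event_probs = [0.9, 0.8, 0.5]

# P(E1 ∩ E2 ∩ E3) = P(E1) · P(E2) · P(E3)
print(prod(event_probs))  # ≈ 0.36
```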
Bayes’ Theorem
B1, B2, …., Bm are m mutually exclusive and exhaustive events
in Sample Space S.
i.e., S = B1 ∪ B2 ∪ … ∪ Bm
Let A be some other event. Then

P(A) = Σ_{i=1}^{m} P(A ∩ Bi)

In other words,

P(A) = Σ_{i=1}^{m} P(Bi) × P(A|Bi)

Hence, if S = B ∪ Bᶜ, then

P(A) = P(B) × P(A|B) + P(Bᶜ) × P(A|Bᶜ)
Bayesian Statistics
• Bayes' theorem has led to the development of an approach in statistics known as Bayesian Statistics.
• Judgment may be combined with empirical evidence.
• Consider a random experiment with two main events, E and Eᶜ (not E), whose probabilities are assumed by judgment. The following conditional probabilities are available from past experience with similar random experiments: P(R|E) and P(R|Eᶜ), where R is a particular test result.
• The conditional probability for the main event E can be calculated using the following:
P E   P R | E 
P E | R  
  
P E   P R | E   P E c  P R | E c 
Bayesian Statistics Contd…
• The sources of probabilities employed are usually partly
judgmental and partly empirical. Bayes theorem shows how
to combine subjective and objective probabilities
meaningfully. Its general form may be written as follows:

P B j  P A | B j  P A  B j 
P B j | A  
 PB  P A | B 
m
P  A
i 1 i i

• There are m alternative previous events B1, B2,…,Bm that could


have happened which are mutually exclusive and exhaustive.
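Written as code, the general form returns the full posterior distribution over the m events at once. A minimal sketch (the numbers are placeholders, not from the lecture):

```python
def bayes_posterior(priors, likelihoods):
    """Return P(B_j|A) for all j:
    P(B_j) * P(A|B_j) / sum_i P(B_i) * P(A|B_i)."""
    joint = [p * l for p, l in zip(priors, likelihoods)]  # P(A ∩ B_j)
    p_a = sum(joint)                                      # total probability P(A)
    return [j / p_a for j in joint]

# Hypothetical m = 3 mutually exclusive, exhaustive events
print(bayes_posterior([0.2, 0.5, 0.3], [0.1, 0.4, 0.6]))
# [0.05, 0.5, 0.45]
```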
Problem 1
Q. You might be interested in finding out a patient’s probability of having liver
disease if they are an alcoholic. “Being an alcoholic” is the test (kind of like a
litmus test) for liver disease.
• A could mean the event “Patient has liver disease.” Past data tells you that
10% of patients entering your clinic have liver disease. P(A) = 0.10.
• B could mean the litmus test that “Patient is an alcoholic.” Five percent of
the clinic’s patients are alcoholics. P(B) = 0.05.
• You might also know that among those patients diagnosed with liver
disease, 7% are alcoholics. This is your B|A: the probability that a patient
is alcoholic, given that they have liver disease, is 7%.
• Bayes’ theorem tells you:
P(A|B) = (0.07 * 0.1)/0.05 = 0.14

• In other words, if the patient is an alcoholic, their chance of having liver disease is 0.14 (14%). This is a large increase from the 10% suggested by past data. But it is still unlikely that any particular patient has liver disease.
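A one-line check of the arithmetic (a minimal sketch):

```python
p_a = 0.10          # P(A): prior probability of liver disease
p_b = 0.05          # P(B): fraction of patients who are alcoholics
p_b_given_a = 0.07  # P(B|A): alcoholics among liver-disease patients

print(p_b_given_a * p_a / p_b)  # P(A|B) ≈ 0.14
```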
Problem 2
Q. Probability that a lab (experiment) will detect life, for the case there is
life on Planet Fedra is P[B|A]=0.5
Probability that a lab will erroneously detect life, for the case there is no
life on Planet Fedra is P[B|Ac]=0.1.
Prior probability there is life on Planet Fedra is P[A] = 0.1. This applies before the lab experiment is performed. What happens after the experiment is done?
Soln:
Probability that a lab will detect life (using the total probability theorem):
P(B) = P(A)P(B|A) + P(Aᶜ)P(B|Aᶜ) = 0.1(0.5) + (1 - 0.1)(0.1) = 0.14
Using Bayes’ Theorem, probability there is life on Planet Fedra, given that
the lab detects life is given by:
P  A  P B | A 0.10.5
P A | B     0.357
P B  0.14
Problem 3
Q. A plane is missing. Assume that it is equally likely to have gone down in any of three possible regions. Let αᵢ denote the probability that the plane will not be found upon a search of region i when the plane is in fact in that region. What is the conditional probability that the plane is in region i, given that the search of region 1 was unsuccessful?
Soln:
Let Ri, i=1,2,3, be the event the plane is in region i, and let E be the
event that a search in region 1 was unsuccessful.
PR1  PE | R1  1 31
PR1 | E   3 
i 1PRi  PE | Ri  1 31  1 31  1 31
P  R2   P  E | R2  1 3 *1
P  R2 | E   3 
 i 1P  Ri   P  E | Ri  1 3 1  1 31  1 31
Problem 4
Q. Daily probability that a major earthquake occurs is P[E] = 10⁻⁵.
Probability that premonitory event A or B occurs, given that a major earthquake occurs, is P[A|E] = P[B|E] = 0.1
Probability that premonitory event A or B occurs, given that a major earthquake does not occur, is P[A|Eᶜ] = P[B|Eᶜ] = 0.001

(a) Determine the probability of a major earthquake, given that premonitory event A is observed.

Using Bayes' Theorem:

P(E|A) = [P(E) × P(A|E)] / [P(E) × P(A|E) + P(Eᶜ) × P(A|Eᶜ)]
       = (10⁻⁵ × 0.1) / [10⁻⁵ × 0.1 + (1 - 10⁻⁵) × 0.001] ≈ 10⁻³
Contd…..
(b) Also determine the probability of a major earthquake, given
that both premonitory events A and B are observed at the
same time.
Assume that both premonitory events A and B are independent.
P  A  B | E   P  A | E   P B | E   0.10.1
     
P A  B | E c  P A | E c  P B | E c  0.0010.001  10 6
Applying Bayes' Theorem
P E   P  A  B | E 
P E | A  B  
  
P E   P  A  B | E   P E c  P A  B | E c 

10 0.01
5
 0.09
10 5
0.01  1  10 10 
5 6
Random Variable
=> The measured (observed) value of a quantity (parameter) to be obtained in a random experiment (observation) is unknown in advance, and is treated as a variable known as a random variable.

Observed Value
=>Result of the measurement (observation)
Probability Distributions
• Shows how the total probability of a Random Variable is
allocated among its possible values using the:
– Probability mass function, PMF, for discrete Random
Variables, or
– Probability density function, PDF, for continuous Random
Variables.

             Probability Distribution             Cumulative Distribution
Discrete     probability mass function (pmf)      cumulative distribution function (cdf)
Continuous   probability density function (pdf)   cumulative distribution function (cdf)
Discrete Probability Distribution
• This distribution characterizes the probability associated with each possible value of a random variable X which takes discrete values in a finite sequence x1, x2, ....., xn or in an infinite sequence x1, x2, ...... Often discrete random variables take nonnegative integer values, as in the example below.

• Distributions of discrete data are characterized by probability mass functions p(x), where p(x) is the probability that the random variable X takes the value x:

p(x) = P(X = x),  with Σ_x P(X = x) = 1

[Figure: example pmf plotted over x = 0, 1, 2, 3, with bar heights such as 1/8 and 3/8]

• Cumulative Distribution Function

F(x) = P(X ≤ x) = Σ_{xi ≤ x} p(xi)

• Expected Value

E[X] = Σ_x x · p(x)

[Figure: the pmf p(x) and the corresponding cdf F(x), plotted side by side]
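The bar heights 1/8 and 3/8 over x = 0, 1, 2, 3 in the figure are consistent with a Binomial(3, 1/2) pmf; assuming that, a minimal sketch of the pmf, cdf, and expected value:

```python
from itertools import accumulate

# Assumed pmf over x = 0, 1, 2, 3 (matching the figure's 1/8 and 3/8 labels)
xs = [0, 1, 2, 3]
pmf = [1/8, 3/8, 3/8, 1/8]
assert abs(sum(pmf) - 1) < 1e-12        # total probability must be 1

cdf = list(accumulate(pmf))             # F(x) = P(X <= x), running sum of p(x)
expected = sum(x * p for x, p in zip(xs, pmf))  # E[X] = sum of x * p(x)

print(cdf)       # [0.125, 0.5, 0.875, 1.0]
print(expected)  # 1.5
```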
Continuous Probability Distribution
• A continuous random variable has an infinite (or nearly infinite) set of
possible values in a given interval
• Distributions of continuous data are characterized by probability density
functions f(x)
• For RVs that map to the integers or the real numbers, the cumulative distribution function (cdf) is a useful alternative representation.

∫ f(x) dx = 1  (integrated over the entire range of x)

[Figure: a probability density function f(x) plotted against x]
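The normalization ∫ f(x) dx = 1 can be verified numerically for any candidate density. A minimal sketch using the uniform density on [0, 2] as a stand-in example:

```python
# f(x) = 1/2 on [0, 2] and 0 elsewhere: a simple valid pdf
def f(x):
    return 0.5 if 0 <= x <= 2 else 0.0

# Midpoint Riemann sum approximating the integral of f over [0, 2]
n, a, b = 100_000, 0.0, 2.0
dx = (b - a) / n
integral = sum(f(a + (i + 0.5) * dx) for i in range(n)) * dx
print(integral)  # ≈ 1.0
```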
Jointly Distributed Random Variable
• Direct applications of probability may involve two or more random variables.
=> Example: Waiting time in queue may be considered as a composite of 'arrival time' and 'service time'.

• A major random variable may be described in terms of one or more parameters that are uncertain. These parameters are themselves additional random variables that further complicate probability evaluations.
=>Example: Relation between shear strength of spot weld and
diameter of spot weld.
Joint Probability Mass Function
pxi , y j   PX  xi , Y  y j 
• X an Y are both discrete random variables whose possible
values are, respectively, x1, x2, x3, ...... etc. and y1, y2, y2,
......etc.
• Individual probability mass functions of X and Y are easily
obtained from Joint probability mass function

P X  xi    PX  xi , Y  y j    pxi , y j 
j j

PY  y j    PX  xi , Y  y j    pxi , y j 


i i
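In code, the marginals are just row and column sums of the joint table. A minimal sketch with a made-up 2×3 joint pmf:

```python
# Hypothetical joint pmf p(x_i, y_j); rows index x, columns index y
joint = [
    [0.10, 0.20, 0.10],   # x = x1
    [0.25, 0.15, 0.20],   # x = x2
]

p_x = [sum(row) for row in joint]        # P(X = x_i): sum over j
p_y = [sum(col) for col in zip(*joint)]  # P(Y = y_j): sum over i

print(p_x)  # [0.4, 0.6]
print(p_y)  # [0.35, 0.35, 0.3]
```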
