Chapter 3: Discrete Random Variables and Probability Distributions
Example 1:
- Number of scratches on a surface
- Number of defective parts among 1000 tests
- Number of transmitted bits received in error.
3.1 Discrete Random Variables
Example 2:
A voice communication system for a business
contains 48 external lines. At a particular time,
the system is observed, and some of the lines are
being used. Let the random variable X denote
the number of lines in use. Then X can assume
any of the integer values 0 through 48. When the
system is observed, if 10 lines are in use, x = 10.
3.1 Discrete Random Variables
Example 3:
In a semiconductor manufacturing process, two wafers from a lot are
tested. Each wafer is classified as pass or fail. Assume that the
probability that a wafer passes the test is 0.8 and that the wafers are
independent. The sample space for the experiment and the associated
probabilities are shown in Table 1.
For example, because of the independence, the probability of the
outcome that the first wafer tested passes and the second wafer
tested fails, denoted as pf, is P(pf) = (0.8)(0.2) = 0.16.
Table 1. Sample space, probabilities, and values of X for the two-wafer test

Wafer 1   Wafer 2   Probability   X
Pass      Pass      0.64          2
Fail      Pass      0.16          1
Pass      Fail      0.16          1
Fail      Fail      0.04          0
3.2 Probability Distributions and Probability Mass Functions
Definition: For a discrete random variable X with possible values x_1, x_2, ..., x_n, a probability mass function f(x_i) is a function such that
(a) f(x_i) \ge 0
(b) \sum_{i=1}^{n} f(x_i) = 1
(c) f(x_i) = P(X = x_i)
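As a quick check that Table 1 defines a valid probability mass function, the short Python sketch below (my own illustration; the outcome labels and variable names are not from the slides) enumerates the four outcomes and verifies conditions (a)-(c).

```python
# Sketch: build the PMF of X (number of wafers that pass) from the two-wafer
# experiment and check the probability-mass-function conditions.
from itertools import product

p_pass = 0.8  # probability that a single wafer passes (given in Example 3)

pmf = {}
for w1, w2 in product(["pass", "fail"], repeat=2):
    prob = (p_pass if w1 == "pass" else 1 - p_pass) * \
           (p_pass if w2 == "pass" else 1 - p_pass)
    x = [w1, w2].count("pass")           # value of X for this outcome
    pmf[x] = pmf.get(x, 0.0) + prob      # (c) f(x) = P(X = x)

print(pmf)                                   # f(2) ~ 0.64, f(1) ~ 0.32, f(0) ~ 0.04
print(all(v >= 0 for v in pmf.values()))     # (a) every f(x) >= 0
print(abs(sum(pmf.values()) - 1) < 1e-12)    # (b) the probabilities sum to 1
```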
Example (Wafer Contamination):
Let the random variable X denote the number of
semiconductor wafers that need to be analyzed in
order to detect a large particle of contamination.
Assume that the probability that a wafer contains a
large particle is 0.01 and that the wafers are
independent.
Determine the probability distribution of X.
Solution:
Let p denote a wafer in which a large particle is
present, and let a denote a wafer in which it is absent.
The sample space of the experiment is infinite, and it
can be represented as all possible sequences that start
with a string of a’s and end with p. That is,
S = {p, ap, aap, aaap, aaaap, aaaaap, ...}
Consider a few special cases. We have P(X = 1) =
P(p) = 0.01. Also, using the independence
assumption,
P(X = 2) = P(ap) = 0.99(0.01) = 0.0099
Solution:
A general formula is
P(X = x) = P(aa...ap) = (0.99)^{x-1}(0.01), for x = 1, 2, 3, ...
Describing the probabilities associated with X in
terms of this formula is a simple method to define
the distribution of X in this example. Clearly f (x) ≥
0. The fact that the sum of the probabilities is 1 is
left as an exercise.
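A minimal sketch (my own, not from the slides) that evaluates the formula and shows the partial sums of the probabilities approaching 1:

```python
# Sketch: P(X = x) = 0.99**(x-1) * 0.01 for the wafer-contamination example.
p = 0.01  # probability that a wafer contains a large particle

def f(x: int) -> float:
    """P(X = x): the first large particle is found on wafer x."""
    return (1 - p) ** (x - 1) * p

print(f(1), f(2))                        # 0.01 and 0.0099, as in the example
print(sum(f(x) for x in range(1, 2001))) # partial sum, already very close to 1
```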
3.3 Cumulative Distribution Functions
(a) F(x) = P(X \le x) = \sum_{x_i \le x} f(x_i)
(b) 0 \le F(x) \le 1
(c) If x \le y, then F(x) \le F(y)
3.3 Cumulative Distribution Functions
Example 1: Determine the probability mass function of X from the
following cumulative distribution function:
F(x) = 0 for x < -2; 0.2 for -2 \le x < 0; 0.7 for 0 \le x < 2; 1 for 2 \le x.
Solution:
The probability mass function at each point is the change in the
cumulative distribution function at that point. Therefore,
f(-2) = 0.2 - 0 = 0.2
f(0) = 0.7 - 0.2 = 0.5
f(2) = 1.0 - 0.7 = 0.3
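The differencing step can be mirrored in a few lines; the sketch below is my own illustration, with the jump points taken from the CDF above.

```python
# Sketch: recover the PMF from the jumps of the cumulative distribution function.
jump_points = [-2, 0, 2]
F_after     = [0.2, 0.7, 1.0]   # value of F just after each jump

pmf, previous = {}, 0.0
for x, F in zip(jump_points, F_after):
    pmf[x] = F - previous        # the size of the jump at x is f(x)
    previous = F

print(pmf)   # f(-2) = 0.2, f(0) = 0.5, f(2) = 0.3 (up to floating-point rounding)
```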
3.3 Cumulative Distribution Functions
Example 2:
Suppose that a day’s production of 850 manufactured
parts contains 50 parts that do not conform to customer
requirements. Two parts are selected at random, without
replacement, from the batch. Let the random variable X
equal the number of nonconforming parts in the sample.
What is the cumulative distribution function of X ?
3.3 Cumulative Distribution Functions
Solution:
First find the probability mass function of X:
P(X = 0) = \frac{800}{850} \cdot \frac{799}{849} = 0.886
P(X = 1) = 2 \cdot \frac{800}{850} \cdot \frac{50}{849} = 0.111
P(X = 2) = \frac{50}{850} \cdot \frac{49}{849} = 0.003
3.3 Cumulative Distribution Functions
Solution:
Therefore,
F(0) = P(X \le 0) = 0.886
F(1) = P(X \le 1) = 0.886 + 0.111 = 0.997
F(2) = P(X \le 2) = 1
The cumulative distribution function of X is
F(x) = 0 for x < 0; 0.886 for 0 \le x < 1; 0.997 for 1 \le x < 2; 1 for 2 \le x.
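The same numbers can be reproduced exactly with fraction arithmetic; the sketch below is my own illustration of the calculation (two parts drawn without replacement from 850 parts, 50 of them nonconforming).

```python
# Sketch: PMF and CDF of X = number of nonconforming parts in a sample of two,
# drawn without replacement from 850 parts of which 50 are nonconforming.
from fractions import Fraction

good, bad = 800, 50
total = good + bad

pmf = {
    0: Fraction(good, total) * Fraction(good - 1, total - 1),
    1: 2 * Fraction(good, total) * Fraction(bad, total - 1),
    2: Fraction(bad, total) * Fraction(bad - 1, total - 1),
}

cdf, running = {}, Fraction(0)
for x in sorted(pmf):
    running += pmf[x]
    cdf[x] = running

print({x: round(float(v), 3) for x, v in pmf.items()})  # 0.886, 0.111, 0.003
print({x: round(float(v), 3) for x, v in cdf.items()})  # 0.886, 0.997, 1.0
```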
3.4 Mean and Variance of a Discrete Random Variable
3.4.1 Mean of a Discrete Random Variable
Definition: The mean or expected value of the discrete random variable X, denoted \mu or E(X), is
\mu = E(X) = \sum_{x_i} x_i f(x_i)
3.4.2 Variance of a Discrete Random Variable
Definition: The variance of the discrete random variable X, denoted \sigma^2 or V(X), is
\sigma^2 = V(X) = E(X - \mu)^2 = \sum_{x_i} (x_i - \mu)^2 f(x_i) = \sum_{x_i} x_i^2 f(x_i) - \mu^2
The standard deviation of X is \sigma = \sqrt{\sigma^2}.
3.4 Mean and Variance of a Discrete
Random Variable
The properties of mean and variance
a/ Let Y = aX + b. Then
E(Y) = aE(X) + b
V(Y) = V(aX + b) = a^2 V(X)
b/
3.4 Mean and Variance of a Discrete
Random Variable
Example: The number of messages sent per hour over
a computer network has the following distribution:
x (number of messages)   10     11     12     13     14     15
f(x)                     0.08   0.15   0.30   0.20   0.20   0.07
Find the mean and variance of the number of messages sent per hour.
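A direct computation of the mean and variance from this table, as a sketch of the definitions in Section 3.4 (my own code, not from the slides):

```python
# Sketch: mean and variance of the number of messages sent per hour.
xs = [10, 11, 12, 13, 14, 15]
fx = [0.08, 0.15, 0.30, 0.20, 0.20, 0.07]

mean = sum(x * f for x, f in zip(xs, fx))                  # E(X) = 12.5
var  = sum(x * x * f for x, f in zip(xs, fx)) - mean ** 2  # V(X) = 1.85 (up to rounding)
print(mean, var)
```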
3.5 Discrete Uniform Distribution
Definition: A random variable X has a discrete uniform distribution if each of the n values in its range, say x_1, x_2, ..., x_n, has equal probability. Then f(x_i) = 1/n.
Example 1:
The first digit of a part’s serial number is equally
likely to be any one of the digits 0 through 9. If
one part is selected from a large batch and X is
the first digit of the serial number, X has a
discrete uniform distribution with probability 0.1 for
each value in R = {0, 1, 2, ..., 9}. That is,
f(x) = 0.1
for each value in R. The probability mass function
of X is shown in Fig 1.
3.5 Discrete Uniform Distribution
For a discrete uniform random variable over the consecutive integers a, a+1, ..., b,
\mu = E(X) = \frac{b + a}{2} and \sigma^2 = V(X) = \frac{(b - a + 1)^2 - 1}{12}
Example 2: Suppose the number X of the 48 voice lines in use at a particular time (Example 2 of Section 3.1) is a discrete uniform random variable on 0, 1, ..., 48. Then
E(X) = \frac{48 + 0}{2} = 24
\sigma = \sqrt{\frac{(48 - 0 + 1)^2 - 1}{12}} = 14.14
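The closed-form values can be checked by brute force over the range 0, 1, ..., 48; the sketch below is my own illustration.

```python
# Sketch: brute-force check of the discrete uniform mean and standard deviation
# for X uniform on 0, 1, ..., 48 (the 48-line voice system).
import math

a, b = 0, 48
values = range(a, b + 1)
p = 1 / len(values)                               # each value has probability 1/49

mean = sum(x * p for x in values)                 # (b + a) / 2 = 24
var  = sum((x - mean) ** 2 * p for x in values)   # ((b - a + 1)**2 - 1) / 12 = 200
print(mean, math.sqrt(var))                       # 24.0 and about 14.14
```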
3.6 Binomial Distribution
Definition: A random experiment consists of n
Bernoulli trials such that
(a) The trials are independent.
(b) Each trial results in only two possible outcomes, labeled "success" and "failure".
(c) The probability of a success in each trial, denoted p, remains constant.
The random variable X that equals the number of successes in the n trials is a binomial random variable with parameters 0 < p < 1 and n = 1, 2, .... The probability mass function of X is
f(x) = \binom{n}{x} p^x (1 - p)^{n - x}, x = 0, 1, ..., n
3.6 Binomial Distribution
Mean and Variance
If X is a binomial random variable with parameters p and n, then
\mu = E(X) = np
\sigma^2 = V(X) = np(1 - p)
3.6 Binomial Distribution
Example 1: Consider the following random
experiments and random variables:
1. Flip a coin 10 times. Let X = number of heads
obtained.
2. A worn machine tool produces 1% defective
parts. Let X = number of defective parts in the
next 25 parts produced.
3. Each sample of air has a 10% chance of
containing a particular rare molecule. Let X = the
number of air samples that contain the rare
molecule in the next 18 samples analyzed.
3.6 Binomial Distribution
Example 1:
4. Of all bits transmitted through a digital transmission
channel, 10% are received in error. Let X = the number of
bits in error in the next five bits transmitted.
5. A multiple-choice test contains 10 questions, each with
four choices, and you guess at each question. Let X = the
number of questions answered correctly.
6. In the next 20 births at a hospital, let X = the number of
female births.
7. Of all patients suffering a particular illness, 35%
experience improvement from a particular medication. In
the next 100 patients administered the medication, let X =
the number of patients who experience improvement.
3.6 Binomial Distribution
Example 2: Each sample of air has a 10% chance of containing a particular rare molecule (item 3 of Example 1). Let X denote the number of samples that contain the rare molecule in the next 18 samples analyzed.
a/ What is the probability that exactly 2 of the samples contain the molecule?
b/ What is the probability that at least 4 of the samples contain the molecule?
c/ What is the probability that 3 \le X < 7?
Solution:
a/ P(X = 2) = \binom{18}{2} (0.1)^2 (0.9)^{16} = 0.284
3.6 Binomial Distribution
Solution:
b/ The requested probability is
P(X \ge 4) = \sum_{x=4}^{18} \binom{18}{x} (0.1)^x (0.9)^{18 - x}
= \binom{18}{4} (0.1)^4 (0.9)^{14} + \binom{18}{5} (0.1)^5 (0.9)^{13} + ... + \binom{18}{18} (0.1)^{18} (0.9)^0
3.6 Binomial Distribution
However, it is easier to use the complementary event:
P(X \ge 4) = 1 - P(X < 4) = 1 - \sum_{x=0}^{3} \binom{18}{x} (0.1)^x (0.9)^{18 - x}
= 1 - [\binom{18}{0} (0.1)^0 (0.9)^{18} + ... + \binom{18}{3} (0.1)^3 (0.9)^{15}]
= 1 - 0.902 = 0.098
c/ P(3 \le X < 7) = \sum_{x=3}^{6} \binom{18}{x} (0.1)^x (0.9)^{18 - x}
= 0.168 + 0.070 + 0.022 + 0.005 = 0.265
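These binomial sums are easy to verify numerically; the sketch below (my own, using only the standard library) reproduces parts a/, b/, and c/ of Example 2.

```python
# Sketch: binomial probabilities for n = 18 air samples, p = 0.1.
from math import comb

n, p = 18, 0.1

def binom_pmf(x: int) -> float:
    return comb(n, x) * p**x * (1 - p) ** (n - x)

print(binom_pmf(2))                             # a/ about 0.284
print(1 - sum(binom_pmf(x) for x in range(4)))  # b/ P(X >= 4), about 0.098
print(sum(binom_pmf(x) for x in range(3, 7)))   # c/ P(3 <= X < 7), about 0.265
```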
3.6 Binomial Distribution
Practical Interpretation:
Binomial random variables are used to model many
physical systems and probabilities for all such
models can be obtained from the binomial
probability mass function.
3.6 Binomial Distribution
Example 3: The phone lines to an airline reservation
system are occupied 40% of the time. Assume that
the events that the lines are occupied on successive
calls are independent. Assume that 10 calls are
placed to the airline.
(a) What is the probability that for exactly
three calls, the lines are occupied ?
(b) What is the probability that for at least one call, the
lines are not occupied ?
(c) What is the expected number of calls in which the
lines are all occupied?
3.6 Binomial Distribution
Solution:
(a) 0.215   (b) 1 - (0.4)^{10} = 0.9999   (c) E(X) = 10(0.4) = 4
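A quick numerical check of these answers (my own sketch), with X = number of calls that find the lines occupied, n = 10 and p = 0.4:

```python
# Sketch: check Example 3 with n = 10 calls and p = 0.4.
from math import comb

n, p = 10, 0.4

def pmf(x: int) -> float:
    return comb(n, x) * p**x * (1 - p) ** (n - x)

print(pmf(3))        # (a) P(exactly 3 calls find the lines occupied), about 0.215
print(1 - pmf(10))   # (b) P(at least one call finds the lines not occupied), about 0.9999
print(n * p)         # (c) expected number of calls with the lines occupied = 4.0
```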
3.7 Geometric and Negative Binomial
Distributions
3.7.1 Geometric distribution
Definition:
In a series of Bernoulli trials (independent trials
with constant probability p of a success), let the
random variable X denote the number of trials until
the first success. Then X is a geometric random
variable with parameter 0 < p < 1, and
f(x) = P(X = x) = (1 - p)^{x - 1} p, x = 1, 2, ...
3.7 Geometric and Negative Binomial
Distributions
Mean and Variance: If X is a geometric random variable with parameter p, then
\mu = E(X) = \frac{1}{p}
\sigma^2 = V(X) = \frac{1 - p}{p^2}
3.7 Geometric and Negative Binomial
Distributions
Example:
The probability that a wafer contains a large
particle of contamination is 0.01. If it is assumed
that the wafers are independent, what is the
probability that exactly 125 wafers need to be
analyzed before a large particle is detected ?
3.7 Geometric and Negative Binomial Distributions
Solution:
Let X denote the number of samples analyzed until a large particle is detected. Then X is a geometric random variable with p = 0.01.
The probability that exactly 125 wafers need to be analyzed is
P(X = 125) = (0.99)^{124} (0.01) = 0.0029
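The same number follows directly from the geometric PMF; a minimal sketch (my own) is below.

```python
# Sketch: geometric probability that the first large particle is found
# on exactly the 125th wafer, with p = 0.01 per wafer.
p = 0.01
x = 125
print((1 - p) ** (x - 1) * p)   # about 0.0029
print(1 / p)                    # on average, 100 wafers are analyzed (E(X) = 1/p)
```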
3.7 Geometric and Negative Binomial
Distributions
3.7.2 Negative Binomial Distribution
Definition:
A generalization of a geometric distribution in
which the random variable is the number of
Bernoulli trials required to obtain r successes
results in the negative binomial distribution.
3.7 Geometric and Negative Binomial
Distributions
3.7.2 Negative Binomial Distribution
Definition: In a series of Bernoulli trials (independent trials with constant probability p of a success), let the random variable X denote the number of trials until r successes occur. Then X is a negative binomial random variable with parameters 0 < p < 1 and r = 1, 2, 3, ..., and
f(x) = P(X = x) = \binom{x - 1}{r - 1} (1 - p)^{x - r} p^r, x = r, r + 1, r + 2, ...
3.7 Geometric and Negative Binomial
Distributions
3.7.2 Negative Binomial Distribution
If X is a negative binomial random variable with parameters p and r, then
\mu = E(X) = \frac{r}{p}
\sigma^2 = V(X) = \frac{r(1 - p)}{p^2}
3.7 Geometric and Negative Binomial
Distributions
Example 1:
The probability that a camera passes the test is 0.8,
and the cameras perform independently.
a/ What is the probability that the third failure is obtained on the sixth test?
b/ What is the probability that the third failure is obtained in fewer than five tests?
3.7 Geometric and Negative Binomial
Distributions
Solution:
Let X is the number of camera tested until the third
failure.
a/ P X 6 C2 0.8 .0.2 0.04096
5 3 3
b/ P X 5 P X 3 P X 4
0.23 C23 0.23.0.81
0.0272
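The calculation can be checked with a small negative binomial helper; in the sketch below (my own), the event being counted is a failing camera, with probability 0.2.

```python
# Sketch: X = number of cameras tested until the 3rd failing camera.
# Each camera fails the test with probability 0.2, independently.
from math import comb

q = 0.2   # probability that a camera fails the test (the event being counted)
r = 3     # we wait for the 3rd failing camera

def nbinom_pmf(x: int) -> float:
    """P(X = x) = C(x-1, r-1) * q**r * (1-q)**(x-r)."""
    return comb(x - 1, r - 1) * q**r * (1 - q) ** (x - r)

print(nbinom_pmf(6))                        # a/ about 0.041
print(sum(nbinom_pmf(x) for x in (3, 4)))   # b/ P(X < 5), about 0.0272
```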
3.7 Geometric and Negative Binomial
Distributions
Example 2:
The probability of a successful optical alignment
in the assembly of an optical data storage product
is 0.8. Assume that the trials are independent.
a/ What is the probability that the first successful
alignment requires exactly four trials ?
b/ What is the probability that the first successful
alignment requires at most four trials ?
c/ What is the probability that the first successful
alignment requires at least four trials ?
3.7 Geometric and Negative Binomial
Distributions
Solution: Let X denote the number of trials until the first successful alignment; X is geometric with p = 0.8.
a/ P(X = 4) = (0.2)^3 (0.8) = 0.0064
b/ P(X \le 4) = 1 - (0.2)^4 = 0.9984
c/ P(X \ge 4) = (0.2)^3 = 0.008
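A check of the three answers with the geometric PMF, p = 0.8 (my own sketch):

```python
# Sketch: X = number of trials until the first successful alignment, p = 0.8.
p = 0.8

def geom_pmf(x: int) -> float:
    return (1 - p) ** (x - 1) * p

print(geom_pmf(4))                                # a/ about 0.0064
print(sum(geom_pmf(x) for x in range(1, 5)))      # b/ P(X <= 4), about 0.9984
print(1 - sum(geom_pmf(x) for x in range(1, 4)))  # c/ P(X >= 4), about 0.008
```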
3.8 Hypergeometric Distribution
Definition: A set of N objects contains
K objects classified as successes and
N - K objects classified as failures.
A sample of size n objects is selected randomly (without replacement) from the N objects, where K \le N and n \le N.
The random variable X that equals the number of successes in the sample is a hypergeometric random variable, and
f(x) = P(X = x) = \frac{\binom{K}{x} \binom{N - K}{n - x}}{\binom{N}{n}},
where x ranges from max\{0, n + K - N\} to min\{K, n\}.
3.8 Hypergeometric Distribution
Mean and Variance
If X is a hypergeometric random variable with parameters N, K, and n, then
\mu = E(X) = np
\sigma^2 = V(X) = np(1 - p) \frac{N - n}{N - 1}
where p = K/N.
3.8 Hypergeometric Distribution
Example: A batch of parts contains 100 parts from a local supplier and 200 parts from a supplier in the next state. Four parts are selected randomly, without replacement. Let X denote the number of parts in the sample from the local supplier.
a/ What is the probability that all four parts in the sample are from the local supplier?
b/ What is the probability that two or more parts in the sample are from the local supplier?
c/ What is the probability that at least one part in the sample is from the local supplier?
Solution: X is a hypergeometric random variable with N = 300, K = 100, and n = 4.
a/ P(X = 4) = \frac{\binom{100}{4} \binom{200}{0}}{\binom{300}{4}} = 0.0119
b/ P(X \ge 2) = P(X = 2) + P(X = 3) + P(X = 4) = 0.407
c/ P(X \ge 1) = 1 - P(X = 0) = 0.804
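A numerical check of parts a/-c/ with math.comb (my own sketch; N = 300 parts in total, K = 100 from the local supplier, a sample of n = 4):

```python
# Sketch: hypergeometric check for N = 300 parts, K = 100 local, sample size n = 4.
from math import comb

N, K, n = 300, 100, 4

def hyper_pmf(x: int) -> float:
    return comb(K, x) * comb(N - K, n - x) / comb(N, n)

print(hyper_pmf(4))                          # a/ about 0.0119
print(sum(hyper_pmf(x) for x in (2, 3, 4)))  # b/ about 0.407
print(1 - hyper_pmf(0))                      # c/ about 0.804
```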
3.8 Hypergeometric Distribution
Finite Population Correction Factor
The term
\frac{N - n}{N - 1}
in the variance of a hypergeometric random variable is called the finite population correction factor.
3.8 Hypergeometric Distribution
e x
f x P X x , x 0,1,2,....
x!
E X ; 2 V X
3.9 Poisson Distribution
Example 1:
For the case of the thin copper wire, suppose that
the number of flaws follows a Poisson distribution
with a mean of 2.3 flaws per millimeter.
a/ Determine the probability of exactly two flaws
in 1 millimeter of wire.
b/ Determine the probability of 10 flaws in 5
millimeters of wire.
c/ Determine the probability of at least one flaw in
2 millimeters of wire.
3.9 Poisson Distribution
Solution:
a/ Let X denote the number of flaws in 1 millimeter of wire. Then \lambda = 2.3 and
P(X = 2) = \frac{e^{-2.3} (2.3)^2}{2!} = 0.265
b/ Let X denote the number of flaws in 5 millimeters of wire. Then \lambda = 5(2.3) = 11.5 and
P(X = 10) = \frac{e^{-11.5} (11.5)^{10}}{10!} = 0.113
c/ Let X denote the number of flaws in 2 millimeters of wire. Then \lambda = 2(2.3) = 4.6 and
P(X \ge 1) = 1 - P(X = 0) = 1 - e^{-4.6} = 0.9899
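The three probabilities can be verified directly from the Poisson PMF (my own sketch):

```python
# Sketch: Poisson check for the copper-wire flaw example (2.3 flaws per millimeter).
from math import exp, factorial

def poisson_pmf(x: int, lam: float) -> float:
    return exp(-lam) * lam**x / factorial(x)

print(poisson_pmf(2, 2.3))      # a/ about 0.265  (1 millimeter of wire)
print(poisson_pmf(10, 11.5))    # b/ about 0.113  (5 millimeters of wire)
print(1 - poisson_pmf(0, 4.6))  # c/ about 0.99   (2 millimeters of wire)
```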
3.9 Poisson Distribution
Example 2
The number of telephone calls that arrive at a
phone exchange is often modeled as a Poisson
random variable. Assume that on the average
there are 10 calls per hour.
a/ What is the probability that there are exactly 5
calls in one hour ?
b/ What is the probability that there are 3 or
fewer calls in one hour ?
3.9 Poisson Distribution
Example 2
c/ What is the probability that there are exactly 15
calls in two hours ?
d/ What is the probability that there are exactly 5
calls in 30 minutes ?
3.9 Poisson Distribution
Solution:
a/ \lambda = 10, P(X = 5) = \frac{e^{-10} 10^5}{5!} = 0.0378
b/ \lambda = 10, P(X \le 3) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) = 0.0103
c/ \lambda = 20 for two hours, P(X = 15) = \frac{e^{-20} 20^{15}}{15!} = 0.0516
d/ \lambda = 5 for 30 minutes, P(X = 5) = \frac{e^{-5} 5^5}{5!} = 0.175
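The same helper verifies the telephone-call probabilities; part c/ uses \lambda = 20 for a two-hour interval and part d/ uses \lambda = 5 for 30 minutes (my own sketch).

```python
# Sketch: Poisson check for the telephone-call example (10 calls per hour on average).
from math import exp, factorial

def poisson_pmf(x: int, lam: float) -> float:
    return exp(-lam) * lam**x / factorial(x)

print(poisson_pmf(5, 10))                         # a/ about 0.0378
print(sum(poisson_pmf(x, 10) for x in range(4)))  # b/ about 0.0103
print(poisson_pmf(15, 20))                        # c/ about 0.0516
print(poisson_pmf(5, 5))                          # d/ about 0.175
```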