
Chapter 3:

DISCRETE RANDOM VARIABLES
AND
PROBABILITY DISTRIBUTIONS
Chapter outline
3.1 Discrete Random Variables
3.2 Probability Distributions and Probability Mass
Functions
3.3 Cumulative Distribution Functions
3.4 Mean and Variance of a Discrete Random Variable
3.5 Discrete Uniform Distribution
3.6 Binomial Distribution
3.7 Geometric and Negative Binomial Distributions
3.8 Hypergeometric Distribution
3.9 Poisson Distribution
Learning Objectives
After careful study of this chapter, you should be able to do the
following:
1. Determine probabilities from probability mass functions and the
reverse
2. Determine probabilities and probability mass functions from
cumulative distribution functions and the reverse
3. Calculate means and variances for discrete random variables
4. Understand the assumptions for some common discrete
probability distributions
5. Select an appropriate discrete probability distribution to
calculate probabilities in specific applications
6. Calculate probabilities and determine means and variances for
some common discrete probability distributions
3.1 Discrete Random Variables
Definition:
A discrete random variable is a random variable
with a finite (or countably infinite) range.

Example 1:
- Number of scratches on a surface
- Number of defective parts among 1000 tests
- Number of transmitted bits received in error.
3.1 Discrete Random Variables

Example 2:
A voice communication system for a business
contains 48 external lines. At a particular time,
the system is observed, and some of the lines are
being used. Let the random variable X denote
the number of lines in use. Then X can assume
any of the integer values 0 through 48. When the
system is observed, if 10 lines are in use, x = 10.
3.1 Discrete Random Variables
Example 3:
In a semiconductor manufacturing process, two wafers from a lot are
tested. Each wafer is classified as pass or fail. Assume that the
probability that a wafer passes the test is 0.8 and that wafers are
independent. The sample space for the experiment and associated
probabilities are shown in Table 1
For example, because of the independence, the probability of the
outcome that the first wafer tested passes and the second wafer
tested fails, denoted as pf, is P(pf) = 0.8 × 0.2 = 0.16.
Wafer 1   Wafer 2   Probability   X
Pass      Pass      0.64          2
Fail      Pass      0.16          1
Pass      Fail      0.16          1
Fail      Fail      0.04          0

Table 1: Wafer Tests
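The distribution of X in Table 1 can also be reproduced by enumerating the sample space directly. The following is a minimal sketch in Python (an illustration, not part of the original slides); the pass probability 0.8 and the mapping from outcomes to X are taken from the example.

```python
from itertools import product

p_pass = 0.8  # probability that a single wafer passes the test

# Enumerate the four (wafer 1, wafer 2) outcomes and accumulate P(X = x),
# where X counts the number of wafers that pass.
dist = {0: 0.0, 1: 0.0, 2: 0.0}
for w1, w2 in product(["pass", "fail"], repeat=2):
    prob = ((p_pass if w1 == "pass" else 1 - p_pass)
            * (p_pass if w2 == "pass" else 1 - p_pass))
    x = (w1 == "pass") + (w2 == "pass")
    dist[x] += prob

print(dist)  # {0: 0.04, 1: 0.32, 2: 0.64}, up to floating-point rounding
```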


3.1 Discrete Random Variables
Example 4:
Define the random variable X to be the number of
contamination particles on a wafer in semiconductor
manufacturing. Although wafers possess a number of
characteristics, the random variable X summarizes the wafer
only in terms of the number of particles.
The possible values of X are integers from 0 up to
some large value that represents the maximum number
of particles that can be found on one of the wafers. If
this maximum number is very large, we might simply
assume that the range of X is the set of integers from 0
to +∞.
Note that more than one random variable can be
defined on a sample space.
3.2 Probability Distributions and
Probability Mass Functions
3.2.1 Probability Distributions
Definition:
The probability distribution of a random variable
X is a description of the probabilities associated
with the possible values of X.

For a discrete random variable, the distribution is often
specified by just a list of the possible values along with
the probability of each. In some cases, it is convenient to
express the probability in terms of a formula.
3.2 Probability Distributions and
Probability Mass Functions
Example 1 (Digital Channel):
There is a chance that a bit transmitted through a digital
transmission channel is received in error. Assume that the
probability that a bit is received in error is 0.1.
Let X equal the number of bits in error in the next four bits
transmitted. The possible values for X are {0,1,2,3,4}.
Suppose that:
P(X=0) = 0.6561; P(X=1) = 0.2916; P(X=2) = 0.0486
P(X=3) = 0.0036; P(X=4) = 0.0001.
The probability distribution of X is specified by the
possible values along with the probability of each.
3.2 Probability Distributions and
Probability Mass Functions

Figure: Probability distribution for bits in error.
3.2 Probability Distributions and
Probability Mass Functions
3.2.2 Probability Mass Functions
Definition: For a discrete random variable X with
possible values x1, x2, ..., xn, a probability mass
function is a function such that
(a) f(xi) ≥ 0
(b) Σ f(xi) = 1, where the sum is over i = 1, ..., n
(c) f(xi) = P(X = xi)
Example (Wafer Contamination):
Let the random variable X denote the number of
semiconductor wafers that need to be analyzed in
order to detect a large particle of contamination.
Assume that the probability that a wafer contains a
large particle is 0.01 and that the wafers are
independent.
Determine the probability distribution of X.
Solution:
Let p denote a wafer in which a large particle is
present, and let a denote a wafer in which it is absent.
The sample space of the experiment is infinite, and it
can be represented as all possible sequences that start
with a string of a’s and end with p. That is,
s = {p, ap, aap, aaap, aaaap, aaaaap, and so forth}
Consider a few special cases. We have P(X = 1) =
P(p) = 0.01. Also, using the independence
assumption,
P(X = 2) = P(ap) = 0.99(0.01) = 0.0099
Solution:
A general formula is
P(X = x) = P(aa...ap) = (0.99)^(x−1)(0.01), for x = 1, 2, 3, ...
Describing the probabilities associated with X in
terms of this formula is a simple method to define
the distribution of X in this example. Clearly f (x) ≥
0. The fact that the sum of the probabilities is 1 is
left as an exercise.
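As a quick check of this formula, the probabilities can be evaluated numerically. The sketch below (an illustration, not part of the original slides) computes a few values of P(X = x) and shows that the probabilities sum to essentially 1 over a long range of x.

```python
p = 0.01  # probability that a wafer contains a large particle

def pmf(x: int) -> float:
    """P(X = x) = (1 - p)^(x - 1) * p for x = 1, 2, 3, ..."""
    return (1 - p) ** (x - 1) * p

print(pmf(1))  # 0.01
print(pmf(2))  # 0.0099
print(sum(pmf(x) for x in range(1, 2001)))  # ~0.999999998, approaching 1
```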
3.3 Cumulative Distribution Functions

Definition: The cumulative distribution function of
a discrete random variable X, denoted as F(x), is
F(x) = P(X ≤ x) = Σ f(xi), summed over all xi ≤ x.
It satisfies the following properties:
(a) F(x) = P(X ≤ x) = Σ_{xi ≤ x} f(xi)
(b) 0 ≤ F(x) ≤ 1
(c) If x ≤ y, then F(x) ≤ F(y)
3.3 Cumulative Distribution Functions

Example 1 (Cumulative Distribution Function)
Determine the probability mass function of X from
the following cumulative distribution function:
F(x) = 0      for x < −2
     = 0.2    for −2 ≤ x < 0
     = 0.7    for 0 ≤ x < 2
     = 1      for 2 ≤ x

Figure: Cumulative distribution function of X for Example 1.
3.3 Cumulative Distribution Functions

Solution:
The probability mass function at each point is the
change in the cumulative distribution function at the
point. Therefore,
f  2  0.2  0 0.2
f 0  0.7  0.2 0.5
f 2  1.0  0.7 0.3
3.3 Cumulative Distribution Functions
Example 2:
Suppose that a day’s production of 850 manufactured
parts contains 50 parts that do not conform to customer
requirements. Two parts are selected at random, without
replacement, from the batch. Let the random variable X
equal the number of nonconforming parts in the sample.
What is the cumulative distribution function of X ?
3.3 Cumulative Distribution Functions
Solution:
First, find the probability mass function of X.
P(X = 0) = (800/850)(799/849) ≈ 0.886
P(X = 1) = 2(800/850)(50/849) ≈ 0.111
P(X = 2) = (50/850)(49/849) ≈ 0.003
3.3 Cumulative Distribution Functions
Solution:
Therefore, F(0) = P(X ≤ 0) = 0.886
F(1) = P(X ≤ 1) = 0.886 + 0.111 = 0.997
F(2) = P(X ≤ 2) = 1
The cumulative distribution function of X is
F(x) = 0      for x < 0
     = 0.886  for 0 ≤ x < 1
     = 0.997  for 1 ≤ x < 2
     = 1      for 2 ≤ x
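Because the sampling is without replacement, these values also follow from counting with binomial coefficients. A minimal sketch (illustrative, assuming Python 3.8+ for math.comb):

```python
from math import comb

N, K, n = 850, 50, 2  # batch size, nonconforming parts, sample size

def pmf(x: int) -> float:
    """P(X = x): choose x nonconforming and n - x conforming parts."""
    return comb(K, x) * comb(N - K, n - x) / comb(N, n)

F = 0.0
for x in range(n + 1):
    F += pmf(x)
    print(f"F({x}) = {F:.3f}")  # F(0) = 0.886, F(1) = 0.997, F(2) = 1.000
```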
3.3 Cumulative Distribution Functions

Figure: Cumulative distribution function for Example 2.
3.4 Mean and Variance of a Discrete
Random Variable
3.4.1 Mean:
Definition: The mean or expected value of the
discrete random variable X, denoted as μ or E(X), is
μ = E(X) = Σ xi f(xi), summed over all possible values xi.
3.4 Mean and Variance of a Discrete
Random Variable
3.4.2 Variance of a Discrete Random Variable
Definition: The variance of the discrete random
variable X, denoted as σ² or V(X), is
σ² = V(X) = E[(X − μ)²] = Σ (xi − μ)² f(xi)
          = Σ xi² f(xi) − μ²
where the sums are over all possible values xi.
The standard deviation of X is σ = √σ².
3.4 Mean and Variance of a Discrete
Random Variable
The properties of mean and variance
a/ Let Y aX  b
E Y  aE  X   b
V Y  V aX  b  a V  X 
2

b/
3.4 Mean and Variance of a Discrete
Random Variable
Example: The number of messages sent per hour over
a computer network has the following distribution:

x = number of messages   10     11     12     13     14     15
f(x)                     0.08   0.15   0.30   0.20   0.20   0.07

Determine the mean and standard deviation of the
number of messages sent per hour.
3.4 Mean and Variance of a Discrete
Random Variable
Solution
x = number of messages   10     11     12     13     14     15
f(x)                     0.08   0.15   0.30   0.20   0.20   0.07

E(X) = 10(0.08) + 11(0.15) + ... + 15(0.07) = 12.5
V(X) = 10²(0.08) + 11²(0.15) + ... + 15²(0.07) − 12.5² = 1.85
σ = √V(X) = √1.85 ≈ 1.36
3.4 Mean and Variance of a Discrete
Random Variable
Example 2:

Figure 1: A probability distribution can be viewed as a loading with the mean equal to the balance point. Parts (a) and (b) illustrate equal means, but Part (a) illustrates a larger variance.
3.4 Mean and Variance of a Discrete
Random Variable
Example 3:

Figure 2: The probability distributions illustrated in Parts (a) and (b) differ even though they have equal means and equal variances.
3.5 Discrete Uniform Distribution
Definition
A random variable X has a discrete uniform
distribution if each of the n values in its range, say,
x1, x2, ..., xn, has equal probability. Then
f(xi) = 1/n
3.5 Discrete Uniform Distribution

Mean and Variance
Suppose X is a discrete uniform random variable on
the consecutive integers a, a+1, a+2, ..., b, for a ≤ b.
The mean of X is
μ = E(X) = (b + a)/2
The variance of X is
σ² = [(b − a + 1)² − 1]/12
3.5 Discrete Uniform Distribution
Example 1:
The first digit of a part’s serial number is equally
likely to be any one of the digits 0 through 9. If
one part is selected from a large batch and X is
the first digit of the serial number, X has a
discrete uniform distribution with probability 0.1
for each value in R  0,1, 2,...,9  is,
. That
f  x  0.1
for each value in R. The probability mass function
of X is shown in Fig 1.
3.5 Discrete Uniform Distribution

Figure 1: Probability mass function for a discrete uniform random variable.
3.5 Discrete Uniform Distribution
Example 2:
Let the random variable X denote the number of
voice lines that are in use at a particular time.
Assume that X is a discrete uniform random
variable with a range of 0 to 48. Then
E(X) = (48 + 0)/2 = 24
σ = √{[(48 − 0 + 1)² − 1]/12} ≈ 14.14
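The two formulas can be evaluated directly; the sketch below (illustrative) applies them to the voice-line example with a = 0 and b = 48.

```python
from math import sqrt

a, b = 0, 48  # X is uniform on the consecutive integers a, a+1, ..., b

mean = (b + a) / 2
var = ((b - a + 1) ** 2 - 1) / 12
std = sqrt(var)

print(mean, round(std, 2))  # 24.0, 14.14
```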
3.6 Binomial Distribution
Definition: A random experiment consists of n
Bernoulli trials such that
(a) The trials are independent.
(b) Each trial results in only two possible
outcomes, labeled as “ success” and “failure”.
(c) The probability of a success in each trial,
denoted as p, remains constant.
The random variable X that equals the number of
trials that result in a success is a binomial random
variable with parameters 0 < p < 1 and n = 1, 2, ....
The probability mass function of X is
f(x) = C(n, x) p^x (1 − p)^(n−x),  x = 0, 1, ..., n
where C(n, x) = n!/[x!(n − x)!].
3.6 Binomial Distribution
Mean and Variance
If X is a binomial random variable with
parameters p and n,
 E  X  np

 2 V  X  np 1  p 
3.6 Binomial Distribution
Example 1: Consider the following random
experiments and random variables:
1. Flip a coin 10 times. Let X = number of heads
obtained.
2. A worn machine tool produces 1% defective
parts. Let X = number of defective parts in the
next 25 parts produced.
3. Each sample of air has a 10% chance of
containing a particular rare molecule. Let X = the
number of air samples that contain the rare
molecule in the next 18 samples analyzed.
3.6 Binomial Distribution
Example 1:
4. Of all bits transmitted through a digital transmission
channel, 10% are received in error. Let X = the number of
bits in error in the next five bits transmitted.
5. A multiple-choice test contains 10 questions, each with
four choices, and you guess at each question. Let X = the
number of questions answered correctly.
6. In the next 20 births at a hospital, let X = the number of
female births.
7. Of all patients suffering a particular illness, 35%
experience improvement from a particular medication. In
the next 100 patients administered the medication, let X =
the number of patients who experience improvement.
3.6 Binomial Distribution

Figure: Binomial distributions for selected values of n and p.


3.6 Binomial Distribution
Example 2: Each sample of water has a 10% chance
of containing a particular organic pollutant. Assume
that the samples are independent with regard to the
presence of the pollutant. Consider the next 18 samples analyzed.
a/ Find the probability that exactly 2 contain the
pollutant.
b/ Determine the probability that at least four
samples contain the pollutant.
c/ Determine the probability that 3 ≤ X < 7.
3.6 Binomial Distribution
Solution:
Let X denote the number of samples that contain the
pollutant in the next 18 samples analyzed.
Then X is a binomial random variable with p = 0.1
and n = 18.
a/ P(X = 2) = C(18, 2)(0.1)²(0.9)¹⁶ ≈ 0.284
3.6 Binomial Distribution
Solution:
b/ The requested probability is
P(X ≥ 4) = Σ_{x=4}^{18} C(18, x)(0.1)^x (0.9)^(18−x)
         = C(18, 4)(0.1)⁴(0.9)¹⁴ + C(18, 5)(0.1)⁵(0.9)¹³
           + ... + C(18, 18)(0.1)¹⁸(0.9)⁰
3.6 Binomial Distribution
However, it is easier to use the complementary event,
P(X ≥ 4) = 1 − P(X < 4) = 1 − Σ_{x=0}^{3} C(18, x)(0.1)^x (0.9)^(18−x)
         = 1 − [C(18, 0)(0.1)⁰(0.9)¹⁸ + ... + C(18, 3)(0.1)³(0.9)¹⁵]
         = 1 − [0.150 + 0.300 + 0.284 + 0.168] = 0.098
3.6 Binomial Distribution
Solution:
c/ P(3 ≤ X < 7) = Σ_{x=3}^{6} C(18, x)(0.1)^x (0.9)^(18−x)
               = 0.168 + 0.070 + 0.022 + 0.005
               = 0.265
3.6 Binomial Distribution
Practical Interpretation:
Binomial random variables are used to model many
physical systems and probabilities for all such
models can be obtained from the binomial
probability mass function.
3.6 Binomial Distribution
Example 3: The phone lines to an airline reservation
system are occupied 40% of the time. Assume that
the events that the lines are occupied on successive
calls are independent. Assume that 10 calls are
placed to the airline.
(a) What is the probability that for exactly
three calls, the lines are occupied ?
(b) What is the probability that for at least one call, the
lines are not occupied ?
(c) What is the expected number of calls in which the
lines are all occupied?
3.6 Binomial Distribution
Solution:
(a) 0.215 (b) 1 − (0.4)¹⁰ ≈ 0.9999 (c) E(X) = np = 10(0.4) = 4
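These answers can be checked with the binomial formula directly; the sketch below (illustrative, standard library only) uses n = 10 and p = 0.4, where X counts the calls that find the lines occupied.

```python
from math import comb

n, p = 10, 0.4  # calls placed; probability a call finds the lines occupied

# (a) exactly three of the ten calls find the lines occupied
p_a = comb(n, 3) * p**3 * (1 - p)**7
# (b) at least one call finds the lines not occupied = 1 - P(all ten occupied)
p_b = 1 - p**n
# (c) expected number of calls that find the lines occupied
mean = n * p

print(round(p_a, 3), round(p_b, 4), mean)  # 0.215, 0.9999, 4.0
```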
3.7 Geometric and Negative Binomial
Distributions
3.7.1 Geometric distribution
Definition:
In a series of Bernoulli trials (independent trials
with constant probability p of a success), let the
random variable X denote the number of trials until
the first success. Then X is a geometric random
variable with parameter 0  p and 1
f  x  P  X  x  1  p 
x 1
p, x 1, 2,...
3.7 Geometric and Negative Binomial
Distributions
Mean and Variance: If X is a geometric random variable
with parameter p, then
μ = E(X) = 1/p
σ² = V(X) = (1 − p)/p²
3.7 Geometric and Negative Binomial
Distributions
Example:
The probability that a wafer contains a large
particle of contamination is 0.01. If it is assumed
that the wafers are independent, what is the
probability that exactly 125 wafers need to be
analyzed before a large particle is detected ?
3.7 Geometric and Negative Binomial Distributions

Solution:
Let X denote the number of samples analyzed until
a large particle is detected.
X is a geometric random variable with p = 0.01.
The probability that exactly 125 wafers need to be
analyzed before a large particle is detected is
P(X = 125) = (0.99)¹²⁴(0.01) ≈ 0.0029
3.7 Geometric and Negative Binomial
Distributions
3.7.2 Negative Binomial Distribution
Definition:
A generalization of a geometric distribution in
which the random variable is the number of
Bernoulli trials required to obtain r successes
results in the negative binomial distribution.
3.7 Geometric and Negative Binomial
Distributions
3.7.2 Negative Binomial Distribution
Definition: In a series of Bernoulli trials
(independent trials with constant probability p of
a success), let the random variable X denote the
number of trials until r successes occur. Then X is a
negative binomial random variable with parameters
0 < p < 1 and r = 1, 2, 3, ..., and
f(x) = P(X = x) = C(x − 1, r − 1)(1 − p)^(x−r) p^r,
x = r, r + 1, r + 2, ...
3.7 Geometric and Negative Binomial
Distributions
3.7.2 Negative Binomial Distribution
If X is a negative binomial random variable with
parameters p and r, then
μ = E(X) = r/p
σ² = V(X) = r(1 − p)/p²
3.7 Geometric and Negative Binomial
Distributions

Example 1:
The probability that a camera passes the test is 0.8,
and the cameras perform independently.
a/ What is the probability that the third failure is
obtained in six tests?
b/ What is the probability that the third failure is
obtained in fewer than five tests?
3.7 Geometric and Negative Binomial
Distributions
Solution:
Let X be the number of cameras tested until the third
failure. Each camera fails the test with probability 0.2,
so X is a negative binomial random variable with r = 3
and p = 0.2.
a/ P(X = 6) = C(5, 2)(0.2)³(0.8)³ = 0.04096
b/ P(X < 5) = P(X = 3) + P(X = 4)
            = (0.2)³ + C(3, 2)(0.2)³(0.8)
            = 0.0272
3.7 Geometric and Negative Binomial
Distributions
Example 2:
The probability of a successful optical alignment
in the assembly of an optical data storage product
is 0.8. Assume that the trials are independent.
a/ What is the probability that the first successful
alignment requires exactly four trials ?
b/ What is the probability that the first successful
alignment requires at most four trials ?
c/ What is the probability that the first successful
alignment requires at least four trials ?
3.7 Geometric and Negative Binomial
Distributions
Solution:
a/ P  X 4  0.0064

b/ P  X 4  0.9984

c/ P  X 4  0.008
3.8 Hypergeometric Distribution
Definition: A set of N objects contains K objects
classified as successes and N − K objects classified as
failures.
A sample of size n objects is selected randomly
(without replacement) from the N objects, where
K ≤ N and n ≤ N.
The random variable X that equals the number
of successes in the sample is a hypergeometric
random variable and
f(x) = P(X = x) = C(K, x) C(N − K, n − x) / C(N, n),
x = max(0, n + K − N) to min(K, n)
3.8 Hypergeometric Distribution
Mean and Variance
If X is a hypergeometric random variable with parameters
N, K, and n, then
μ = E(X) = np
σ² = V(X) = np(1 − p)[(N − n)/(N − 1)]
where p = K/N.
3.8 Hypergeometric Distribution

Example: A batch of parts contains 100 from a
local supplier of tubing and 200 from a supplier
of tubing in the next state. Four parts are
selected randomly, without replacement.
a/ What is the probability they are all from the
local supplier ?
b/ What is the probability that two or more parts
in the sample are from the local supplier ?
c/ What is the probability that at least one part in
the sample is from the local supplier ?
3.8 Hypergeometric Distribution
Solution:
Let X equal the number of parts in the sample from
the local supplier. Then X is a hypergeometric random
variable with N = 300, K = 100, and n = 4.
a/ P(X = 4) = C(100, 4)C(200, 0)/C(300, 4) ≈ 0.0119
b/ P(X ≥ 2) = P(X = 2) + P(X = 3) + P(X = 4) ≈ 0.407
c/ P(X ≥ 1) = 1 − P(X = 0) ≈ 0.804
3.8 Hypergeometric Distribution
Finite Population Correction Factor
The term
(N − n)/(N − 1)
in the variance of a hypergeometric random variable
is called the finite population correction factor.
3.8 Hypergeometric Distribution

Figure: Comparison of hypergeometric and binomial distributions.
3.9 Poisson Distribution
Definition:
The random variable X that equals the number of
events in a Poisson process is a Poisson random
variable with parameter λ > 0, and
f(x) = P(X = x) = e^(−λ) λ^x / x!,  x = 0, 1, 2, ...

λ is the mean number of events in the interval considered


3.9 Poisson Distribution
Mean and Variance
If X is a Poisson random variable with parameter λ, then
μ = E(X) = λ and σ² = V(X) = λ
3.9 Poisson Distribution
Example 1:
For the case of the thin copper wire, suppose that
the number of flaws follows a Poisson distribution
with a mean of 2.3 flaws per millimeter.
a/ Determine the probability of exactly two flaws
in 1 millimeter of wire.
b/ Determine the probability of 10 flaws in 5
millimeters of wire.
c/ Determine the probability of at least one flaw in
2 millimeters of wire.
3.9 Poisson Distribution
Solution
Let X equal the number of flaws in the length of wire
considered.
a/ λ = 2.3; P(X = 2) = e^(−2.3)(2.3)²/2! ≈ 0.265
b/ For 5 millimeters, λ = 5(2.3) = 11.5;
   P(X = 10) = e^(−11.5)(11.5)¹⁰/10! ≈ 0.113
c/ For 2 millimeters, λ = 2(2.3) = 4.6;
   P(X ≥ 1) = 1 − P(X = 0) = 1 − e^(−4.6)(4.6)⁰/0! ≈ 0.9899
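These three answers can be verified with scipy.stats.poisson by rescaling the mean to the length of wire in question; the sketch below is illustrative and assumes SciPy is installed.

```python
from scipy.stats import poisson

rate = 2.3  # mean flaws per millimeter of wire

print(poisson.pmf(2, rate))          # a/ P(X = 2) in 1 mm  ~ 0.265
print(poisson.pmf(10, 5 * rate))     # b/ P(X = 10) in 5 mm ~ 0.113
print(1 - poisson.pmf(0, 2 * rate))  # c/ P(X >= 1) in 2 mm ~ 0.9899
```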
3.9 Poisson Distribution

Example 2
The number of telephone calls that arrive at a
phone exchange is often modeled as a Poisson
random variable. Assume that on the average
there are 10 calls per hour.
a/ What is the probability that there are exactly 5
calls in one hour ?
b/ What is the probability that there are 3 or
fewer calls in one hour ?
3.9 Poisson Distribution
Example 2
c/ What is the probability that there are exactly 15
calls in two hours ?
d/ What is the probability that there are exactly 5
calls in 30 minutes ?
3.9 Poisson Distribution
Solution:
a/  10, P  X 5  0.0378
b/  10, P  X 3 P  X 0   P  X 1
 P  X 2   P  X 3 0.01034

c/  20, P  X 15  0.0516

d/  5, P  X 5  0.175
