Chapter 6 - Random Variables and Probability Distributions

The document introduces key concepts related to random variables and probability distributions. It defines random variables as functions that map sample points to real numbers. Random variables can be discrete or continuous depending on whether their sample spaces are countable or uncountable. Probability distributions describe the probabilities associated with all possible values of a random variable. The probability mass function (PMF) defines a discrete probability distribution, while the probability density function (PDF) defines a continuous probability distribution. Expectations and common distributions are also introduced.



Random Variables
and Probability
Distributions
Stat 101 Bridging Program Module 2
Objectives
At the end of this module, the student will be able to:
• define the different terms associated with random variables
• explain what random variables are
• differentiate discrete vs continuous random variables
• construct the pmf of a discrete random variable
Objectives
At the end of this module, the student will be able to:
• compute for probabilities from the probability distribution of a random variable
• compute expectations (mean, variance)
• understand the concepts behind distributions
• identify possible distributions present in real life
Before we continue:
Recall the definition of a function: a function is a mapping from one set to another.
An example of a function is 𝑓(𝑥) = 𝑥 + 2, which relates values of 𝑥 ∈ ℝ (a real number) to 𝑓(𝑥) ∈ ℝ (another real number). Here, 𝑓 relates the set of reals to the set of reals.
Before we continue:
Consider the random experiment of flipping
a fair coin twice.
Ω = {𝐻𝐻, 𝐻𝑇, 𝑇𝐻, 𝑇𝑇}

Let X = number of heads observed.


Before we continue:
Consider the random experiment of selecting 2 out of 5 electric components. From the five electric components, it is known that 2 are defective, say D1 and D2, and 3 are non-defective, say N1, N2, and N3.

Ω = {{D1, D2}, {D1, N1}, {D1, N2}, {D1, N3}, {D2, N1}, {D2, N2}, {D2, N3}, {N1, N2}, {N1, N3}, {N2, N3}}

Let Y = number of defectives in the sample.

Random Variables
Random Variable
A random variable is a function that maps each sample point to a real number.

Note: We will be using an uppercase letter to denote a random variable.
Example
A fair coin is tossed twice successively. We use 𝑋 to denote the random variable, where 𝑋 is the number of heads in the outcome of the random experiment. Each of the 4 sample points is mapped to a value of 𝑋:

Sample point   𝑋
HH             2
HT             1
TH             1
TT             0
Example
Consider the random experiment of selecting 2 out of 5 electric components, of which 2 are defective (D1, D2) and 3 are non-defective (N1, N2, N3).

Ω = {{D1, D2}, {D1, N1}, {D1, N2}, {D1, N3}, {D2, N1}, {D2, N2}, {D2, N3}, {N1, N2}, {N1, N3}, {N2, N3}}

Let Y = number of defectives in the sample.
Example
Let Y = number of defectives in the sample.

Sample point   𝑌      Sample point   𝑌
{D1, D2}       2       {D1, N3}       1
{D1, N1}       1       {D2, N3}       1
{D2, N1}       1       {N1, N2}       0
{D1, N2}       1       {N1, N3}       0
{D2, N2}       1       {N2, N3}       0
Events in terms of Random Variables
The concept of a random variable will provide us with a new way of expressing events.

𝑿 ≤ 𝒂 : event containing all sample points whose associated value for the random variable X is less than or equal to 𝒂, where 𝒂 is a specified real number
Events in terms of Random Variables
𝑿 > 𝒂 : event containing all sample points whose associated value for the random variable X is greater than 𝒂, where 𝒂 is a specified real number

𝒂 < 𝑿 < 𝒃 : event containing all sample points whose associated value for the random variable X is between a and b, where a and b are specified real numbers
Example
Consider the random experiment of flipping a fair coin twice. Ω = {HH, HT, TH, TT}

Let X = number of heads observed.

A = event that 2 heads were observed
B = event that 1 head was observed
C = event that no head was observed
D = event that at least 1 head was observed
E = event that at most 1 head was observed
Example
Consider the random experiment of flipping a fair coin twice. Ω = {HH, HT, TH, TT}

Let X = number of heads observed.

A = event that 2 heads were observed ⇒ 𝑋 = 2
B = event that 1 head was observed ⇒ 𝑋 = 1
C = event that no head was observed ⇒ 𝑋 = 0
D = event that at least 1 head was observed ⇒ 𝑋 ≥ 1
E = event that at most 1 head was observed ⇒ 𝑋 ≤ 1
Example
Consider the random experiment of selecting 2 out of 5 electric components, of which 2 are defective (D1, D2) and 3 are non-defective (N1, N2, N3).

Ω = {{D1, D2}, {D1, N1}, {D1, N2}, {D1, N3}, {D2, N1}, {D2, N2}, {D2, N3}, {N1, N2}, {N1, N3}, {N2, N3}}

Let Y = number of defectives in the sample.
Types of Random Variables
Discrete vs Continuous

DISCRETE:
• countable
• discrete points
• no values can exist between two neighboring categories

CONTINUOUS:
• uncountable
• continuous intervals
• has an infinite number of possible values between any two observed values
Discrete Random
Variable
Discrete Random Variable
• If a sample space contains a finite number
of possibilities or an unending sequence
with as many elements as there are whole
numbers, it is called a discrete sample
space.

• A random variable defined over a discrete


sample space is called a discrete random
variable.
Examples of
Discrete Sample Spaces
Experiment Sample Space
• tossing a coin • {H,T}
• rolling a die • {1,2,…,6}
• tossing a coin twice • {(H,H),(H,T),(T,H),(T,T)}
• sum of 2 dice rolls • {2,…,12}
• tossing a coin until a • {H, TH,TTH,TTTH,…}
head comes up
Basically, any set that is countable!
Continuous
Random Variable
Continuous Random
Variable
• If a sample space contains an infinite
number of possibilities equal to the number
of points on a line segment, it is called a
continuous sample space.

• A random variable defined over a


continuous sample space of possible values
is called a continuous random variable.
Examples of Continuous Sample Spaces
Experiment                          Sample Space
• length of life of a light bulb    • {𝑥 | 𝑥 ∈ (0, ∞)}
• monthly company net income        • {𝑥 | 𝑥 ∈ (−∞, ∞)}
• time spent studying in UP         • {𝑥 | 𝑥 ∈ [0, ∞)}
Basically, any set that is uncountable (many are in the form of intervals)!
Probability
Distributions
Cumulative Distribution Function
The cumulative distribution function (cdf) of a random variable X, denoted by F(⦁), is a function defined for any real number 𝑥 as
F(𝒙) = 𝑷(𝑿 ≤ 𝒙)
Remarks
• We can use the PMF to derive the CDF of
the discrete random variable.
• We can use the PDF to derive the CDF of
the continuous random variable.
However, this involves integration and is
not part of the coverage of Stat 101.
Discrete
Probability
Distributions
Discrete Probability
Distribution
A table or formula listing all possible values
that a discrete random variable can take on,
along with the associated probabilities, is
called a discrete probability distribution.

NOTE: The probabilities associated with all


possible values of a discrete random
variable must sum to 1.
Probability Mass Function
The probability mass function (pmf) of a discrete random variable, denoted by p(⦁), is a function defined for any real number 𝑥 as
𝒑(𝒙) = 𝑷(𝑿 = 𝒙)

Mass points: values of 𝑋 for which 𝑝(𝑥) > 0


Example
Consider the random experiment of tossing
a fair coin twice. Define 𝑋 = number of heads
observed.

a) construct the PMF of 𝑋


b) use the PMF to compute for the
probability that
i) at least two tosses result in heads
ii) at least one toss results in heads
Answer
Ω = {HH, HT, TH, TT}
𝑋 = no. of heads observed, 𝑥 ∈ {0, 1, 2}

a) construct the PMF of 𝑋
P(𝑋 = 0) = 1/4, P(𝑋 = 1) = 1/2, P(𝑋 = 2) = 1/4

𝑥          0     1     2
𝑃(𝑋 = 𝑥)   1/4   1/2   1/4
Answer
b) use the PMF to compute for the probability that
i) at least two tosses result in heads
P(𝑋 ≥ 2) = P(𝑋 = 2) = 1/4
ii) at least one toss results in heads
P(𝑋 ≥ 1) = P(𝑋 = 1) + P(𝑋 = 2) = 1/2 + 1/4 = 3/4
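To see how such a PMF can be checked mechanically, here is a small Python sketch (not part of the original module; the variable names are illustrative only) that enumerates the sample space:

```python
# Sketch: build the PMF of X = number of heads in two tosses by enumeration.
from itertools import product
from fractions import Fraction
from collections import Counter

outcomes = list(product("HT", repeat=2))             # Ω = {HH, HT, TH, TT}
counts = Counter(o.count("H") for o in outcomes)     # tally each value of X
pmf = {x: Fraction(c, len(outcomes)) for x, c in counts.items()}

print(pmf)                                           # p(0) = 1/4, p(1) = 1/2, p(2) = 1/4
print(sum(p for x, p in pmf.items() if x >= 2))      # P(X ≥ 2) = 1/4
print(sum(p for x, p in pmf.items() if x >= 1))      # P(X ≥ 1) = 3/4
```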
Example
Consider the random experiment of tossing
a fair die twice. Define 𝑌 = number of tosses
resulting in a number greater than 4.
a) construct the PMF of 𝑌
b) use the PMF to compute for:
i) probability that none of the tosses
results in a number greater than 4
ii) probability that at most one of the
tosses result in a number greater than 4.
Answer
Ω = {(u, v) | u, v ∈ {1, 2, 3, 4, 5, 6}}
𝑌 = no. of tosses resulting in a number > 4
𝑦 ∈ {0, 1, 2}

a) construct the PMF of 𝑌
P(𝑌 = 0) = 16/36, P(𝑌 = 1) = 16/36, P(𝑌 = 2) = 4/36

𝑦      0       1       2
𝑝(𝑦)   16/36   16/36   4/36
Answer
b) use the PMF to compute for
i) probability that none of the tosses results in a number greater than 4
P(𝑌 = 0) = 16/36

ii) probability that at most one of the tosses results in a number greater than 4
P(𝑌 ≤ 1) = P(𝑌 = 0) + P(𝑌 = 1) = 16/36 + 16/36 = 32/36
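The same enumeration idea applies here; a sketch (illustrative, not from the module) over the 36 equally likely ordered pairs:

```python
# Sketch: PMF of Y = number of tosses (out of 2) showing a value greater than 4.
from itertools import product
from fractions import Fraction
from collections import Counter

rolls = list(product(range(1, 7), repeat=2))                 # 36 ordered pairs (u, v)
counts = Counter(sum(v > 4 for v in pair) for pair in rolls)
pmf = {y: Fraction(c, len(rolls)) for y, c in counts.items()}

print(pmf)              # p(0) = 16/36, p(1) = 16/36, p(2) = 4/36, in lowest terms
print(pmf[0])           # P(Y = 0) = 16/36 = 4/9
print(pmf[0] + pmf[1])  # P(Y ≤ 1) = 32/36 = 8/9
```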
Remarks
• The PMF of a discrete random variable X
is usually presented in tabular form
whenever X has only a few mass points.
• The sum of 𝑝(𝑥) for all the mass points of
X is 1.
Continuous
Probability
Distributions
Probability Density Function
The probability density function (pdf) of a continuous random variable 𝑋, denoted by 𝑓(⦁), is a function that is defined for any real number 𝑥 and satisfies the following:
a) 𝑓(𝑥) ≥ 0 for all 𝑥, i.e., the values of a pdf are non-negative;
b) the total area under its curve and above the horizontal axis is equal to 1; and
c) P(𝑎 ≤ 𝑋 ≤ 𝑏) is the area bounded by the curve, the x-axis, and the lines x = a and x = b.
Example
A random variable which may take on any value in the interval (0, 1) with equal probability has distribution:
𝑓(𝑥) = 1 for 0 ≤ 𝑥 ≤ 1, and 𝑓(𝑥) = 0 elsewhere
Verify that this is a valid probability density function.
Using the CDF gives area to the left
Oftentimes, it is easier to use the CDF instead of the PDF in computing for probabilities. For a continuous random variable:
• P(X ≤ 𝑎) = P(X < 𝑎) = F(𝑎)   (area to the left of 𝑎)
• P(X > 𝑎) = P(X ≥ 𝑎) = 1 − F(𝑎)   (area to the right of 𝑎)
• P(𝑎 < X < 𝑏) = P(𝑎 ≤ X ≤ 𝑏) = P(𝑎 < X ≤ 𝑏) = P(𝑎 ≤ X < 𝑏) = F(𝑏) − F(𝑎)
Example
The CDF of a continuous random variable X is as follows:
F(𝑥) = 0 for 𝑥 < 0;  F(𝑥) = 𝑥³ for 0 ≤ 𝑥 < 1;  F(𝑥) = 1 for 𝑥 ≥ 1
Find the following probabilities using the CDF:
a) P(X > 0.25)
b) P(0.3 < X < 0.7)
c) P(0.4 ≤ X < 1.25)
Answer
F(𝑥) = 0 for 𝑥 < 0;  F(𝑥) = 𝑥³ for 0 ≤ 𝑥 < 1;  F(𝑥) = 1 for 𝑥 ≥ 1
a) P(X > 0.25) = 1 − F(0.25) = 1 − (0.25)³ = 63/64 ≈ 0.9844
b) P(0.3 < X < 0.7) = F(0.7) − F(0.3) = (0.7)³ − (0.3)³ = 0.316
c) P(0.4 ≤ X < 1.25) = F(1.25) − F(0.4) = 1 − (0.4)³ = 0.936
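As a quick check, the CDF can be coded directly and each probability read off as a difference; a minimal sketch in plain Python (illustrative only):

```python
def F(x):
    # CDF from the example: 0 for x < 0, x**3 on [0, 1), and 1 for x >= 1
    if x < 0:
        return 0.0
    if x < 1:
        return x ** 3
    return 1.0

print(1 - F(0.25))       # P(X > 0.25)       = 63/64 ≈ 0.9844
print(F(0.7) - F(0.3))   # P(0.3 < X < 0.7)  = 0.316
print(F(1.25) - F(0.4))  # P(0.4 ≤ X < 1.25) = 0.936
```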
Discrete vs Continuous
DISCRETE:
• countable
• discrete points
• 𝑝(𝑥) is the probability mass function (pmf)
• 𝑝(𝑥) ≥ 0
• Σ 𝑝(𝑥) = 1

CONTINUOUS:
• uncountable
• continuous intervals
• 𝑓(𝑥) is the probability density function (pdf)
• 𝑓(𝑥) ≥ 0
• total area under the curve is 1
Expectation and
Variance
The Concept of Expectation
Intuitively, the expected value of a random variable is the return you can expect for some kind of action!

For example, when answering a 10-item true or false quiz through sheer guesswork, we can expect that there will be around 5 correct answers!
Expected Value of X (Discrete RV)
Let 𝑋 be a discrete random variable with probability mass function

𝒙                 𝑥₁      𝑥₂      𝑥₃      …   𝑥ₙ
𝒑(𝒙) = 𝑷(𝑿 = 𝒙)   𝑝(𝑥₁)   𝑝(𝑥₂)   𝑝(𝑥₃)   …   𝑝(𝑥ₙ)

The expected value of 𝑋, also referred to as the mean of 𝑋, is
𝐸(𝑋) = 𝜇 = 𝑥₁𝑝(𝑥₁) + 𝑥₂𝑝(𝑥₂) + ⋯ + 𝑥ₙ𝑝(𝑥ₙ) = Σᵢ₌₁ⁿ 𝑥ᵢ 𝑝(𝑥ᵢ)
Example
Find the expected number of heads (𝑋) in the experiment where you toss a coin twice.

𝑥          0     1     2
𝑃(𝑋 = 𝑥)   1/4   1/2   1/4
Answer
𝐸(𝑋) = 0(1/4) + 1(2/4) + 2(1/4) = 2/4 + 2/4 = 1

When tossing a fair coin twice, the expected number of heads is 1.
Expected Value of g(X)
Let 𝑋 be a discrete random variable with probability mass function 𝑝(𝑥) and 𝑛 mass points. Suppose 𝑌 = 𝑔(𝑋) is a discrete random variable; then the expected value of 𝑔(𝑋) is
𝐸[𝑔(𝑋)] = Σᵢ₌₁ⁿ 𝑔(𝑥ᵢ) 𝑝(𝑥ᵢ)
Example
The PMF of a discrete random variable X is as follows:

𝒙ᵢ       −1     0      1      2
𝒑(𝒙ᵢ)    1/10   2/10   5/10   2/10

Use the PMF to evaluate the following:
a) mean of X
b) E(X²)
c) E(X³)
Answer
a) mean of X = E(X) = (−1)(0.10) + (0)(0.20) + (1)(0.50) + (2)(0.20) = 0.8
b) E(X²) = (−1)²(0.10) + (0)²(0.20) + (1)²(0.50) + (2)²(0.20) = 1.4
c) E(X³) = (−1)³(0.10) + (0)³(0.20) + (1)³(0.50) + (2)³(0.20) = 2
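These three expectations are probability-weighted sums, which is easy to verify with a short sketch (illustrative only, not part of the module):

```python
# Sketch: E(X), E(X²), E(X³) for the tabulated PMF.
xs = [-1, 0, 1, 2]
ps = [0.1, 0.2, 0.5, 0.2]

mean = sum(x * p for x, p in zip(xs, ps))       # E(X)  = 0.8
ex2  = sum(x**2 * p for x, p in zip(xs, ps))    # E(X²) = 1.4
ex3  = sum(x**3 * p for x, p in zip(xs, ps))    # E(X³) = 2.0
print(mean, ex2, ex3)
```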
Example
A used car dealer finds that on any day, the probability of selling no car is 0.4, one car is 0.2, two cars is 0.15, three cars is 0.10, four cars is 0.08, five cars is 0.06, and six cars is 0.01. Let X = number of cars sold and Y = 500 + 1500X represent the salesman's daily earnings. Find the salesman's expected daily earnings.
𝑥      0     1     2     3     4     5     6
𝑦      500   2000  3500  5000  6500  8000  9500
𝑝(𝑥)   0.4   0.2   0.15  0.10  0.08  0.06  0.01
Answer
𝐸(𝑌) = 𝐸[𝑔(𝑋)] = 𝐸(500 + 1500𝑋)
= (500 + 1500·0)(0.4) + (500 + 1500·1)(0.2) + (500 + 1500·2)(0.15) + (500 + 1500·3)(0.10)
  + (500 + 1500·4)(0.08) + (500 + 1500·5)(0.06) + (500 + 1500·6)(0.01)
= P2,720
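A short sketch (names are illustrative) confirming the answer two ways: directly as E[g(X)], and via linearity as 500 + 1500·E(X).

```python
xs = [0, 1, 2, 3, 4, 5, 6]
ps = [0.40, 0.20, 0.15, 0.10, 0.08, 0.06, 0.01]

e_x = sum(x * p for x, p in zip(xs, ps))                       # E(X) = 1.48 cars per day
e_y_direct = sum((500 + 1500 * x) * p for x, p in zip(xs, ps)) # E[g(X)] computed directly
e_y_linear = 500 + 1500 * e_x                                  # linearity: E(aX + b) = aE(X) + b
print(e_y_direct, e_y_linear)                                  # both equal 2720
```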
Variance
Let 𝑋 be a random variable with mean 𝜇; then the variance of 𝑋 is
𝜎² = 𝑉𝑎𝑟(𝑋) = 𝐸[(𝑋 − 𝜇)²]

with computational formula
𝑉𝑎𝑟(𝑋) = 𝐸(𝑋²) − [𝐸(𝑋)]²
Variance of X (Discrete RV)
Let 𝑋 be a discrete random variable with probability mass function

𝒙                 𝑥₁      𝑥₂      𝑥₃      …   𝑥ₙ
𝒑(𝒙) = 𝑷(𝑿 = 𝒙)   𝑝(𝑥₁)   𝑝(𝑥₂)   𝑝(𝑥₃)   …   𝑝(𝑥ₙ)

The variance of 𝑋 is
𝜎² = 𝑉𝑎𝑟(𝑋) = 𝐸[(𝑋 − 𝜇)²] = Σᵢ₌₁ⁿ (𝑥ᵢ − 𝜇)² 𝑝(𝑥ᵢ)
Standard Deviation
The standard deviation of X is the positive
square root of the variance.

As discussed before, the variance of X is a


measure of dispersion.
Example
Find the variance of 𝑋 in the experiment where you toss a coin twice, where 𝑋 = no. of heads observed.

𝑥          0     1     2
𝑃(𝑋 = 𝑥)   1/4   1/2   1/4
Answer
𝑉𝑎𝑟(𝑋) = Σᵢ₌₁³ (𝑥ᵢ − 1)² 𝑝(𝑥ᵢ)
= (0 − 1)²(1/4) + (1 − 1)²(2/4) + (2 − 1)²(1/4)
= 1/4 + 0 + 1/4 = 1/2
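Both the defining formula and the computational formula give the same value; here is a quick sketch using exact fractions (illustrative, not part of the module):

```python
from fractions import Fraction

xs = [0, 1, 2]
ps = [Fraction(1, 4), Fraction(1, 2), Fraction(1, 4)]

mu      = sum(x * p for x, p in zip(xs, ps))               # E(X) = 1
var_def = sum((x - mu) ** 2 * p for x, p in zip(xs, ps))   # definition: E[(X − μ)²]
var_alt = sum(x**2 * p for x, p in zip(xs, ps)) - mu**2    # E(X²) − [E(X)]²
print(mu, var_def, var_alt)                                # 1, 1/2, 1/2
```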
Properties of the Mean and Variance
• 𝐸(𝑋 − 𝜇) = 0
• 𝐸(𝑎𝑋 + 𝑏) = 𝑎𝐸(𝑋) + 𝑏
• 𝐸(𝑋 + 𝑌) = 𝐸(𝑋) + 𝐸(𝑌)
• 𝐸(𝑋 − 𝑌) = 𝐸(𝑋) − 𝐸(𝑌)
Properties of the Mean and Variance
• 𝑉𝑎𝑟(𝑋) = 𝐸(𝑋²) − [𝐸(𝑋)]²
• 𝑉𝑎𝑟(𝑎𝑋 + 𝑏) = 𝑎²𝑉𝑎𝑟(𝑋)
• If 𝑋 and 𝑌 are independent, then
  • 𝑉𝑎𝑟(𝑋 + 𝑌) = 𝑉𝑎𝑟(𝑋) + 𝑉𝑎𝑟(𝑌)
  • 𝑉𝑎𝑟(𝑋 − 𝑌) = 𝑉𝑎𝑟(𝑋) + 𝑉𝑎𝑟(𝑌)
Example
Suppose X is a random variable with E(X²) = 10 and E(X) = 2. Define Y = 2X + 5 and Z = 2X − 5. Find the following.
a) Var (X)
b) E(Y) and Var (Y)
c) E(Z) and Var (Z)
Answer
E(X²) = 10 and E(X) = 2; Y = 2X + 5 and Z = 2X − 5.
a) Var(X) = E(X²) − [E(X)]² = 10 − 2² = 6
b) E(Y) = E(2X + 5) = 2E(X) + 5 = 2(2) + 5 = 9
   Var(Y) = Var(2X + 5) = 2²Var(X) = 4(6) = 24
c) E(Z) = E(2X − 5) = 2E(X) − 5 = 2(2) − 5 = −1
   Var(Z) = Var(2X − 5) = 2²Var(X) = 4(6) = 24
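These properties can also be sanity-checked by simulation. The sketch below assumes, purely for illustration (the module does not state this), that X is normal with mean 2 and variance 6, which matches E(X) = 2 and E(X²) = 10; any distribution with those moments would do.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=2, scale=np.sqrt(6), size=1_000_000)  # E(X) = 2, Var(X) = 6

y = 2 * x + 5
z = 2 * x - 5
print(y.mean(), y.var())   # ≈ 9 and ≈ 24, matching E(Y) and Var(Y)
print(z.mean(), z.var())   # ≈ −1 and ≈ 24, matching E(Z) and Var(Z)
```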
Some Probability
Distributions
Parameter
A parameter is a constant that determines
the specific form of the probability
distribution. It carries vital information like
the shape of the distribution, the location of
the distribution, and other characterizations
Binomial
Distribution
Binomial Experiment
A binomial experiment is one that possesses
the following properties:
• The experiment consists of 𝑛 identical trials.
• Each trial results in one of two outcomes, a “success” or a “failure” (called a Bernoulli trial).
• The probability of success on a single trial is
equal to 𝑝 and remains the same from trial to
trial. The probability of failure is equal to 𝑞 =
1 − 𝑝.
• The trials are independent.
Binomial Distribution
This distribution describes how frequently a “yes” appears in a “yes”/“no” (one or the other) experiment repeated a fixed number of times.
Binomial Random Variable
The random variable involved in a binomial
experiment is called a binomial random
variable, which counts the number of
“successes” out of 𝑛 trials.
Example
Binomial experiment: a student randomly
guesses a multiple choice exam consisting
of 10 items.
Binomial random variable: number of
correct answers out of 10 items
Another Binomial random variable: number
of incorrect answers out of 10 items
Binomial Distribution
Let 𝑋 be the number of successes observed in 𝑛 trials. From this, we can tell that 𝑋 is a discrete random variable. 𝑋 is said to follow a binomial distribution if its pmf is given by:
𝑝(𝑥) = C(𝑛, 𝑥) 𝑝^𝑥 (1 − 𝑝)^(𝑛−𝑥),  𝑥 = 0, 1, 2, …, 𝑛
where C(𝑛, 𝑥) = 𝑛!/[𝑥!(𝑛 − 𝑥)!], 𝑛 and 𝑝 are the parameters of the distribution, 𝑝 is any value between 0 and 1, and 𝑛 is any positive integer.
Binomial Distribution
We can simply write 𝑋 ~ Bi(𝑛, 𝑝).

𝐸(𝑋) = 𝑛𝑝
𝑉𝑎𝑟(𝑋) = 𝑛𝑝(1 − 𝑝) = 𝑛𝑝𝑞
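For concreteness, here is a hedged sketch using scipy.stats (an external library, not referenced in the module) alongside the pmf formula; the choice n = 10, p = 0.25 is only an assumed illustration (e.g., guessing a 10-item exam with four choices per item).

```python
from math import comb
from scipy.stats import binom

n, p = 10, 0.25
print(binom.pmf(3, n, p))                     # P(X = 3) from the library
print(comb(n, 3) * p**3 * (1 - p)**(n - 3))   # same value from the pmf formula
print(binom.mean(n, p), binom.var(n, p))      # E(X) = np = 2.5, Var(X) = np(1 − p) = 1.875
```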
Example
Find the probability of obtaining (1) exactly
one head when you toss a coin twice, (2) no
head when you toss a coin twice, and (3) at
least one head when you toss a coin twice.
Example
Find the probability of obtaining exactly
three 2’s if an ordinary die is tossed
• Five times
• Ten times
Example
• How many times do we expect to get 2's if we toss a die five times? Ten times?
• What is the variance?
Normal Distribution
Normal Distribution
A continuous random variable 𝑋 is said to be normally distributed or follows the normal distribution if its PDF is
𝑓(𝑥) = [1/(𝜎√(2𝜋))] 𝑒^(−(𝑥 − 𝜇)²/(2𝜎²))
If 𝑋 follows the normal distribution, we could simply write this down as 𝑋 ~ 𝑁(𝜇, 𝜎²).
Normal Distribution
• If 𝑋 ~ 𝑁(𝜇, 𝜎²), then E(𝑋) = 𝜇 and 𝑉𝑎𝑟(𝑋) = 𝜎².
• The graph of the normal distribution is called the normal curve / bell curve.
• A normal distribution is an arrangement of a dataset where most values cluster in the middle and the rest taper off symmetrically towards either extreme.
• The normal distribution has two parameters: its mean and variance!
Properties
• The normal curve is bell-shaped and
symmetric about its mean 𝜇.
• The normal curve approaches the
horizontal axis asymptotically as we
proceed in either direction away from the
mean.
• The total area under the curve and above
the x-axis is equal to 1.
[Figure: normal distributions with different means and variances]
Standard Normal Distribution
• If the normal random variable has 𝜇 = 0 and 𝜎² = 1, it is called a standard normal random variable, and is denoted by 𝑍.
• We can simply write this down as 𝑍 ~ 𝑁(0, 1).
• To find the probability that 𝑍 is less than or equal to some value 𝑧, we can use what we call a standard normal table or z-table.
Example
Using the standard normal table, find the
following:
• P(𝑍 ≤ 0)
• P(−1 ≤ 𝑍 ≤ 1)
• P(𝑍 < 1.77)
• P(𝑍 > 1.77)
Answer
Using the standard normal table, find the following:
• P(𝑍 ≤ 0) = 0.5
• P(−1 ≤ 𝑍 ≤ 1) = P(𝑍 ≤ 1) − P(𝑍 < −1) = 0.8413 − 0.1587 = 0.6826
• P(𝑍 < 1.77) = 0.9616
• P(𝑍 > 1.77) = 1 − P(𝑍 ≤ 1.77) = 1 − 0.9616 = 0.0384
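Instead of reading a printed z-table, the same values can be obtained from the standard normal CDF in software; a sketch using scipy.stats (an external library, not part of the module):

```python
from scipy.stats import norm

print(norm.cdf(0))                  # P(Z ≤ 0)       = 0.5
print(norm.cdf(1) - norm.cdf(-1))   # P(−1 ≤ Z ≤ 1)  ≈ 0.6827
print(norm.cdf(1.77))               # P(Z < 1.77)    ≈ 0.9616
print(1 - norm.cdf(1.77))           # P(Z > 1.77)    ≈ 0.0384
```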
Standard Score of a Normal RV
• If 𝑋 ~ 𝑁(𝜇, 𝜎²), then 𝑋 can be transformed into a standard normal random variable 𝑍 through the following transformation:
𝑍 = (𝑋 − 𝜇) / 𝜎
• This is what we call the z-score.
• Now, the 𝑍 above is already a standard normal random variable.
Standard Score of a Normal RV
• Whenever 𝑋 is between the values 𝑥₁ and 𝑥₂, the random variable 𝑍 will fall between the corresponding values 𝑧₁ and 𝑧₂. Thus,
𝑃(𝑥₁ < 𝑋 < 𝑥₂) = 𝑃(𝑧₁ < 𝑍 < 𝑧₂)
Example
Suppose 𝑋 ~ Normal(𝜇 = 5, 𝜎² = 4). Find:
• P(𝑋 ≤ 0)
• P(3 < 𝑋 ≤ 6)
• P(𝑋 ≥ 6)
Answer
Suppose 𝑋 ~ Normal(𝜇 = 5, 𝜎² = 4). Find:
• P(𝑋 ≤ 0) = P((𝑋 − 𝜇)/𝜎 ≤ (0 − 5)/2) = 𝑃(𝑍 ≤ −2.5) = 0.0062
• P(3 < 𝑋 ≤ 6) = P((3 − 5)/2 < (𝑋 − 𝜇)/𝜎 ≤ (6 − 5)/2)
  = P(𝑍 ≤ 0.5) − P(𝑍 ≤ −1.0)
  = 0.6915 − 0.1587
  = 0.5328
Answer
Suppose 𝑋 ~ Normal(𝜇 = 5, 𝜎² = 4). Find:
• P(𝑋 ≥ 6) = P((𝑋 − 𝜇)/𝜎 ≥ (6 − 5)/2) = P(𝑍 ≥ 0.5)
  = 1 − P(𝑍 < 0.5)
  = 1 − 0.6915 = 0.3085
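A sketch (scipy.stats assumed available) showing both routes: standardizing by hand, and letting the library handle μ and σ through its loc/scale arguments.

```python
from scipy.stats import norm

mu, sigma = 5, 2                           # X ~ N(5, 4), so σ = 2

print(norm.cdf((0 - mu) / sigma))          # P(X ≤ 0) = P(Z ≤ −2.5) ≈ 0.0062
print(norm.cdf(6, loc=mu, scale=sigma)
      - norm.cdf(3, loc=mu, scale=sigma))  # P(3 < X ≤ 6) ≈ 0.5328
print(norm.sf(6, loc=mu, scale=sigma))     # P(X ≥ 6) = 1 − F(6) ≈ 0.3085
```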
Example
An automatic soda dispenser is regulated so that it dispenses an average of 200 ml per cup. Suppose the amount of drink dispensed is normally distributed with a standard deviation equal to 15 ml.
• What fraction of the cups will contain less than 224 ml?
• How many cups will be dispensed containing soda between 191 and 209 ml?
• What is the probability that a cup will overflow if 230 ml cups are used?
Answer
Suppose 𝑋 ~ 𝑁(𝜇 = 200, 𝜎² = 15²). Find:
• P(𝑋 < 224) = P(𝑍 < (224 − 200)/15) = P(𝑍 < 1.6) = 0.9452
• P(191 < 𝑋 ≤ 209) = P((191 − 200)/15 < 𝑍 ≤ (209 − 200)/15)
  = P(𝑍 ≤ 0.6) − P(𝑍 ≤ −0.6) = 0.7257 − 0.2743 = 0.4514
• P(𝑋 > 230) = P(𝑍 > (230 − 200)/15) = P(𝑍 > 2.0)
  = 1 − P(𝑍 < 2.0)
  = 1 − 0.9772 = 0.0228
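The same computation with a frozen distribution object, so μ and σ are specified once; again a sketch assuming scipy.stats is available (the table values above are rounded, so the third decimal may differ slightly).

```python
from scipy.stats import norm

dispense = norm(loc=200, scale=15)             # amount dispensed, in ml

print(dispense.cdf(224))                       # P(X < 224)       ≈ 0.945
print(dispense.cdf(209) - dispense.cdf(191))   # P(191 < X ≤ 209) ≈ 0.451
print(dispense.sf(230))                        # P(X > 230)       ≈ 0.023
```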
Exercises
The probability that a certain kind of component will survive a shock test is 3/4. Find the probability that exactly 2 of the next 4 components tested survive.
Exercises
Given a normal distribution with mean = 40 and standard deviation = 6, find the value of x that has
• 45% of the area to the left, and
• 14% of the area to the right.
Exercises
In the manufacture of wristwatches, the error in measuring time is carefully studied. For a selected watch, its measurement error is a random variable defined as 𝑋 = (Time in watch) − (True time), measured in seconds. Assume that 𝑋 ~ 𝑁(0, 𝜎² = 100).
• What proportion of these watches will show time that is within 20 seconds of the correct time of day?
• If the acceptable range of measurement error is from −c to c (where c is a positive real number), for what value of c would 95% of the watches be acceptable?
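For self-checking only (these are not worked solutions), the library calls below sketch how each exercise could be verified numerically after attempting it by hand; scipy.stats is an assumed external dependency.

```python
from scipy.stats import binom, norm

# Exercise 1: P(exactly 2 of the next 4 components survive), survival prob. 3/4
check1 = binom.pmf(2, 4, 0.75)

# Exercise 2: quantiles of N(40, 6²): the value with 45% of the area to its left,
# and the value with 14% of the area to its right
check2a = norm.ppf(0.45, loc=40, scale=6)
check2b = norm.ppf(1 - 0.14, loc=40, scale=6)

# Exercise 3: X ~ N(0, 10²); P(−20 < X < 20), and c with P(−c < X < c) = 0.95
check3a = norm.cdf(20, scale=10) - norm.cdf(-20, scale=10)
check3b = norm.ppf(0.975, scale=10)

print(check1, check2a, check2b, check3a, check3b)
```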
Random Variables and Probability Distributions
END OF MODULE 2
