Probability & Statistics (ITC), Chapter 2, 2019–2020
Contents
1 Random Variables
3 Expected Values
Definition 1
Let S be a sample space of random experiments. A function
X : S → R is called a random variable (rv).
D = {x = X(s) : s ∈ S} is called the range of the random variable X. It
is the set of all possible values of X.
If D is a countable set, then X is called a discrete random variable.
If D is an uncountable set, then X is called a continuous random
variable.
NOTE
Throughout this chapter, X denotes a discrete random variable (drv)
unless otherwise specified.
Example 1
A rat is selected at random from a cage and its sex is determined. The
set of possible outcomes is female and male, so the outcome space
is S = {Female, Male} = {F, M}. We define X : S → R by X(F) = 0
and X(M) = 1. Then X is a random variable; moreover, it is
a discrete random variable since D = {0, 1}.
Example 2
Suppose a pair of fair dice is rolled. Then the sample space is
S = {(i, j) : 1 ≤ i, j ≤ 6}. We define X : S → R by X(i, j) = i + j.
Then X is a discrete random variable. The set of all possible values of
X is D = {2, . . . , 12}.
Probability Distributions for Discrete Random Variables
Definition 2
The probability distribution or probability mass function
(pmf) of a discrete random variable X is denoted pX(x) and defined by
pX(x) = P(X = x) = P({s ∈ S : X(s) = x}).
Properties
The pmf pX (x) must satisfy the following properties:
1 pX(x) ≥ 0 for all x ∈ D,
2 ∑_{x∈D} pX(x) = 1.
Example 3
Let X be the sum of the up-faces on a roll of a pair of fair 6-sided dice,
each with the numbers 1 through 6 on it. The sample space is
S = {(i, j) : 1 ≤ i, j ≤ 6}. Because the dice are fair, P [{(i, j)}] = 1/36.
The random variable X is X(i, j) = i + j. The set of all possible values
of X is D = {2, . . . , 12}. By enumeration, the pmf of X is given by

x       2     3     4     5     6     7     8     9     10    11    12
pX(x)  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
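This table can be reproduced by directly enumerating S; a minimal Python sketch (ours, not part of the slides):

```python
from collections import Counter
from fractions import Fraction

# Count the outcomes (i, j) in S that give each possible sum x = i + j.
counts = Counter(i + j for i in range(1, 7) for j in range(1, 7))

# Each of the 36 equally likely outcomes has probability 1/36.
pmf = {x: Fraction(c, 36) for x, c in sorted(counts.items())}

print(pmf[7])             # 1/6, i.e. 6/36
print(sum(pmf.values()))  # 1, confirming property 2 of a pmf
```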
Definition 3
For a drv X, the cumulative distribution function (cdf), denoted
by FX, is defined by
FX(x) = P(X ≤ x) = ∑_{y≤x} pX(y).
Example 4
From Example 3, the pmf of X and the resulting cdf FX(x) = ∑_{y≤x} pX(y) are

x       2     3     4     5     6     7     8     9     10    11    12
pX(x)  1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
FX(x)  1/36  3/36  6/36  10/36 15/36 21/36 26/36 30/36 33/36 35/36 36/36
Theorem 1
Let FX be the cdf of a drv X. Then,
1 If a < b, then FX(a) ≤ FX(b) (FX is a nondecreasing function).
2 lim_{x→−∞} FX(x) = 0 (the lower limit of FX is 0).
3 lim_{x→+∞} FX(x) = 1 (the upper limit of FX is 1).
4 lim_{x→x0⁺} FX(x) = FX(x0) (FX is right continuous).
Theorem 2
Let FX be the cdf of a drv X. Then,
1 P(a < X ≤ b) = FX(b) − FX(a).
2 P(X = x) = FX(x) − FX(x⁻), where FX(x⁻) = lim_{z→x⁻} FX(z).
Example 5
Given the cdf FX as follows:
FX(x) = 0 for x < 0,
FX(x) = x/2 for 0 ≤ x < 1,
FX(x) = 1 for x ≥ 1.
Then
P(X = 1) = FX(1) − FX(1⁻) = 1 − 1/2 = 1/2.
The value 1/2 equals the size of the jump of FX at x = 1.
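Theorem 2 can also be checked numerically on the dice-sum rv of Example 3; the sketch below (variable names are ours) builds FX by the cumulative sum of Definition 3:

```python
from collections import Counter
from fractions import Fraction
from itertools import accumulate

# pmf of the sum of two fair dice (Example 3)
counts = Counter(i + j for i in range(1, 7) for j in range(1, 7))
xs = sorted(counts)
pmf = {x: Fraction(counts[x], 36) for x in xs}

# cdf: F_X(x) = sum of p_X(y) over y <= x   (Definition 3)
cdf = dict(zip(xs, accumulate(pmf[x] for x in xs)))

# Theorem 2(1): P(4 < X <= 7) = F_X(7) - F_X(4)
print(cdf[7] - cdf[4], sum(pmf[x] for x in (5, 6, 7)))   # both 5/12

# Theorem 2(2): P(X = 7) = F_X(7) - F_X(7^-) = F_X(7) - F_X(6)
print(cdf[7] - cdf[6], pmf[7])                           # both 1/6
```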
Example 6
An automobile service facility specializing in engine tune-ups knows
that 45% of all tune-ups are done on four-cylinder automobiles, 40% on
six-cylinder automobiles, and 15% on eight-cylinder automobiles. Let
X be the number of cylinders on the next car to be tuned.
(a) What is the pmf of X?
(b) Draw both a line graph and a probability histogram for the pmf of
part (a).
(c) What is the probability that the next car tuned has at least six
cylinders? More than six cylinders?
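For part (c), only the pmf from part (a) is needed; a short sketch (assuming the natural pmf P(X = 4) = .45, P(X = 6) = .40, P(X = 8) = .15):

```python
# pmf of X = number of cylinders on the next car to be tuned
pmf = {4: 0.45, 6: 0.40, 8: 0.15}

p_at_least_six = sum(p for x, p in pmf.items() if x >= 6)  # about 0.55
p_more_than_six = sum(p for x, p in pmf.items() if x > 6)  # 0.15
print(p_at_least_six, p_more_than_six)
```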
Expected Values
Definition 4
The expected value or mean value of X, denoted by E(X) or µX or
just µ, is
E(X) = ∑_{x∈D} x · pX(x).
Theorem 3
The expected value of any function h(X), denoted by E[h(X)], is given
by
E[h(X)] = ∑_{x∈D} h(x) · pX(x).
Definition 5
The variance of a drv X, denoted by V(X), σX², or just σ², is defined by
V(X) = E[(X − µ)²] = ∑_{x∈D} (x − µ)² · pX(x).
Theorem 4
1 V(X) = E(X²) − [E(X)]²
2 V(aX + b) = a²V(X)
3 σ_{aX+b} = |a| σX.
The Variance of X
Example 7
The pmf of the amount of memory X (in GB) in a purchased flash drive
is given by

x      1    2    4    8    16
p(x)  .05  .10  .35  .40  .10
Compute the following:
(a) E(X).
(b) V (X) directly from the definition.
(c) The standard deviation of X.
(d) V (X) using the shortcut formula.
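A numerical sketch of parts (a), (b), (c), and (d), assuming the support {1, 2, 4, 8, 16} as in the table above:

```python
# pmf of the memory size X (GB) of a purchased flash drive
pmf = {1: 0.05, 2: 0.10, 4: 0.35, 8: 0.40, 16: 0.10}

mean = sum(x * p for x, p in pmf.items())                   # (a) E(X) = 6.45
var_def = sum((x - mean) ** 2 * p for x, p in pmf.items())  # (b) V(X) from the definition
sd = var_def ** 0.5                                         # (c) standard deviation of X
ex2 = sum(x ** 2 * p for x, p in pmf.items())               # E(X^2)
var_shortcut = ex2 - mean ** 2                              # (d) V(X) = E(X^2) - [E(X)]^2

# prints roughly 6.45, 15.6475, 3.96, 15.6475 (up to float rounding)
print(mean, var_def, sd, var_shortcut)
```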
Definition 6
The moment-generating function (mgf ) of X, denoted by M (t), is
defined by
M(t) = E[e^{tX}]
if it exists.
Theorem 5
1 M^(n)(t) = E[X^n e^{tX}]
2 E[X] = M′(0)
3 V(X) = M″(0) − [M′(0)]²
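As an assumed illustration (a Bernoulli(p) rv, i.e. the n = 1 case of the binomial distribution introduced in the next section), Theorem 5 gives:

```latex
\begin{align*}
M(t)  &= E\!\left[e^{tX}\right] = q e^{0} + p e^{t} = q + p e^{t}, \qquad q = 1 - p,\\
M'(t) &= p e^{t}, \qquad M''(t) = p e^{t},\\
E[X]  &= M'(0) = p, \qquad
V(X)   = M''(0) - \left[M'(0)\right]^{2} = p - p^{2} = pq.
\end{align*}
```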
The Binomial Distribution
Definition 7
A binomial experiment is an experiment satisfying the following
properties:
1 It consists of a sequence of n smaller experiments called trials,
where n is a (non-random) constant.
2 Each trial can result in one of the same two possible outcomes
(dichotomous trials), which we generically denote by success and
failure.
3 The trials are independent.
4 The probability of success is constant from trial to trial; we denote
this probability by p.
Definition 8
The binomial random variable X associated with a binomial
experiment consisting of n trials is defined by
X = the number of successes among the n trials.
Theorem 6
If X ∼ Bin(n, p), then the pmf of the binomial rv X is given by
pX(x) = C_n^x · p^x (1 − p)^{n−x},  x = 0, 1, 2, . . . , n,
pX(x) = 0,  otherwise.
Theorem 7
If X ∼ Bin(n, p), then
1 E(X) = np
2 V(X) = npq
3 σX = √(npq)
4 M(t) = (q + pe^t)^n
where q = 1 − p.
Example 8
When circuit boards used in the manufacture of compact disc players
are tested, the long-run percentage of defectives is 5%. Let X = the
number of defective boards in a random sample of size n = 25, so
X ∼ Bin(25, 0.05).
(a) Determine P (X ≤ 2).
(b) Determine P (X ≥ 5).
(c) Determine P (1 ≤ X ≤ 4).
(d) What is the probability that none of the 25 boards is defective?
(e) Calculate the expected value and standard deviation of X.
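A computational sketch for Example 8, assuming SciPy is available; scipy.stats.binom uses the same pmf as Theorem 6:

```python
from scipy.stats import binom

n, p = 25, 0.05
X = binom(n, p)

print(X.cdf(2))              # (a) P(X <= 2)
print(1 - X.cdf(4))          # (b) P(X >= 5) = 1 - P(X <= 4)
print(X.cdf(4) - X.cdf(0))   # (c) P(1 <= X <= 4) = P(X <= 4) - P(X <= 0)
print(X.pmf(0))              # (d) P(X = 0)
print(X.mean(), X.std())     # (e) E(X) = np and sigma_X = sqrt(npq)
```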
The Hypergeometric Distribution
Definition 9
The hypergeometric experiment is an experiment satisfying the
following properties:
1 The population or set to be sampled consists of N individuals,
objects, or elements (a finite population).
2 Each individual can be characterized as a success (S) or a failure
(F ), and there are M successes in the population.
3 A sample of n individuals is selected without replacement in such
a way that each subset of size n is equally likely to be chosen.
Definition 10
The hypergeometric random variable X associated with a
hypergeometric experiment is defined by
X = the number of successes in the sample of size n.
Theorem 8
If X ∼ Hp(n, M, N), then the pmf of the hypergeometric rv X is given
by
pX(x) = (C_M^x · C_{N−M}^{n−x}) / C_N^n
for all integers x satisfying max(0, n − N + M ) ≤ x ≤ min(n, M ).
The Hypergeometric Distribution
Theorem 9
If X ∼ Hp(n, M, N ), then
1 E(X) = n · M/N
2 V(X) = ((N − n)/(N − 1)) · n · (M/N) · (1 − M/N)
Example 9
Five individuals from an animal population thought to be near
extinction in a certain region have been caught, tagged, and released to
mix into the population. After they have had an opportunity to mix, a
random sample of 10 of these animals is selected. Let X = the number
of tagged animals in the second sample. If there are actually 25
animals of this type in the region, what is the probability that
(a) X = 2? (b) X ≤ 2? Then find E(X) and V(X).
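A computational sketch for Example 9, assuming SciPy; note that scipy.stats.hypergeom takes its arguments in the order (population size N, number of successes M, sample size n), which differs from the Hp(n, M, N) notation above:

```python
from scipy.stats import hypergeom

N, M, n = 25, 5, 10          # population, tagged animals, second-sample size
X = hypergeom(N, M, n)

print(X.pmf(2))              # (a) P(X = 2)
print(X.cdf(2))              # (b) P(X <= 2)
print(X.mean(), X.var())     # E(X) = n*M/N = 2, V(X) = ((N-n)/(N-1))*n*(M/N)*(1-M/N) = 1
```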
The relationship between Hypergeometric and Binomial
Distributions
Example 10
A manufacturer of automobile tires reports that among a shipment of
5000 sent to a local distributor, 1000 are slightly blemished. If one
purchases 10 of these tires at random from the distributor, what is the
probability that exactly 3 are blemished?
Answer: 0.2013
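The quoted 0.2013 is the binomial approximation with p = M/N = 1000/5000 = 0.2; a sketch comparing it with the exact hypergeometric value (assuming SciPy):

```python
from scipy.stats import binom, hypergeom

N, M, n = 5000, 1000, 10
exact = hypergeom(N, M, n).pmf(3)    # exact hypergeometric P(X = 3)
approx = binom(n, M / N).pmf(3)      # binomial approximation with p = 0.2
print(exact, approx)                 # both approximately 0.201
```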
Definition 11
The negative binomial experiment is an experiment satisfying the
following conditions:
1 The experiment consists of a sequence of independent trials.
2 Each trial can result in either a success (S) or a failure (F ).
3 The probability of success is constant from trial to trial, so
P (S on trial i) = p for i = 1, 2, 3, . . ..
4 The experiment continues (trials are performed) until a total of r
successes have been observed, where r is a specified positive
integer.
Definition 12
The negative binomial rv X associated with the negative binomial
experiment is defined by
X = the number of failures that precede the r-th success.
Theorem 10
If X ∼ Nb(r, p), then the pmf of the negative binomial rv X is given by
pX(x) = C_{x+r−1}^{r−1} · p^r (1 − p)^x,  x = 0, 1, 2, . . . .
Remark: If r = 1, then X is a geometric random variable with pmf
pX(x) = p(1 − p)^x,  x = 0, 1, 2, . . . .
Theorem 11
If X ∼ Nb(r, p), then
1 E(X) = r(1 − p)/p
2 V(X) = r(1 − p)/p²
3 M(t) = ( p / (1 − e^t + pe^t) )^r.
Example 12
Suppose that during practice a basketball player can make a free throw
80% of the time. Furthermore, assume that a sequence of free-throw
shooting can be thought of as independent Bernoulli trials. Let X
equal the minimum number of free throws that this player must
attempt to make a total of 10 shots.
(a) Find the pmf of X.
(b) Find the mean, variance, and standard deviation of X.
(c) Find P (X = 12).
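In Example 12 the total number of attempts is r plus the number of failures, so X = 10 + X′ with X′ ∼ Nb(10, 0.8). A sketch assuming SciPy (scipy.stats.nbinom counts failures before the r-th success, matching Theorem 10):

```python
from scipy.stats import nbinom

r, p = 10, 0.8
misses = nbinom(r, p)               # X' = number of misses before the 10th made shot

# (b) X = r + X', so E(X) = r + E(X') = r/p; the shift leaves the variance unchanged
print(r + misses.mean())            # E(X) = 12.5
print(misses.var(), misses.std())   # V(X) = r(1-p)/p^2 = 3.125, sd about 1.77

# (c) P(X = 12) = P(X' = 2)
print(misses.pmf(2))                # about 0.236
```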
The Poisson Distribution
Definition 13
A drv X is said to have a Poisson distribution (or to be a Poisson rv)
with parameter λ (λ > 0) if the pmf of X is
pX(x) = e^{−λ} λ^x / x!,  x = 0, 1, 2, . . . .
We write X ∼ Po(λ).
Theorem 12
Suppose X ∼ Bin(n, p). If n → ∞ and p → 0 in such a way that np → λ
remains constant, then the distribution of X approaches Po(λ).
Example 13
In a certain industrial facility, accidents occur infrequently. It is known
that the probability of an accident on any given day is 0.005 and
accidents are independent of each other.
(a) What is the probability that in any given period of 400 days there
will be an accident on one day?
(b) What is the probability that there are at most three days with an
accident?
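A sketch for Example 13 using the Poisson model with λ = np = 400 × 0.005 = 2 (assuming SciPy):

```python
from scipy.stats import poisson

lam = 400 * 0.005        # lambda = np = 2 accidents expected in 400 days
X = poisson(lam)

print(X.pmf(1))          # (a) P(X = 1) = 2e^{-2}, about 0.271
print(X.cdf(3))          # (b) P(X <= 3), about 0.857
```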
Theorem 13
If X ∼ Po(λ), then
1 E(X) = λ
2 V(X) = λ
3 M(t) = e^{λ(e^t − 1)}.
Example 14
If a publisher of nontechnical books takes great pains to ensure that its
books are free of typographical errors, so that the probability of any
given page containing at least one such error is .005 and errors are
independent from page to page, what is the probability that one of its
400-page novels will contain exactly one page with errors? At most
three pages with errors?
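Example 14 has the same numbers (n = 400, p = .005, λ = 2); a sketch comparing the exact Bin(400, .005) answers with the Po(2) approximation of Theorem 12 (assuming SciPy):

```python
from scipy.stats import binom, poisson

n, p = 400, 0.005
exact = binom(n, p)
approx = poisson(n * p)              # lambda = np = 2

print(exact.pmf(1), approx.pmf(1))   # exactly one page with errors, both about 0.27
print(exact.cdf(3), approx.cdf(3))   # at most three pages with errors, both about 0.86
```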