Fundamentals of Applied Probability and Random Processes
2nd Edition
Oliver C. Ibe
University of Massachusetts, Lowell, Massachusetts
Contents

APPENDIX 427
BIBLIOGRAPHY 429
INDEX 431
Acknowledgment
The first edition of this book was well received by many students and profes-
sors. It had both Indian and Korean editions and received favorable reviews
that include the following: “The book is very clear, with many nice examples
and with mathematical proofs whenever necessary. Author did a good job!
The book is one of the best ones for self-study.” Another comment is the
following:
“This book is written for professional engineers and students who want to do
self-study on probability and random processes. I have spent money and time
in reading several books on this topic and almost all of them are not for self-
study. They lack real world examples and have end-of-chapter problems that
are boring with mathematical proofs. In this book the concepts are explained
by taking real world problems and describing clearly how to model them.
Topics are well-balanced; the depth of material and explanations are very
good, the problem sets are intuitive and they make you to think, which makes
this book unique and suitable for self-study. Topics which are required for
both grad and undergrad courses are covered in this book very well. If you are
reading other books and are stuck when applying a concept to a particular
problem, you should consider this book. Problem solving is the only way to get
familiar with different probability models, and this book fulfills that by taking
problems in the real world examples as opposed to some theoretical proofs.”
These are encouraging reviews of a book that evolved from a course I teach at
the University of Massachusetts, Lowell. I am very grateful to those who wrote
these wonderful unsolicited anonymous reviews on Amazon.com. Their obser-
vations on the structure of the book are precisely what I had in mind in writing
the book.
I want to extend my sincere gratitude to my editor, Paula Callaghan of Elsevier,
who was instrumental in the production of this book. I thank her for the effort
she made to get the petition for the second edition approved. I also want to
thank Jessica Vaughan, the Editorial Project Manager, for ensuring the timely
production of the book.
So many students have used the first edition of this book at UMass Lowell and
have provided useful information that led to more clarity in the presentation of
the material in the book. They are too many to name individually, so I say
“thank you” to all of them as a group.
Finally, I would like to thank my wife, Christina, for patiently bearing with me
while the book was being revised. I also appreciate the encouragement of our
children Chidinma, Ogechi, Amanze, and Ugonna. As always,
they are a source of joy to me and my wife.
Preface to the Second Edition
the students solve the exercises at the end of each chapter. Some mathematical
knowledge is assumed, especially freshman calculus and algebra.
This second edition of the book differs from the first edition in a few ways. First,
the chapters have been slightly rearranged. Specifically, statistics now comes
before random processes to enable students to understand the basic principles
of probability and statistics before studying random processes. Second,
Chapter 11 has been split into two chapters: Chapter 8, which deals with
descriptive statistics; and Chapter 9, which deals with inferential statistics.
Third, the new edition includes more application-oriented examples to enable
students to appreciate the application of probability and random processes in
science, engineering and management. Finally, after teaching the subject every
semester for the past eleven years, I have been able to identify several pain
points that hinder student understanding of probability and random processes,
and I have introduced several new “smart” methods of solving the problems to
help ease the pain.
The book is divided into three parts as follows:
Chapter 6 deals with functions of random variables including linear and power
functions of one random variable, moments of functions of one random var-
iable, sums of independent random variables, the maximum and minimum of
two independent random variables, two functions of two random variables,
laws of large numbers, the central limit theorem, and order statistics.
Chapter 7 discusses transform methods that are useful in computing moments
of random variables. In particular, it discusses the characteristic function, the
z-transform of the probability mass functions of discrete random variables
and the s-transform of the probability density functions of continuous ran-
dom variables.
Chapter 8 presents an introduction to descriptive statistics and discusses such
topics as measures of central tendency, measures of spread, and graphical
displays.
Chapter 9 presents an introduction to inferential statistics and discusses such
topics as sampling theory, estimation theory, hypothesis testing, and linear
regression analysis.
Chapter 10 presents an introduction to random processes. It discusses classifi-
cation of random processes; characterization of random processes including
the autocorrelation function of a random process, autocovariance function,
crosscorrelation function and crosscovariance function; stationary random
processes; ergodic random processes; and power spectral density.
Chapter 11 discusses linear systems with random inputs. It also discusses the
autoregressive moving average process.
Chapter 12 discusses special random processes including the Bernoulli process,
Gaussian process, random walk, Poisson process and Markov process.
The author has tried different formats in presenting the different chapters of the
book. In one particular semester we were able to go through all the chapters except
Chapter 12. However, it was discovered that this put a lot of stress on the students.
Thus, in subsequent semesters an attempt was made to cover all the topics in
Parts 1 and 2 of the book, and a few selections from Part 3. The instructor can
try different formats and adopt the one that works best for him or her.
The beginning of a solved example is indicated by a short line and the end of the
solution is also indicated by a short line. This is to separate the continuation of
a discussion preceding an example from the example just solved.
Preface to First Edition
1.1 INTRODUCTION
Probability deals with unpredictability and randomness, and probability the-
ory is the branch of mathematics that is concerned with the study of random
phenomena. A random phenomenon is one that, under repeated observation,
yields different outcomes that are not deterministically predictable. However,
these outcomes obey certain conditions of statistical regularity whereby the
relative frequency of occurrence of the possible outcomes is approximately
predictable. Examples of these random phenomena include the number of
electronic mail (e-mail) messages received by all employees of a company
in one day, the number of phone calls arriving at the university’s switchboard
over a given period, the number of components of a system that fail within a
given interval, and the number of A’s that a student can receive in one
academic year.
According to the preceding definition, the fundamental issue in random phe-
nomena is the idea of a repeated experiment with a set of possible outcomes or
events. Associated with each of these events is a real number called the proba-
bility of the event that is related to the frequency of occurrence of the event in a
long sequence of repeated trials of the experiment. In this way it becomes obvi-
ous that the probability of an event is a value that lies between zero and one,
and the sum of the probabilities of the events for a particular experiment should
be one.
This chapter begins with events associated with a random experiment. Then it
provides different definitions of probability and considers elementary set theory
and algebra of sets. Also, it discusses basic concepts in combinatorial analysis
that will be used in many of the later chapters. Finally, it discusses how probability
is used to compute the reliability of different component configurations in a
system.
Ω = {1, 2, 3, 4, 5, 6}   (1.1)
The event “the outcome of the toss of a die is an even number” is the subset of Ω
and is defined by
E = {2, 4, 6}   (1.2)
For a second example, consider a coin tossing experiment in which each toss
can result in either a head (H) or tail (T). If we toss a coin three times and
let the triplet xyz denote the outcome “x on first toss, y on second toss and z
on third toss,” then the sample space of the experiment is

Ω = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}   (1.3)

The event “one head and two tails” is the subset of Ω and is defined by

E = {HTT, THT, TTH}   (1.4)
- In a single coin toss experiment with sample space Ω = {H, T}, the event E = {H} is the event that a head appears on the toss and E = {T} is the event that a tail appears on the toss.
- If we toss a coin twice and let xy denote the outcome “x on first toss and y on second toss,” where x is head or tail and y is head or tail, then the sample space is Ω = {HH, HT, TH, TT}. The event E = {HT, TT} is the event that a tail appears on second toss.
- If we measure the lifetime of an electronic component, such as a chip, the sample space consists of all nonnegative real numbers. That is,

  Ω = {x | 0 ≤ x < ∞}

  The event that the lifetime is not more than 7 hours is defined as follows:

  E = {x | 0 ≤ x ≤ 7}
- If we toss a die twice and let the pair (x, y) denote the outcome “x on first toss and y on second toss,” then the sample space is

  Ω = { (1,1) (1,2) (1,3) (1,4) (1,5) (1,6)
        (2,1) (2,2) (2,3) (2,4) (2,5) (2,6)
        (3,1) (3,2) (3,3) (3,4) (3,5) (3,6)
        (4,1) (4,2) (4,3) (4,4) (4,5) (4,6)
        (5,1) (5,2) (5,3) (5,4) (5,5) (5,6)
        (6,1) (6,2) (6,3) (6,4) (6,5) (6,6) }   (1.5)
For any two events A and B of a sample space Ω, we can define the following
new events:
That is, for any set of mutually exclusive events defined on the same sample space,
the probability of at least one of these events occurring is the sum of their
respective probabilities.
The ratio nA/n is called the relative frequency of event A. While the relative-
frequency definition of probability is intuitively satisfactory for many practical
problems, it has a few limitations. One such limitation is the fact that the exper-
iment may not be repeatable, especially when we are dealing with destructive
testing of expensive and/or scarce resources. Also, the limit may not exist.
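The relative-frequency idea is easy to see numerically. The short sketch below is a minimal illustration, not part of the text; the event chosen here (the sum of two fair dice equals 7) is only an example. It repeats the experiment n times and reports the relative frequency nA/n, which settles near P[A] = 1/6 as n grows.

```python
# A minimal sketch of the relative-frequency definition of probability:
# repeat the experiment n times and report nA / n for the event
# A = "the sum of two fair dice is 7" (an illustrative choice, not from the text).
import random

def relative_frequency(n_trials: int, seed: int = 1) -> float:
    rng = random.Random(seed)
    n_a = 0
    for _ in range(n_trials):
        outcome = rng.randint(1, 6) + rng.randint(1, 6)   # one repetition of the experiment
        if outcome == 7:                                   # did event A occur?
            n_a += 1
    return n_a / n_trials                                  # relative frequency nA / n

for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))   # approaches P[A] = 1/6 ≈ 0.1667 as n grows
```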
EXAMPLE 1.1
Two fair dice are tossed. Find the probability of each of the following events:
a. The sum of the outcomes of the two dice is equal to seven.
b. The sum of the outcomes of the two dice is either seven or eleven.
c. The outcome of the second die is greater than the outcome of the first die.
d. The outcomes of the two dice are both even numbers.
Solution:
We first define the sample space of the experiment. If we let the pair (x, y) denote the outcome “first
die comes up x and second die comes up y,” then the sample space is given by equation (1.5). The
total number of sample points is 36. We evaluate the three probabilities using the classical
definition method.
a. Let A1 denote the event that the sum of the outcomes of the two dice is equal to seven.
Then A1 = {(1, 6), (2, 5), (3, 4), (4, 3), (5, 2), (6, 1)}. Since the number of sample points in the
event is 6, we have that P[A1] = 6/36 = 1/6.
b. Let B denote the event that the sum of the outcomes of the two dice is either seven or
eleven, let A1 be as defined in part (a), and let A2 denote the event that the sum of the
outcomes of the two dice is eleven. Then, A2 = {(6, 5), (5, 6)} with 2 sample points. Thus,
P[A2] = 2/36 = 1/18. Since B is the union of A1 and A2, which are mutually exclusive events,
we obtain

P[B] = P[A1 ∪ A2] = P[A1] + P[A2] = 1/6 + 1/18 = 2/9
c. Let C denote the event that the outcome of the second die is greater than the outcome of
the first die. Then
C = {(1,2), (1,3), (1,4), (1,5), (1,6), (2,3), (2,4), (2,5), (2,6), (3,4), (3,5), (3,6), (4,5), (4,6), (5,6)}

Since C contains 15 sample points, P[C] = 15/36 = 5/12.
d. Let D denote the event that the outcomes of the two dice are both even numbers. Then

D = {(2, 2), (2, 4), (2, 6), (4, 2), (4, 4), (4, 6), (6, 2), (6, 4), (6, 6)}

Since D contains 9 sample points, P[D] = 9/36 = 1/4.
FIGURE 1.1
Sample Space for Example 1.1
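The counting argument of Example 1.1 can be checked mechanically. The sketch below is an illustration, not part of the text: it enumerates the 36 equally likely sample points of Equation (1.5) and applies the classical definition to the four events.

```python
# An illustrative check of Example 1.1 using the classical definition:
# count the favorable sample points among the 36 equally likely outcomes.
from fractions import Fraction

sample_space = [(x, y) for x in range(1, 7) for y in range(1, 7)]   # Eq. (1.5)

def prob(event) -> Fraction:
    favorable = [point for point in sample_space if event(point)]
    return Fraction(len(favorable), len(sample_space))

print(prob(lambda p: p[0] + p[1] == 7))                  # A1: 1/6
print(prob(lambda p: p[0] + p[1] in (7, 11)))            # B = A1 ∪ A2: 2/9
print(prob(lambda p: p[1] > p[0]))                       # C: 5/12
print(prob(lambda p: p[0] % 2 == 0 and p[1] % 2 == 0))   # D: 1/4
```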
FIGURE 1.2
Model of a Communication System: a message flows from the source through the channel to the sink
The message generated by the source conveys some information. One of the
objectives of information theory is to quantify the information content of mes-
sages. This quantitative measure enables communication system designers to
provision the channel that can support the message. A good measure of the
information content of a message is the probability of occurrence of the mes-
sage: The higher the probability of occurrence of a message, the less informa-
tion it conveys; and the smaller the probability of occurrence of a message, the
greater its information content. For example, the message “the temperature is
90 °F in the northeastern part of the United States in December” has more
information content than the message “the temperature is 10 °F in the north-
eastern part of the United States in December” because the second message is
more likely to occur than the first.
Thus, information theory uses the probability of occurrence of events to convey
information about those events. Specifically, let P[A] denote the probability of
occurrence of some event A. Then the information content of A, I(A), is given by
I(A) = log2(1/P[A]) = log2(1) − log2(P[A]) = 0 − log2(P[A]) = −log2(P[A])
From the preceding equation we observe that the greater the probability that an
event will occur, the smaller the information content of the event, as we stated
earlier. If event A is certain to occur, then P[A] = 1 and I(A) = −log2(1) = 0. Sim-
ilarly, the smaller the probability that an event will occur, the greater is the
information content of the event. In particular, if event A is certain not to occur,
P[A] = 0 and I(A) = −log2(0) = ∞. Thus, when an event that is not expected to
occur does actually occur, its information content is infinite.
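A minimal sketch of this formula, not part of the text, is shown below; the function name information_content is just an illustrative label.

```python
# A small sketch of the information-content formula I(A) = -log2(P[A]).
import math

def information_content(p: float) -> float:
    if p == 0:
        return math.inf        # an impossible event carries infinite information
    return -math.log2(p)       # I(A) = -log2(P[A])

for p in (1.0, 0.5, 0.1, 0.01):
    print(p, information_content(p))
# 1.0 -> 0 bits, 0.5 -> 1 bit, 0.1 -> about 3.32 bits, 0.01 -> about 6.64 bits
```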
Similarly, let B denote the set of positive odd numbers less than 10. Then
B ¼ f1, 3, 5, 7, 9g
If k is an element of the set E, we say that k belongs to (or is a member of) E and
write k ∈ E. If k is not an element of the set E, we say that k does not belong to (or
is not a member of) E and write k ∉ E.
A set A is called a subset of set B, denoted by A ⊂ B, if every member of A is a
member of B. Alternatively, we say that the set B contains the set A by writing
B ⊃ A.
The set that contains all possible elements is called the universal set Ω.
The set that contains no elements (or is empty) is called the null set Ø (or
empty set). That is,
Ø = { }
EXAMPLE 1.2
Let Ω = {1, 2, 3, 4, 5, 6, 7, 8, 9, 10}, A = {1, 2, 4, 7} and B = {1, 3, 4, 6}. Then Ā = {3, 5, 6, 8, 9, 10} and
B̄ = {2, 5, 7, 8, 9, 10}.
Union: The union of two sets A and B, denoted by A ∪ B, is the set containing all
the elements of either A or B or both A and B. That is,
A ∪ B = {k | k ∈ A or k ∈ B}
FIGURE 1.3
Venn Diagrams of Different Set Operations: A ∪ B, A ∩ B, A − B, and A ⊂ B
Figure 1.3 presents Venn diagrams of the union, intersection, and difference operations on two sets A and B. The univer-
sal set is represented by the set of points inside a rectangle. The sets A and B are
represented by the sets of points inside oval objects.
- Commutative law for unions: A ∪ B = B ∪ A, which states that the order of the union operation on two sets is immaterial.
- Commutative law for intersections: A ∩ B = B ∩ A, which states that the order of the intersection operation on two sets is immaterial.
- Associative law for unions: A ∪ (B ∪ C) = (A ∪ B) ∪ C, which states that in performing the union operation on three sets, we can proceed in two ways: we can first perform the union operation on the first two sets to obtain an intermediate result and then perform the operation on the result and the third set. The same result is obtained if we first perform the operation on the last two sets and then perform the operation on the first set and the result obtained from the operation on the last two sets.
- Associative law for intersections: A ∩ (B ∩ C) = (A ∩ B) ∩ C, which states that in performing the intersection operation on three sets, we can proceed in two ways: we can first perform the intersection operation on the first two sets to obtain an intermediate result and then perform the operation on the result and the third set. The same result is obtained if we first perform the operation on the last two sets and then perform the operation on the first set and the result obtained from the operation on the last two sets.
- First distributive law: A ∩ (B ∪ C) = (A ∩ B) ∪ (A ∩ C), which states that the intersection of a set A and the union of two sets B and C is equal to the union of the intersection of A and B and the intersection of A and C.
- A ∪ Ω = Ω, which states that the union of A and the universal set Ω is equal to Ω.
- A ∩ Ω = A, which states that the intersection of A and the universal set Ω is equal to A.
- A ∪ Ø = A, which states that the union of A and the null set is equal to A.
FIGURE 1.4
Venn Diagram of A ∪ B
Figure 1.4a shows a Venn diagram in which the left circle represents event A and the right
circle represents event B. In Figure 1.4b we divide the diagram into three
mutually exclusive sections labeled I, II, and III where section I represents
all points in A that are not in B, section II represents all points in both
A and B, and section III represents all points in B that are not in A.
From Figure 1.4b, we observe that:
A ∪ B = I ∪ II ∪ III
A = I ∪ II
B = II ∪ III
8. We can extend Property 7 to the case of three events. If A1, A2 and A3 are
three events in Ω, then

P[A1 ∪ A2 ∪ A3] = P[A1] + P[A2] + P[A3] − P[A1 ∩ A2] − P[A1 ∩ A3]
                  − P[A2 ∩ A3] + P[A1 ∩ A2 ∩ A3]                     (1.13)
That is, to find the probability that at least one of the n events occurs, first add
the probability of each event, then subtract the probabilities of all possible two-
way intersections, then add the probabilities of all possible three-way intersec-
tions, and so on.
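Equation (1.13) is easy to verify numerically. The sketch below is an illustration only; the three events chosen here are not from the text. It compares both sides of the formula by direct counting on the two-dice sample space.

```python
# Sketch verifying the inclusion-exclusion formula of Equation (1.13)
# by direct counting on the two-dice sample space.
from fractions import Fraction

omega = {(x, y) for x in range(1, 7) for y in range(1, 7)}

def P(event: set) -> Fraction:
    return Fraction(len(event), len(omega))

# Three illustrative events (chosen for this example, not taken from the text):
A1 = {p for p in omega if p[0] + p[1] == 7}    # the sum is 7
A2 = {p for p in omega if p[0] == 1}           # the first die shows 1
A3 = {p for p in omega if p[1] >= 5}           # the second die shows 5 or 6

lhs = P(A1 | A2 | A3)
rhs = (P(A1) + P(A2) + P(A3)
       - P(A1 & A2) - P(A1 & A3) - P(A2 & A3)
       + P(A1 & A2 & A3))
print(lhs, rhs, lhs == rhs)    # both sides agree
```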
Consider, for example, an experiment in which a die is tossed twice, and suppose we are interested in the
event that the sum of the two tosses is 7 and we observe that the first toss is 4.
Based on this fact, the six possible and equally likely outcomes of the two tosses
are (4, 1), (4, 2), (4, 3), (4, 4), (4, 5) and (4, 6). In the absence of the information
that the first toss is 4, there would have been 36 sample points in the sample
space. But with the information on the outcome of the first toss, there are now
only 6 sample points.
Let A denote the event that the sum of the two dice is 7, and let B denote the
event that the first die is 4. The conditional probability of event A given event B,
denoted by P[A|B], is defined by
P[A|B] = P[A ∩ B] / P[B],    P[B] ≠ 0                                (1.15)
EXAMPLE 1.3
A bag contains 8 red balls, 4 green, and 8 yellow balls. A ball is drawn at random from the bag and
it is found not to be one of the red balls. What is the probability that it is a green ball?
Solution:
Let G denote the event that the selected ball is a green ball and let R̄ denote the event that it is not a
red ball. Then, P[G] = 4/20 = 1/5 since there are 4 green balls out of a total of 20 balls, and P[R̄] =
12/20 = 3/5 since there are 12 balls out of 20 that are not red. Now,

P[G|R̄] = P[G ∩ R̄] / P[R̄]

But if the ball is green, it is not red; thus G ∩ R̄ = G because G is a subset of R̄. Therefore,

P[G|R̄] = P[G ∩ R̄] / P[R̄] = P[G] / P[R̄] = (1/5) / (3/5) = 1/3
EXAMPLE 1.4
A fair coin was tossed two times. Given that the first toss resulted in heads, what is the probability
that both tosses resulted in heads?
Solution:
Because the coin is fair, the four sample points of the sample space Ω = {HH, HT, TH, TT} are
equally likely. Let X denote the event that both tosses came up heads; that is, X = {HH}. Let Y denote
the event that the first toss came up heads; that is, Y = {HH, HT}. Because X is a subset of Y,
the probability that both tosses resulted in heads, given that the first toss resulted in heads, is
given by

P[X|Y] = P[X ∩ Y] / P[Y] = P[X] / P[Y] = (1/4) / (2/4) = 1/2
Proposition 1.1
Let {A1, A2, . . ., An} be a partition of the sample space Ω, and suppose each one of the events A1,
A2, . . ., An has nonzero probability of occurrence. Let B be any event defined in Ω. Then

P[B] = P[B|A1]P[A1] + P[B|A2]P[A2] + ... + P[B|An]P[An]
Proof
The proof is based on the observation that because {A1, A2, . . ., An} is a partition of Ω, the set {B ∩ A1,
B ∩ A2, . . ., B ∩ An} is a partition of the event B because if B occurs, then it must occur in conjunction
with one of the Ai's. Thus, we can express B as the union of n mutually exclusive events. That is,
B = (B ∩ A1) ∪ (B ∩ A2) ∪ ... ∪ (B ∩ An)
From our definition of conditional probability, P[B ∩ Ai] = P[B|Ai]P[Ai], which exists because we
assumed in the Proposition that the events A1, A2, . . ., An have nonzero probabilities. Substituting
the definition of conditional probabilities we obtain the desired result:

P[B] = P[B|A1]P[A1] + P[B|A2]P[A2] + ... + P[B|An]P[An]

The above result is called the law of total probability of event B, which will be useful in the
remainder of the book.
EXAMPLE 1.5
A student buys 1000 chips from supplier A, 2000 chips from supplier B, and 3000 chips from sup-
plier C. He tested the chips and found that the probability that a chip is defective depends on the
supplier from where it was bought. Specifically, given that a chip came from supplier A, the prob-
ability that it is defective is 0.05; given that a chip came from supplier B, the probability that it is
defective is 0.10; and given that a chip came from supplier C, the probability that it is defective is
0.10. If the chips from the three suppliers are mixed together and one of them is selected at ran-
dom, what is the probability that it is defective?
Solution:
Let P[A], P[B] and P[C] denote the probability that a randomly selected chip came from supplier A,
B and C respectively. Also, let P[D|A] denote the conditional probability that a chip is defective,
given that it came from supplier A; let P[D|B] denote the conditional probability that a chip is defec-
tive, given that it came from supplier B; and let P[D|C] denote the conditional probability that a chip
is defective, given that it came from supplier C. Then the following are true:
P[D|A] = 0.05
P[D|B] = 0.10
P[D|C] = 0.10
P[A] = 1000/(1000 + 2000 + 3000) = 1/6
P[B] = 2000/(1000 + 2000 + 3000) = 1/3
P[C] = 3000/(1000 + 2000 + 3000) = 1/2
Let P[D] denote the unconditional probability that a randomly selected chip is defective. Then,
from the law of total probability of D we have that

P[D] = P[D|A]P[A] + P[D|B]P[B] + P[D|C]P[C] = (0.05)(1/6) + (0.10)(1/3) + (0.10)(1/2) ≈ 0.0917
We now go back to the general discussion. Suppose event B has occurred but we
do not know which of the mutually exclusive and collectively exhaustive events
A1, A2, . . ., An holds true. The conditional probability that event Ak occurred,
given that B occurred, is given by
P[Ak|B] = P[Ak ∩ B] / P[B] = P[Ak ∩ B] / {P[B|A1]P[A1] + ... + P[B|An]P[An]}
where the second equality follows from the law of total probability of event B.
Since P[Ak ∩ B] = P[B|Ak]P[Ak], the above equation can be rewritten as follows:

P[Ak|B] = P[B|Ak]P[Ak] / {P[B|A1]P[A1] + ... + P[B|An]P[An]}

This result is known as Bayes' formula (or Bayes' theorem).
EXAMPLE 1.6
In Example 1.5, given that a randomly selected chip is defective, what is the probability that it came
from supplier A?
Solution:
Using the same notation as in Example 1.5, the probability that the randomly selected chip came
from supplier A, given that it is defective, is given by
P[A|D] = P[D ∩ A] / P[D] = P[D|A]P[A] / {P[D|A]P[A] + P[D|B]P[B] + P[D|C]P[C]}
        = (0.05)(1/6) / {(0.05)(1/6) + (0.10)(1/3) + (0.10)(1/2)}
        = 0.0909
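The law of total probability and Bayes' formula lend themselves to a very short calculation. The sketch below is a hedged illustration, not part of the text; it simply re-uses the supplier priors and defect rates of Examples 1.5 and 1.6, with the dictionary keys "A", "B", "C" labeling the three suppliers.

```python
# An illustrative sketch of the law of total probability and Bayes' formula,
# using the numbers of Examples 1.5 and 1.6.
priors = {"A": 1 / 6, "B": 1 / 3, "C": 1 / 2}            # P[A_i]
defect_rate = {"A": 0.05, "B": 0.10, "C": 0.10}          # P[D | A_i]

# Law of total probability: P[D] = sum_i P[D | A_i] P[A_i]
p_defective = sum(defect_rate[s] * priors[s] for s in priors)
print(f"P[D] = {p_defective:.4f}")                       # about 0.0917

# Bayes' formula: P[A_k | D] = P[D | A_k] P[A_k] / P[D]
posterior = {s: defect_rate[s] * priors[s] / p_defective for s in priors}
print({s: round(p, 4) for s, p in posterior.items()})    # P[A|D] is about 0.0909
```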
EXAMPLE 1.7
(The Binary Symmetric Channel) A discrete channel is characterized by an input alphabet
X ¼ {x1, x2, . . ., xn}; an output alphabet Y ¼ {y1, y2, . . ., yn}; and a set of conditional probabilities (called
transition probabilities), Pij, which are defined as follows:
Pij = P[yj | xi] = P[receiving symbol yj, given that symbol xi was transmitted]
FIGURE 1.5
The Binary Channel: inputs x1, x2 and outputs y1, y2, with transition probabilities P11 = P[y1|x1], P12 = P[y2|x1], P21 = P[y1|x2], and P22 = P[y2|x2]
FIGURE 1.6
The Binary Symmetric Channel for Example 1.7, with P11 = P22 = 0.9 and P12 = P21 = 0.1
In the binary channel, an error occurs if y2 is received when x1 is transmitted or y1 is received when
x2 is transmitted. Thus, the probability of error, Pe, is given by

Pe = P[x1 ∩ y2] + P[x2 ∩ y1] = P[x1]P[y2|x1] + P[x2]P[y1|x2] = P[x1]P12 + P[x2]P21

If P12 = P21, we say that the channel is a binary symmetric channel (BSC). Also, if in the BSC
P[x1] = p, then P[x2] = 1 − p = q.
Consider the BSC shown in Figure 1.6, with P[x1] = 0.6 and P[x2] = 0.4. Compute the following:
a. P[x1|y2], the probability that x1 was transmitted, given that y2 was received
b. P[x2|y1], the probability that x2 was transmitted, given that y1 was received
c. P[x1|y1], the probability that x1 was transmitted, given that y1 was received
d. P[x2|y2], the probability that x2 was transmitted, given that y2 was received
e. Pe, the unconditional probability of error
Solution:
Let P[y1] denote the probability that y1 was received and P[y2] the probability that y2 was received.
Then

P[y1] = P[y1|x1]P[x1] + P[y1|x2]P[x2] = (0.9)(0.6) + (0.1)(0.4) = 0.58
P[y2] = P[y2|x1]P[x1] + P[y2|x2]P[x2] = (0.1)(0.6) + (0.9)(0.4) = 0.42
a. The probability that x1 was transmitted, given that y2 was received, is given by

P[x1|y2] = P[y2|x1]P[x1] / P[y2] = (0.1)(0.6) / {(0.1)(0.6) + (0.9)(0.4)}
         = 0.143
b. The probability that x2 was transmitted, given that y1 was received, is given by

P[x2|y1] = P[y1|x2]P[x2] / P[y1] = (0.1)(0.4) / {(0.9)(0.6) + (0.1)(0.4)}
         = 0.069
c. The probability that x1 was transmitted, given that y1 was received, is given by

P[x1|y1] = P[x1 ∩ y1] / P[y1] = P[y1|x1]P[x1] / {P[y1|x1]P[x1] + P[y1|x2]P[x2]}
         = (0.9)(0.6) / {(0.9)(0.6) + (0.1)(0.4)}
         = 0.931 = 1 − P[x2|y1]
d. The probability that x2 was transmitted, given that y2 was received, is given by

P[x2|y2] = P[y2|x2]P[x2] / P[y2] = (0.9)(0.4) / {(0.1)(0.6) + (0.9)(0.4)}
         = 0.857 = 1 − P[x1|y2]
e. The probability of error is

Pe = P[x1]P[y2|x1] + P[x2]P[y1|x2] = P[x1]P12 + P[x2]P21 = (0.6)(0.1) + (0.4)(0.1)
   = 0.1
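The binary symmetric channel calculations above follow a single pattern (total probability for the received symbol, then Bayes' formula for the posterior). The sketch below is a hedged illustration, not from the text, using the values given in the example: P[x1] = 0.6, P[x2] = 0.4, and crossover probability P12 = P21 = 0.1.

```python
# An illustrative sketch of the binary symmetric channel calculations in Example 1.7.
p_x = {"x1": 0.6, "x2": 0.4}
p_y_given_x = {                     # transition probabilities P[y_j | x_i]
    ("y1", "x1"): 0.9, ("y2", "x1"): 0.1,
    ("y1", "x2"): 0.1, ("y2", "x2"): 0.9,
}

def p_y(y: str) -> float:
    # Total probability of receiving y: P[y] = sum_i P[y | x_i] P[x_i]
    return sum(p_y_given_x[(y, x)] * p_x[x] for x in p_x)

def posterior(x: str, y: str) -> float:
    # Bayes' formula: P[x | y] = P[y | x] P[x] / P[y]
    return p_y_given_x[(y, x)] * p_x[x] / p_y(y)

print(round(posterior("x1", "y2"), 3))   # 0.143  (part a)
print(round(posterior("x2", "y1"), 3))   # 0.069  (part b)
print(round(posterior("x1", "y1"), 3))   # 0.931  (part c)
print(round(posterior("x2", "y2"), 3))   # 0.857  (part d)

# Probability of error: Pe = P[x1]P12 + P[x2]P21
pe = p_x["x1"] * p_y_given_x[("y2", "x1")] + p_x["x2"] * p_y_given_x[("y1", "x2")]
print(round(pe, 3))                      # 0.1    (part e)
```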
EXAMPLE 1.8
The quarterback for a certain football team has a good game with probability 0.6 and a bad
game with probability 0.4. When he has a good game, he throws an interception with a probability
of 0.2; and when he has a bad game, he throws an interception with a probability of 0.5. Given that he
threw an interception in a particular game, what is the probability that he had a good game?
Solution:
Let G denote the event that the quarterback has a good game and B the event that he
has a bad game. Similarly, let I denote the event that he throws an interception. Then we have that
P[G] = 0.6
P[B] = 0.4
P[I|G] = 0.2
P[I|B] = 0.5

P[G|I] = P[G ∩ I] / P[I] = P[I|G]P[G] / {P[I|G]P[G] + P[I|B]P[B]}
       = (0.2)(0.6) / {(0.2)(0.6) + (0.5)(0.4)} = 0.12 / 0.32
       = 0.375
EXAMPLE 1.9
Two events A and B are such that P[A ∩ B] = 0.15, P[A ∪ B] = 0.65, and P[A|B] = 0.5. Find P[B|A].
Solution:
P[A ∪ B] = P[A] + P[B] − P[A ∩ B] ⇒ 0.65 = P[A] + P[B] − 0.15. This means that P[A] + P[B] = 0.65
+ 0.15 = 0.80. Also, P[A ∩ B] = P[A|B]P[B]. This then means that

P[B] = P[A ∩ B] / P[A|B] = 0.15 / 0.50 = 0.30

Thus, P[A] = 0.80 − 0.30 = 0.50. Since P[A ∩ B] = P[B|A]P[A], we have that

P[B|A] = P[A ∩ B] / P[A] = 0.15 / 0.50 = 0.30
EXAMPLE 1.10
A student went to the post office to send a priority mail to his parents. He gave the postal lady a bill
he believed was $20. However, the postal lady gave him change based on her belief that she
received a $10 bill from the student. The student started to dispute the change. Both the student
and the postal lady are honest but may make mistakes. If the postal lady’s drawer contains thirty
$20 bills and twenty $10 bills, and the postal lady correctly identifies bills 90% of the time, what is
the probability that the student’s claim is valid?
Solution:
Let A denote the event that the student gave a $10 bill and B the event that the student gave a $20
bill. Let V denote the event that the student’s claim is valid. Finally, let L denote the event that the
postal lady said that the student gave her a $10 bill. Since there are 30 $20 bills and 20 $10 bills
in the drawer, the probability that the money the student gave the postal lady was a $20 bill is
30/(20 + 30) = 0.6, and the probability that it was a $10 bill is 1 − 0.6 = 0.4. Thus, P[B] = 0.6 and
P[A] = 0.4. Since the postal lady correctly identifies bills 90% of the time, P[L|A] = 0.9 and P[L|B] = 0.1.
Therefore, the probability that the student’s claim is valid is the probability that he gave a $20 bill,
given that the postal lady said that the student gave her a $10 bill. Using Bayes’ formula we obtain

P[V] = P[B|L] = P[L|B]P[B] / {P[L|B]P[B] + P[L|A]P[A]} = (0.1)(0.6) / {(0.1)(0.6) + (0.9)(0.4)}
     = 0.06 / 0.42 = 0.143
EXAMPLE 1.11
An aircraft maintenance company bought a piece of equipment for detecting structural defects in air-
craft. Tests indicate that 95% of the time the equipment detects defects when they actually exist,
and 1% of the time it gives a false alarm (that is, it indicates the presence of a structural defect
when in fact there is none). If 2% of the aircraft actually have structural defects, what is the
probability that an aircraft actually has a structural defect given that the equipment indicates that
it has a structural defect?
Solution:
Let D denote the event that an aircraft has a structural defect and B the event that the test indi-
cates that there is a structural defect. Then we are required to find P[D|B]. Using Bayes’ formula
we obtain
P[D|B] = P[D ∩ B] / P[B] = P[B|D]P[D] / {P[B|D]P[D] + P[B|D̄]P[D̄]}
        = (0.95)(0.02) / {(0.95)(0.02) + (0.01)(0.98)} = 0.660
Thus, only 66% of the aircraft that the equipment diagnoses as having structural defects actually
have structural defects.
FIGURE 1.7
Tree Diagram for Three Tosses of a Coin: from the root, each toss branches to H with probability p and to T with probability 1 − p, ending in the eight outcomes HHH, HHT, HTH, HTT, THH, THT, TTH, TTT
Now, A ∪ B = {HHH, HHT, HTH, HTT, TTH, TTT} and we have that

P[A ∪ B] = P[HHH] + P[HHT] + P[HTH] + P[HTT] + P[TTH] + P[TTT]
         = p³ + 2p²(1 − p) + 2p(1 − p)² + (1 − p)³
         = p²{p + 2(1 − p)} + (1 − p)²{2p + (1 − p)} = 1 − p + p²
         = 1 − p(1 − p)
EXAMPLE 1.12
A university has twice as many undergraduate students as graduate students. 25% of the graduate
students live on campus and 10% of the undergraduate students live on campus.
a. If a student is chosen at random from the student population, what is the probability that
the student is an undergraduate student living on campus?
b. If a student living on campus is chosen at random, what is the probability that the student
is a graduate student?
Solution:
We use the tree diagram to solve the problem. Since there are twice as many undergraduate stu-
dents as there are graduate students, the proportion of undergraduate students in the population
is 2/3, and the proportion of graduate students is 1/3. These as well as the other data are shown as
the labels on the branches of the tree in Figure 1.8. In the figure G denotes graduate student, U
denotes undergraduate student, ON denotes living on campus, and OFF denotes living off campus.
a. From the figure we see that the probability that a randomly selected student is an undergraduate
student living on campus is 0.067. We can also solve the problem directly as follows. We are
required to find the probability of choosing an undergraduate student who lives on campus, which
is P[U → ON], the probability of first going on the U branch and then to the ON branch from there.
That is,

P[U → ON] = (2/3)(0.10) = 0.067
FIGURE 1.8
Figure for Example 1.12: tree diagram of the student population (U with probability 2/3 and G with probability 1/3, each branch splitting into ON and OFF campus)
b. From the tree, the probability that a student lives on campus is P[U → ON] + P[G → ON] =
0.067 + 0.083 = 0.15. Thus, the probability that a randomly selected student living on cam-
pus is a graduate student is P[G → ON]/{P[U → ON] + P[G → ON]} = 0.083/0.15 = 0.55. Note
that we can also use Bayes’ theorem to solve the problem as follows:

P[G|ON] = P[ON|G]P[G] / {P[ON|U]P[U] + P[ON|G]P[G]} = (0.25)(1/3) / {(0.25)(1/3) + (0.10)(2/3)}
        = 5/9 ≈ 0.55
EXAMPLE 1.13
A multiple-choice exam consists of 4 choices per question. On 75% of the questions, Pat thinks
she knows the answer; and on the other 25% of the questions, she just guesses at random. Unfor-
tunately even when she thinks she knows the answer, Pat is right only 80% of the time.
a. What is the probability that Pat answers a question correctly?
b. Given that she answers a question correctly, what is the probability that it was a lucky guess?
Solution:
We can use the tree diagram as follows. There are two branches at the root labeled K for
the event “Pat thinks she knows,” and K̄ for the event “Pat does not know.” Under event K,
she is correct (C) with probability 0.80 and not correct (C̄) with probability 0.20. Under
event K̄, she is correct with probability 0.25 because she is equally likely to choose any of the
4 answers; therefore, she is not correct with probability 0.75. The tree diagram is shown in
Figure 1.9.
FIGURE 1.9
Figure for Example 1.13: the branch K (probability 0.75) leads to C with probability 0.80 and to C̄ with probability 0.20, while the branch K̄ (probability 0.25) leads to C with probability 0.25 and to C̄ with probability 0.75
a. The probability that Pat answers a question correctly is

P[Correct Answer] = P[K → C] + P[K̄ → C] = (0.75)(0.80) + (0.25)(0.25) = 0.6625

b. Given that she gets a question correct, the probability that it was a lucky guess is given by

P[Lucky Guess|Correct Answer] = P[K̄ → C] / P[Correct Answer] = (0.25)(0.25) / 0.6625
                              = 0.0943
P[Ai ∩ Aj ∩ ... ∩ An] = P[Ai]P[Aj] ... P[An]

This is true for all 1 ≤ i < j < k < ... ≤ n. That is, these events are pairwise inde-
pendent, independent in triplets, and so on.
EXAMPLE 1.14
A red die and a blue die are rolled together. What is the probability that we obtain 4 on the red die
and 2 on the blue die?
Solution:
Let R denote the event “4 on the red die” and let B denote the event “2 on the blue die.” We are,
therefore, required to find P[R ∩ B]. Since the outcome of one die does not affect the outcome of the
other die, the events R and B are independent. Thus, since P[R] = 1/6 and P[B] = 1/6, we have that
P[R ∩ B] = P[R]P[B] = 1/36.
EXAMPLE 1.15
Two coins are tossed. Let A denote the event “at most one head on the two tosses” and let B denote
the event “one head and one tail in both tosses.” Are A and B independent events?
Solution:
The sample space of the experiment is Ω = {HH, HT, TH, TT}. Now, the two events are defined as
follows: A = {HT, TH, TT} and B = {HT, TH}. Also, A ∩ B = {HT, TH}. Thus,

P[A] = 3/4
P[B] = 2/4 = 1/2
P[A ∩ B] = 2/4 = 1/2
P[A]P[B] = (3/4)(1/2) = 3/8
Since P[A ∩ B] ≠ P[A]P[B], we conclude that events A and B are not independent. Note that B is a
subset of A with P[B] > 0 and P[A] < 1, which confirms that they cannot be independent.
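The independence test in this example can also be carried out by enumeration. The sketch below is an illustration, not part of the text: it recomputes P[A], P[B], and P[A ∩ B] for the two-coin experiment and checks whether P[A ∩ B] = P[A]P[B].

```python
# An illustrative check of Example 1.15: test independence by comparing
# P[A ∩ B] with P[A]P[B] on the two-coin-toss sample space.
from fractions import Fraction
from itertools import product

omega = ["".join(t) for t in product("HT", repeat=2)]   # {HH, HT, TH, TT}

def P(event: set) -> Fraction:
    return Fraction(len(event), len(omega))

A = {w for w in omega if w.count("H") <= 1}   # at most one head
B = {w for w in omega if w.count("H") == 1}   # exactly one head and one tail

print(P(A), P(B), P(A & B))                   # 3/4, 1/2, 1/2
print(P(A & B) == P(A) * P(B))                # False: A and B are not independent
```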
Proposition 1.2
If A and B are independent events, then so are events A and B̄, events Ā and B, and events Ā and B̄.
Proof
Event A can be written as follows: A = (A ∩ B) ∪ (A ∩ B̄). Since the events (A ∩ B) and (A ∩ B̄) are
mutually exclusive, we may write

P[A] = P[A ∩ B] + P[A ∩ B̄]
     = P[A]P[B] + P[A ∩ B̄]