Matematik Tambahan Tugasan Suhaimi
No  Contents
1   Introduction
2   Part 1
3   Part 2
4   Part 3
5   Part 4
6   Part 5
7   Conclusion
INTRODUCTION
Probability is a way of expressing knowledge or belief that an event will occur or has occurred.
In mathematics the concept has been given an exact meaning in probability theory, that is used
extensively in such areas of study as mathematics, statistics, finance, gambling, science, and
philosophy to draw conclusions about the likelihood of potential events and the underlying
mechanics of complex systems.
Interpretations
The word probability does not have a single, consistent definition. In fact, there are two
broad categories of probability interpretations, whose adherents hold different (and
sometimes conflicting) views about the fundamental nature of probability:
1. Frequentists talk about probabilities only when dealing with experiments that are
random and well-defined. The probability of a random event denotes the relative
frequency of occurrence of an experiment's outcome, when repeating the experiment.
Frequentists consider probability to be the relative frequency "in the long run" of
outcomes.[1]
2. Bayesians, however, assign probabilities to any statement whatsoever, even when no
random process is involved. Probability, for a Bayesian, is a way to represent an
individual's degree of belief in a statement, or an objective degree of rational belief,
given the evidence.
Part 1
a) The history of probability theory
The branch of probability was further developed by two mathematicians, Pierre de Fermat
and Blaise Pascal, in the seventeenth century. They corresponded about the problem of
points, a mathematical problem concerning probability, building upon the earlier work of
Gerolamo Cardano.
In 1657, the field of probability was strengthened by Christiaan Huygens who, encouraged
by Pascal, published a book on the subject, De ratiociniis in ludo aleae ("On Reasoning in
Games of Chance"). It is regarded as the first published book on probability theory, since it
appeared in print before Cardano's own treatise.
Later, the branch evolved into modern probability theory with the help of Andrey
Nikolaevich Kolmogorov, who laid the foundations of modern probability. He combined the
notion of a sample space with measure theory, creating his axiom system in 1933. This
became the standard mathematical formulation of probability and remains so today.
In our daily life, the theory of probability appears in many situations. One real-life example
is card games such as Blackjack. There are 52 cards in a deck and each player receives two
cards. The best hand a player can receive is an ace together with a picture card. In terms of
probability, each player, including the banker, has a probability of 8/221 (about 3.6%) of
being dealt 21 points in the first deal, regardless of the number of players.
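This count can be checked by enumerating all ordered two-card deals; the sketch below, in Python, draws the two cards without replacement (drawing with replacement instead would give 6/169):

```python
from fractions import Fraction
from itertools import permutations

# One entry per card; only the rank matters for this count.
ranks = [str(r) for r in range(2, 11)] + ["J", "Q", "K", "A"]
deck = [r for r in ranks for _ in range(4)]  # 52 cards

picture = {"J", "Q", "K"}

# Ordered two-card deals consisting of an ace and a picture card.
hits = sum(1 for a, b in permutations(deck, 2)
           if (a == "A" and b in picture) or (b == "A" and a in picture))

prob = Fraction(hits, 52 * 51)
print(prob)  # 8/221
```

There are 2 x 4 x 12 = 96 favourable ordered deals out of 52 x 51 = 2652, which reduces to 8/221.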
Alternatively, the theory of probability is used in business to evaluate the risk in a decision:
a higher probability of success implies a lower chance of failing.
Probability can help us understand seemingly random forces and manage risk. It is
therefore important for us to study probability.
Theoretical probability is the branch of probability concerned with the theory: no
experiment is performed, and all results are based purely on calculation.
Empirical probability, as its name suggests, is based on experiments and the results
obtained from them.
Mathematical treatment
The opposite or complement of an event A is the event [not A] (that is, the event of A not
occurring); its probability is given by P(not A) = 1 - P(A).[7] As an example, the chance of not
rolling a six on a six-sided die is 1 - 1/6 = 5/6. See Complementary
event for a more complete treatment.
If both the events A and B occur on a single performance of an experiment this is called the
intersection or joint probability of A and B, denoted as P(A and B). If two events, A and B, are
independent then the joint probability is
P(A and B) = P(A)P(B).[8]
For example, if two coins are flipped the chance of both being heads is 1/2 x 1/2 = 1/4.
If either event A or event B or both events occur on a single performance of an experiment this
is called the union of the events A and B, denoted as P(A or B). If two events are mutually
exclusive then the probability of either occurring is
P(A or B) = P(A) + P(B).
If the events are not mutually exclusive then
P(A or B) = P(A) + P(B) - P(A and B).
For example, when drawing a single card at random from a regular deck of cards, the chance of
getting a heart or a face card (J, Q, K) (or one that is both) is 13/52 + 12/52 - 3/52 = 11/26, because of
the 52 cards of a deck 13 are hearts, 12 are face cards, and 3 are both: here the possibilities
included in the "3 that are both" are included in each of the "13 hearts" and the "12 face cards"
but should only be counted once.
Conditional probability is the probability of some event A, given the occurrence of some other
event B. Conditional probability is written P(A|B), and is read "the probability of A, given B". It is
defined by
P(A|B) = P(A and B) / P(B).[9]
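The complement, union and conditional-probability rules above can be verified directly on the card example, by counting cards as sets (a small Python sketch):

```python
from fractions import Fraction
from itertools import product

# Sample space: a standard 52-card deck as (rank, suit) pairs.
ranks = [str(r) for r in range(2, 11)] + ["J", "Q", "K", "A"]
suits = ["hearts", "diamonds", "clubs", "spades"]
deck = set(product(ranks, suits))

def p(event):
    # Probability of drawing a card in `event` from the full deck.
    return Fraction(len(event), len(deck))

hearts = {c for c in deck if c[1] == "hearts"}
faces = {c for c in deck if c[0] in ("J", "Q", "K")}

print(p(deck - hearts))              # complement: 1 - 13/52 = 3/4
print(p(hearts | faces))             # union: 13/52 + 12/52 - 3/52 = 11/26
print(p(hearts & faces) / p(faces))  # conditional: P(heart | face card) = 1/4
```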
Summary of probabilities
Event       Probability
A           P(A), with 0 <= P(A) <= 1
not A       P(not A) = 1 - P(A)
A or B      P(A or B) = P(A) + P(B) - P(A and B)
A and B     P(A and B) = P(A|B)P(B), which equals P(A)P(B) if A and B are independent
A given B   P(A|B) = P(A and B) / P(B)
Theory
Probability theory
There have been at least two successful attempts to formalize probability, namely the
Kolmogorov formulation and the Cox formulation. In Kolmogorov's formulation (see probability
space), sets are interpreted as events and probability itself as a measure on a class of sets. In
Cox's theorem, probability is taken as a primitive (that is, not further analyzed) and the
emphasis is on constructing a consistent assignment of probability values to propositions. In
both cases, the laws of probability are the same, except for technical details.
There are other methods for quantifying uncertainty, such as the Dempster-Shafer theory or
possibility theory, but those are essentially different and not compatible with the laws of
probability as they are usually understood.
Empirical probability
Empirical probability, also known as relative frequency or experimental probability, is the ratio
of the number of favorable outcomes to the total number of trials,[1][2] not in a sample space
but in an actual sequence of experiments. In a more general sense, empirical probability
estimates probabilities from experience and observation.[3] The phrase a posteriori probability
has also been used as an alternative to empirical probability or relative frequency.[4] This
unusual usage of the phrase is not directly related to Bayesian inference and is not to be
confused with its equally occasional use to refer to posterior probability, which is something
else.
Two major applications of probability theory in everyday life are in risk assessment and in trade
on commodity markets. Governments typically apply probabilistic methods in environmental
regulation where it is called "pathway analysis", often measuring well-being using methods that
are stochastic in nature, and choosing projects to undertake based on statistical analyses of
their probable effect on the population as a whole.
A good example is the effect of the perceived probability of any widespread Middle East conflict
on oil prices - which have ripple effects in the economy as a whole. An assessment by a
commodity trader that a war is more likely vs. less likely sends prices up or down, and signals
other traders of that opinion. Accordingly, the probabilities are not assessed independently nor
necessarily very rationally. The theory of behavioral finance emerged to describe the effect of
such groupthink on pricing, on policy, and on peace and conflict.
It can reasonably be said that the discovery of rigorous methods to assess and combine
probability assessments has had a profound effect on modern society. Accordingly, it may be of
some importance to most citizens to understand how odds and probability assessments are
made, and how they contribute to reputations and to decisions, especially in a democracy.
Part 2
a) List of all the possible outcomes when the die is tossed once
S = {1, 2, 3, 4, 5, 6}
b) List of all the possible outcomes when two dice are tossed simultaneously:
First die 1: (1,1), (1,2), (1,3), (1,4), (1,5), (1,6)
First die 2: (2,1), (2,2), (2,3), (2,4), (2,5), (2,6)
First die 3: (3,1), (3,2), (3,3), (3,4), (3,5), (3,6)
First die 4: (4,1), (4,2), (4,3), (4,4), (4,5), (4,6)
First die 5: (5,1), (5,2), (5,3), (5,4), (5,5), (5,6)
First die 6: (6,1), (6,2), (6,3), (6,4), (6,5), (6,6)
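The 36 ordered outcomes above can be generated programmatically; a minimal Python sketch:

```python
from itertools import product

# All ordered outcomes (first die, second die) when two dice are tossed.
outcomes = list(product(range(1, 7), repeat=2))

print(len(outcomes))  # 36 equally likely outcomes
print(outcomes[:6])   # the branch for first die = 1: (1,1) ... (1,6)
```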
Part 3
Table 1 shows the sum of all dots on both turned-up faces when two dice are tossed simultaneously.
A) Complete Table 1 by listing all possible outcomes and their corresponding probabilities.
Sum (x)  Possible outcomes                           Probability
2        (1,1)                                       1/36
3        (1,2), (2,1)                                1/18
4        (1,3), (2,2), (3,1)                         1/12
5        (1,4), (2,3), (3,2), (4,1)                  1/9
6        (1,5), (2,4), (3,3), (4,2), (5,1)           5/36
7        (1,6), (2,5), (3,4), (4,3), (5,2), (6,1)    1/6
8        (2,6), (3,5), (4,4), (5,3), (6,2)           5/36
9        (3,6), (4,5), (5,4), (6,3)                  1/9
10       (4,6), (5,5), (6,4)                         1/12
11       (5,6), (6,5)                                1/18
12       (6,6)                                       1/36
Table 1
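Table 1 can be reproduced by tallying the sums of all 36 equally likely outcomes; a short Python sketch using exact fractions:

```python
from collections import Counter
from fractions import Fraction
from itertools import product

# Tally the sum of the dots over all 36 equally likely ordered outcomes.
counts = Counter(a + b for a, b in product(range(1, 7), repeat=2))

# Probability of each sum as an exact fraction.
probs = {s: Fraction(c, 36) for s, c in counts.items()}

for s in range(2, 13):
    print(s, probs[s])
```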
B) Based on Table 1, which I have completed, the possible outcomes of the following events
and their corresponding probabilities are listed below.
A = {(1,2),(1,3),(1,4),(1,5),(1,6),(2,1),(2,3),(2,4),(2,5),(2,6),(3,1),(3,2),(3,4),(3,5),(3,6),
(4,1),(4,2),(4,3),(4,5),(4,6),(5,1),(5,2),(5,3),(5,4),(5,6),(6,1),(6,2),(6,3),(6,4),(6,5)}
B = { }
C = {(2,2),(2,3),(2,5),(3,2),(3,3),(3,5),(5,2),(5,3),(5,5),(1,2),(1,4),(1,6),(2,1),(3,4),
(3,6),(4,1),(4,3),(4,5),(5,4),(5,6),(6,1),(6,3),(6,5)}
D = {(2,2),(3,3),(3,5),(5,3),(5,5)}
P(A) = 30/36 = 5/6
P(B) = 0
P(C) = 23/36
P(D) = 5/36
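Since all 36 ordered outcomes are equally likely, each probability above is just the number of outcomes in the event divided by 36. A minimal check in Python, using events B and D as written out above:

```python
from fractions import Fraction

# Events from Part 3(b), written out as sets of ordered dice pairs.
D = {(2, 2), (3, 3), (3, 5), (5, 3), (5, 5)}
B = set()  # an impossible event contains no outcomes

def p(event):
    # 36 equally likely ordered outcomes for two dice.
    return Fraction(len(event), 36)

print(p(D))  # 5/36
print(p(B))  # 0
```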
Part 4
Sum of two numbers (x)   Frequency (f)   fx    x^2   fx^2
2                        1               2     4     4
3                        3               9     9     27
4                        3               12    16    48
5                        7               35    25    175
6                        4               24    36    144
7                        8               56    49    392
8                        7               56    64    448
9                        8               72    81    648
10                       2               20    100   200
11                       5               55    121   605
12                       2               24    144   288
Sum f = 50, Sum fx = 365, Sum fx^2 = 2979
Table 2
(1) Mean = 365/50 = 7.3
(2) Variance = 2979/50 - 7.3^2 = 6.29
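The grouped-data mean and variance above can be computed directly from the frequency table; a Python sketch using the 50-toss frequencies:

```python
# Observed frequencies of the sum of two dice in 50 tosses (Table 2).
freq = {2: 1, 3: 3, 4: 3, 5: 7, 6: 4, 7: 8, 8: 7, 9: 8, 10: 2, 11: 5, 12: 2}

n = sum(freq.values())                             # total tosses
mean = sum(x * f for x, f in freq.items()) / n     # Sum fx / Sum f
var = sum(x * x * f for x, f in freq.items()) / n - mean ** 2

print(n, mean, var)  # n = 50, mean = 7.3, variance about 6.29
```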
(B) Predict the value of the mean if the number of tosses is increased to 100 times.
Prediction: the mean will stay close to the theoretical value of 7.
(c) Test your prediction in (b) by continuing Activity 3(a) until the number of tosses is 100
times. Then, determine the value of:
Variance = (Sum fx^2 / Sum f) - (mean)^2
Sum of the two numbers (x)   Frequency (f)   fx    x^2   fx^2
2                            2               4     4     8
3                            6               18    9     54
4                            8               32    16    128
5                            13              65    25    325
6                            13              78    36    468
7                            16              112   49    784
8                            14              112   64    896
9                            12              108   81    972
10                           10              100   100   1000
11                           4               44    121   484
12                           2               24    144   288
Sum f = 100, Sum fx = 697, Sum fx^2 = 5407
Mean = 697/100 = 6.97
Variance = 5407/100 - 6.97^2 = 5.49
Alternative method
~ The tossing of the dice may be simulated using a graphing calculator or a computer program.
Part 5
(A)
B) Compare the mean, variance and standard deviation obtained in Part 4 and Part 5. What can
you say about the values? Explain in your own words your interpretation and your
understanding of the values that you have obtained, and relate your answer to the theoretical
and empirical probabilities.
Mean: 7.3 (Part 4, 50 tosses), 6.97 (100 tosses), 7 (theoretical)
(c) If n is the number of times two dice are tossed simultaneously, what is the range of the mean
of the sum of all dots on the turned-up faces as n changes? Make your conjecture and support
your conjecture.
n      mean
50     7.3
100    6.97
Conjecture: as n changes, the mean of the sum stays in the range between 6 and 8.
SUPPORTING THE CONJECTURE
[Probability distribution graph of the sum of two dice: probabilities from 0 to about 0.16 plotted against the sums.]
~ From the probability distribution graph, it can be seen that the mean of the sum lies between 6 and 8.
FURTHER EXPLORATION
The law of large numbers (LLN) states that the sample average of independent and identically
distributed random variables with finite expectation mu converges towards the theoretical
expectation mu.
It is the form of convergence of the random variables that separates the weak and the strong
law of large numbers.
It follows from the LLN that if an event of probability p is observed repeatedly during
independent experiments, the ratio of the observed frequency of that event to the total number
of repetitions converges towards p.
Putting this in terms of random variables and the LLN: let Y1, Y2, Y3, ... be independent
Bernoulli random variables taking the value 1 with probability p and 0 with probability 1 - p.
Then E(Yi) = p for all i, and by the LLN the sample mean (Y1 + ... + Yn)/n converges towards p.
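The convergence described by the LLN can be observed numerically; the sketch below simulates Bernoulli trials for an arbitrarily chosen p (the seed is fixed only for reproducibility):

```python
import random

random.seed(0)  # reproducible demonstration

p = 0.3  # an arbitrary success probability for illustration

def sample_mean(n):
    # Average of n independent Bernoulli(p) outcomes (1 = success, 0 = failure).
    return sum(1 if random.random() < p else 0 for _ in range(n)) / n

# The running average drifts towards p as the number of trials grows.
for n in (100, 10_000, 1_000_000):
    print(n, sample_mean(n))
```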
REFLECTION