Class 12 Chapter 13 Maths Important Formulas

The document discusses key concepts in probability such as conditional probability, the multiplication theorem, independent events, random variables, Bayes' theorem, Bernoulli trials, mean, variance, and probability functions. It also covers connecting concepts such as partition of a sample space, the total probability theorem, hypotheses in Bayes' theorem, probability distributions of random variables, and the binomial distribution. These concepts are fundamental to understanding probability, random variables, and common distributions such as the binomial distribution.

Uploaded by Hari om
Copyright © All Rights Reserved

13 PROBABILITY

KEY CONCEPT INVOLVED
1. Conditional Probability – Let E and F be two events of a random experiment. The probability of occurrence of E under the condition that F has already occurred, where P (F) ≠ 0, is called the conditional probability. It is denoted by P (E/F).
The conditional probability is given by P (E/F) = P (E ∩ F) / P (F), when P (F) ≠ 0
Properties of conditional probability –
(i) If F is an event of a sample space S of an experiment, then P (S/F) = P (F/F) = 1
(ii) If A and B are any two events of a sample space S and F is an event of S such that P (F) ≠ 0, then
P ((A ∪ B)/F) = P (A/F) + P (B/F) – P ((A ∩ B)/F)
If A and B are disjoint events, then P ((A ∪ B)/F) = P (A/F) + P (B/F)
(iii) P (E′/F) = 1 – P (E/F)
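The defining formula can be checked numerically; the sketch below assumes a fair die and two illustrative events (E = "even number", F = "number greater than 3") that are not part of the formula sheet itself:

```python
# Minimal sketch: verify P (E/F) = P (E ∩ F) / P (F) on a fair die.
# E ("even number") and F ("greater than 3") are illustrative events,
# not taken from the formula sheet.
from fractions import Fraction

sample_space = {1, 2, 3, 4, 5, 6}
E = {2, 4, 6}   # even outcomes
F = {4, 5, 6}   # outcomes greater than 3

def prob(event):
    # Classical probability: favourable outcomes / total outcomes
    return Fraction(len(event), len(sample_space))

p_E_given_F = prob(E & F) / prob(F)   # P (E ∩ F) / P (F)
print(p_E_given_F)  # 2/3: of the outcomes 4, 5, 6, two are even
```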
2. Multiplication Theorem on Probability – Let E and F be two events associated with a sample space S. P (E ∩ F) denotes the probability of the event that both E and F occur, which is given by
P (E ∩ F) = P (E) P (F/E) = P (F) P (E/F), provided P (E) ≠ 0 and P (F) ≠ 0
3. Independent Events –
(i) Events E and F are independent if P (E ∩ F) = P (E) × P (F)
(ii) Two events E and F are said to be independent if P (E/F) = P (E) and P (F/E) = P (F), provided P (E) ≠ 0 and P (F) ≠ 0
(iii) Three events E, F and G are said to be mutually independent if they are pairwise independent and P (E ∩ F ∩ G) = P (E) P (F) P (G).
4. Random Variable – A random variable is a real-valued function whose domain is the sample space of a random experiment.
5. Bayes' Theorem – Let E1, E2, ..., En be n events forming a partition of the sample space S, i.e. E1, E2, ..., En are pairwise disjoint and E1 ∪ E2 ∪ ... ∪ En = S, and let A be any event of non-zero probability. Then
P (Ei/A) = P (Ei) P (A/Ei) / [P (E1) P (A/E1) + P (E2) P (A/E2) + ... + P (En) P (A/En)], for any i = 1, 2, ..., n
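The theorem can be sketched with a made-up two-bag example (the bags, their contents, and the event "a red ball is drawn" are assumptions for illustration only):

```python
# Minimal sketch of Bayes' theorem with a made-up two-bag example:
# bag 1 holds 3 red and 1 black ball, bag 2 holds 1 red and 3 black;
# a bag is picked at random and a red ball (event A) is drawn.
from fractions import Fraction

p_E = [Fraction(1, 2), Fraction(1, 2)]          # P (E1), P (E2): bag chosen
p_A_given_E = [Fraction(3, 4), Fraction(1, 4)]  # P (A/E1), P (A/E2)

def posterior(i):
    # P (Ei/A) = P (Ei) P (A/Ei) / sum over j of P (Ej) P (A/Ej)
    total = sum(p * q for p, q in zip(p_E, p_A_given_E))
    return p_E[i] * p_A_given_E[i] / total

print(posterior(0))  # 3/4: the red ball most likely came from bag 1
```

Note how the denominator is exactly the total probability of A over the partition, so the posteriors always sum to 1.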
6. Bernoulli Trials – Trials of a random experiment are said to be Bernoulli trials if they satisfy the following conditions:
(i) The trials are independent.
(ii) Each trial has exactly two outcomes, e.g. success or failure.
(iii) The probability of success remains the same in each trial.
(iv) The number of trials is finite.
7. Mean of a Random Variable – Let X be a random variable whose possible values are x1, x2, ..., xn and let p1, p2, ..., pn be the corresponding probabilities. Then the mean of X is
μ = Σ xi pi (sum over i = 1 to n) = E (X)
The mean of a random variable X is also called the expected value of X, denoted by E (X).
8. Variance of a Random Variable – Let X be a random variable whose possible values x1, x2, ..., xn occur with probabilities p1, p2, ..., pn respectively, and let μ = E (X) be the mean of X. The variance of X, denoted by Var (X) or σx², is defined as
Var (X) = σx² = Σ (xi – μ)² pi (sum over i = 1 to n) = E (X – μ)² = E (X²) – [E (X)]²
Standard Deviation, σx = √Var (X)
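The mean and variance definitions above can be applied to a concrete distribution; this sketch assumes X = number shown by one roll of a fair die:

```python
# Minimal sketch: mean, variance and standard deviation of X = number
# shown by one roll of a fair die, computed from the definitions above.
from fractions import Fraction
from math import sqrt

values = [1, 2, 3, 4, 5, 6]
probs = [Fraction(1, 6)] * 6

mean = sum(x * p for x, p in zip(values, probs))      # E (X)
e_x2 = sum(x * x * p for x, p in zip(values, probs))  # E (X^2)
var = e_x2 - mean ** 2                                # E (X^2) - [E (X)]^2
std = sqrt(var)                                       # sigma_x

print(mean, var)  # 7/2 35/12
```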
9. Probability Function – The probability of x successes is denoted by P (X = x) or P (x) and is given by
P (x) = nCx q^(n – x) p^x, x = 0, 1, 2, ..., n, where q = 1 – p
The function P (x) is known as the probability function of the binomial distribution.
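The probability function translates directly into code; the sketch below uses the standard-library math.comb for nCx and an illustrative fair-coin example:

```python
# Minimal sketch of the probability function
# P (x) = nCx * q^(n - x) * p^x, using math.comb for nCx.
from fractions import Fraction
from math import comb

def binomial_pmf(n, p, x):
    # Probability of exactly x successes in n Bernoulli trials
    q = 1 - p
    return comb(n, x) * q ** (n - x) * p ** x

# Illustrative example: exactly 2 heads in 4 tosses of a fair coin.
print(binomial_pmf(4, Fraction(1, 2), 2))  # 3/8
```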
CONNECTING CONCEPTS
1. Partition of a Sample Space – A set of events E1, E2, ..., En is said to represent a partition of a sample space S if
(i) Ei ∩ Ej = φ if i ≠ j, i, j = 1, 2, ..., n
(ii) E1 ∪ E2 ∪ E3 ∪ ... ∪ En = S
(iii) P (Ei) > 0 for all i = 1, 2, ..., n
2. Theorem of Total Probability – Let {E1, E2, ..., En} be a partition of the sample space S, with each event having non-zero probability. If A is any event associated with S, then
P (A) = P (E1) P (A/E1) + P (E2) P (A/E2) + P (E3) P (A/E3) + ... + P (En) P (A/En)
i.e. P (A) = Σ P (Ei) P (A/Ei), the sum running over i = 1 to n
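The sum formula can be sketched with a made-up two-bag example (the partition is "which bag was picked"; the numbers are assumptions for illustration):

```python
# Minimal sketch of the total probability theorem with a made-up
# two-bag example: P (A) = sum over i of P (Ei) P (A/Ei).
from fractions import Fraction

p_E = [Fraction(1, 2), Fraction(1, 2)]          # partition: which bag was picked
p_A_given_E = [Fraction(3, 4), Fraction(1, 4)]  # P (red ball / each bag)

p_A = sum(p * q for p, q in zip(p_E, p_A_given_E))
print(p_A)  # 1/2
```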
3. A Few Terminologies –
(i) Hypotheses – When Bayes' theorem is applied, the events E1, E2, ..., En are called hypotheses.
(ii) Priori Probability – The probabilities P (E1), P (E2), ..., P (En) are called priori probabilities.
(iii) Posteriori Probability – The conditional probability P (Ei/A) is known as the posteriori probability of the hypothesis Ei, where i = 1, 2, ..., n
4. Probability Distribution of a Random Variable – Let the real numbers x1, x2, ..., xn be the possible values of a random variable X and p1, p2, ..., pn be the probabilities corresponding to each value. Then the probability distribution is
X :    x1   x2   ...   xn
P(X) : p1   p2   ...   pn
where (i) pi > 0 and (ii) the sum of the probabilities p1 + p2 + ... + pn = 1.
5. Binomial Distribution – The probability distribution of the number of successes in an experiment consisting of n Bernoulli trials is obtained by the binomial expansion of (q + p)^n. Such a probability distribution is
X :    0        1               2                ...  r                ...  n
P(X) : nC0 q^n  nC1 q^(n – 1) p  nC2 q^(n – 2) p²  ...  nCr q^(n – r) p^r  ...  nCn p^n
This probability distribution is called the binomial distribution with parameters n and p, where p is the probability of success in each trial and q is the probability of failure in each trial, so that p + q = 1, q = 1 – p.
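The whole distribution table can be generated and sanity-checked in a few lines; this sketch assumes n = 3 tosses of a fair coin:

```python
# Minimal sketch: build the binomial distribution table for n = 3
# fair-coin tosses and check the probabilities sum to (q + p)^n = 1.
from fractions import Fraction
from math import comb

def binomial_table(n, p):
    q = 1 - p
    return [comb(n, r) * q ** (n - r) * p ** r for r in range(n + 1)]

table = binomial_table(3, Fraction(1, 2))
print([str(t) for t in table])  # ['1/8', '3/8', '3/8', '1/8']
print(sum(table))               # 1
```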
https://www.evidyarthi.in/