

How to Prepare Probability for SSC CGL Tier II - Study Notes in PDF


The SSC will be conducting the SSC CGL Tier II Exam soon! The SSC CGL Prelims Exam
was conducted from 5th August to 24th August 2017, and this year the exam pattern for the
Prelims Exam also underwent some major changes. There are a total of 4733 vacancies
to be filled this year. If you are confident that you will make it through to the SSC CGL
Tier II, read the article given below. It will help you prepare Probability for the SSC CGL
Tier II exam. In the following article, you will learn in detail about probability theory,
conditional probability, Bayes' theorem, etc. You can also take our SSC CGL Online Mock
Tests to boost your preparation strategy.

Probability for SSC - Probability Theory

The term probability is a quantitative measure of uncertainty; it quantifies the concept
of chance or likelihood. The theory of probability provides a way to measure the chance
of the possible outcomes of a random phenomenon, i.e. of an experiment whose outcome
is uncertain. An event that is certain to occur has probability 1 (100%), and an event that
cannot occur has probability 0.

For example, if the probability of a particular event is 1/4, there is a 25% chance that
the event will occur and a 75% chance that it will not.

Any operation whose outcome is well defined is called an experiment. A random
experiment is an experiment in which all the possible outcomes are known in advance
but the exact outcome cannot be predicted in advance. The list of possible outcomes of
a random experiment must be exhaustive and mutually exclusive.

 For example, in the roll of a die there are 6 outcomes, and the probability of each
outcome is 1/6.
 For example, in the toss of two coins there are 4 outcomes, and the probability of
each outcome is 1/4 (both counts are checked in the short sketch below).
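
As a quick illustration (a minimal Python sketch, not part of the original notes; the fair
die and fair coins are the usual assumptions), the outcomes of these two experiments
can be enumerated and each assigned an equal probability:

from fractions import Fraction
from itertools import product

# Roll of a single die: 6 equally likely outcomes
die_outcomes = list(range(1, 7))
print(len(die_outcomes), Fraction(1, len(die_outcomes)))    # 6 1/6

# Toss of two coins: 4 equally likely outcomes (HH, HT, TH, TT)
coin_outcomes = list(product("HT", repeat=2))
print(len(coin_outcomes), Fraction(1, len(coin_outcomes)))  # 4 1/4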

A few important terms associated with probability are:

 Sample Space (S): In a random experiment, the set of all possible outcomes is
known as the sample space.
 Event (E): In a random experiment, a set of favorable outcomes is known as an
event. An event E is a subset of the sample space S.
o Simple event: An event which has only one outcome, for example the
event of getting a head while tossing a coin.
o Compound event: An event which has more than one outcome, for
example the event of getting an odd or an even number while rolling a die.
o Mutually exclusive events: Events which cannot occur together; the
occurrence of one event excludes the occurrence of the other. For example,
if a coin is tossed, the result is a head or a tail, but not both. That is, the
outcomes are defined so as to be mutually exclusive.

Let A and B be two mutually exclusive events. Then

P(A ∩ B) = 0 and P(A ∪ B) = P(A) + P(B)

 If A and B are two mutually exclusive and exhaustive events, then
o P(A) + P(B) = 1
 Equally likely events: Each outcome of the random experiment has an equal
chance of occurring.
 Let A be any event and A′ be its complementary event; then P(A) + P(A′) = 1.

 Union of events (A ∪ B): It occurs if the event A occurs or the event B occurs,
i.e. it consists of all the outcomes that are in A, in B, or in both.

 Intersection of events (A ∩ B): It occurs if the event A occurs and the event B
occurs, i.e. it consists of all the outcomes that are in both A and B (both operations
are illustrated in the sketch below).
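
These set operations can be checked by counting outcomes. The following is a rough
Python sketch; the fair die and the particular events A (even number) and B (number
greater than 3) are assumed for illustration only:

from fractions import Fraction

S = set(range(1, 7))   # sample space of one die roll
A = {2, 4, 6}          # event A: an even number
B = {4, 5, 6}          # event B: a number greater than 3

def prob(event):
    return Fraction(len(event), len(S))

print(prob(A | B))                       # P(A union B) = 4/6 = 2/3
print(prob(A & B))                       # P(A intersection B) = 2/6 = 1/3
print(prob(A) + prob(B) - prob(A & B))   # also 2/3, the general addition rule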

Conditional Probability

Conditional probability defines how the probability of an event changes when we have
extra information.

 For example, if we toss a coin 3 times, the probability of 3 heads is 1/8, since the
sample space is {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}. If we are told that
the first toss came up heads, the sample space shrinks to {HHH, HHT, HTH, HTT}
and the probability of 3 heads becomes 1/4.

This is called conditional probability, as it takes into account additional conditions. It is
denoted as P(A|B) and is read as ‘the conditional probability of A given B’. It can also be
written as

P(A|B) = P(A ∩ B) / P(B), where P(B) > 0

Where, P(A ∩ B) = probability of both A and B occurring

 For example, given a deck of 52 cards, you drew a black card; what is the probability
that it is a four? Out of the 52 cards, 26 are black, and among the black cards there
are two fours. Therefore, P(four | black) = 2/26 = 1/13 (verified by counting in the
sketch below).
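
The card example can be verified by direct counting. This is a minimal Python sketch
assuming the standard 52-card deck with 13 ranks and 4 suits:

from fractions import Fraction
from itertools import product

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["spades", "clubs", "hearts", "diamonds"]
deck = list(product(ranks, suits))

black = [card for card in deck if card[1] in ("spades", "clubs")]
four_and_black = [card for card in black if card[0] == "4"]

# P(four | black) = P(four and black) / P(black)
p_black = Fraction(len(black), len(deck))
p_four_and_black = Fraction(len(four_and_black), len(deck))
print(p_four_and_black / p_black)   # 1/13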

Independent Events: Two events are said to be independent if the occurrence of either
one of the two events does not affect the occurrence of the other event. Two events A
and B are said to be independent if (as the sketch after this list illustrates):

 P(A|B) = P(A)
 P(B|A) = P(B)
 P(A ∩ B) = P(A) * P(B)
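
For instance, with two fair coin tosses (an assumed example, not taken from the notes),
the events 'first toss is a head' and 'second toss is a head' satisfy these conditions, as
this small Python sketch confirms:

from fractions import Fraction
from itertools import product

S = set(product("HT", repeat=2))       # sample space of two coin tosses
A = {s for s in S if s[0] == "H"}      # first toss is a head
B = {s for s in S if s[1] == "H"}      # second toss is a head

def prob(event):
    return Fraction(len(event), len(S))

print(prob(A & B) == prob(A) * prob(B))   # multiplication condition -> True
print(prob(A & B) / prob(B) == prob(A))   # P(A|B) = P(A) -> True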

Compound Probability:

Compound probability is the probability of the joint occurrence of two or more simple
events. For independent events it is equal to the probability of the first event multiplied
by the probability of the second event.

 When X and Y are two independent events, then
o P(X and Y) = P(X) * P(Y)
 When the events are dependent, then
o P(X and Y) = P(X) * P(Y following X)

These are the cases when both parts of the compound event must be true. When at least
one part of the compound event needs to be true, the compound probability is given by

P(X or Y) = P(X) + P(Y) − P(X and Y)

which reduces to P(X or Y) = P(X) + P(Y) when X and Y are mutually exclusive.
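
A short Python sketch of the addition rule, again using an assumed fair-die example
(X = even number, Y = number at most 3):

from fractions import Fraction

S = set(range(1, 7))
X = {2, 4, 6}          # even number
Y = {1, 2, 3}          # number at most 3

def prob(event):
    return Fraction(len(event), len(S))

lhs = prob(X | Y)                         # P(X or Y) = 5/6
rhs = prob(X) + prob(Y) - prob(X & Y)     # 1/2 + 1/2 - 1/6 = 5/6
print(lhs == rhs)                         # True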

Bayes’ Theorem

It describes the probability of an event based on prior knowledge of conditions related
to the event. It is a direct application of conditional probability, i.e. it shows how to find
P(B|A) from P(A|B):

P(B|A) = P(A|B) P(B) / P(A)

Where,

 A and B are events and P(A) ≠ 0
 P(A) and P(B) are the probabilities of observing A and B without regard to each
other
 P(B|A) is the conditional probability of observing event B given that A is true
 P(A|B) is the conditional probability of observing event A given that B is true.

It also exists in a different form, i.e.

P(B|A) = P(A|B) P(B) / [P(A|B) P(B) + P(A|B′) P(B′)]

Where,

 B′ is the event complementary to B.
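
A hedged numerical sketch of Bayes' theorem in Python; the disease-testing probabilities
below are assumed purely for illustration and do not come from the notes:

# P(B|A) = P(A|B) * P(B) / P(A), with P(A) expanded by total probability
p_B = 0.01                  # prior: P(disease)
p_A_given_B = 0.95          # P(positive test | disease)
p_A_given_notB = 0.05       # P(positive test | no disease)

p_A = p_A_given_B * p_B + p_A_given_notB * (1 - p_B)   # total probability
p_B_given_A = p_A_given_B * p_B / p_A                  # Bayes' theorem
print(round(p_B_given_A, 4))                           # approx 0.1610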

Probability for SSC - Random Variable and Probability Distributions

A random variable takes a defined set of values with different probabilities; it takes real
values in accordance with the outcome of the random experiment. For example, if you
roll a die, the outcome is random. A random variable can be discrete or continuous.

 A Discrete Random Variable has a finite or countably infinite number of values,
for example Dead / Alive.
 A Continuous Random Variable has uncountably many possible values, for
example blood pressure.

Probability Distribution Functions:

A probability distribution function describes how probabilities are distributed over the
values of the random variable. It maps the possible values of x against their respective
probabilities of occurrence, p(x), and these probabilities range from 0 to 1.

 Discrete Probability Distributions:

Let X be a discrete random variable. Then the probability mass function (pmf), f(x),
of X is:

f(x) = P(X = x), with f(x) ≥ 0 for every x and Σ f(x) = 1 over all possible values x

 Continuous Probability Distributions:

Let X be a continuous random variable. Then the probability density function (pdf),
f(x), of X satisfies, for any two numbers a and b with a ≤ b:

P(a ≤ X ≤ b) = ∫ f(x) dx, integrated from a to b, with f(x) ≥ 0
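
As a quick sanity check of these definitions (a minimal Python sketch; the fair die pmf
and the uniform pdf on [0, 2] are assumed examples), the pmf values sum to 1 and an
interval probability for the uniform pdf is just the area under f between a and b:

# pmf of a fair die: f(x) = 1/6 for x = 1..6; the probabilities sum to 1
pmf = {x: 1 / 6 for x in range(1, 7)}
print(abs(sum(pmf.values()) - 1.0) < 1e-9)   # True

# pdf of a uniform(0, 2) variable: f(x) = 0.5 on [0, 2]
# P(a <= X <= b) is the area under f between a and b
a, b = 0.5, 1.5
print(0.5 * (b - a))                         # 0.5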

All probability distributions are characterized by an expected value (the mean) and a
variance (the standard deviation squared).

Expected Value: It is the average or mean (µ) of the random variable X, sometimes
called the weighted average. It is an extremely useful concept for good decision making.

 Discrete Expected Value: Let X be a discrete random variable that takes on
values in the set D and has a pmf f(x). Then the expected or mean value of X is:

E(X) = µ = Σ x · f(x), summed over all x in D

 Continuous Expected Value: The expected or mean value of a continuous
random variable X with pdf f(x) is:

E(X) = µ = ∫ x · f(x) dx, integrated from −∞ to ∞

Variance:

 Discrete: Let X be a discrete random variable with pmf f(x) and expected value
µ. Then the variance of X is:

V(X) = σ² = Σ (x − µ)² · f(x), summed over all x in D

 Continuous: The variance of a continuous random variable X with pdf f(x) and
mean µ is:

V(X) = σ² = ∫ (x − µ)² · f(x) dx, integrated from −∞ to ∞
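
Applying these definitions to a concrete discrete case (the fair die again, an assumed
example), the mean and variance can be computed directly in Python:

from fractions import Fraction

pmf = {x: Fraction(1, 6) for x in range(1, 7)}   # fair die

mu = sum(x * p for x, p in pmf.items())                  # E(X) = 7/2
var = sum((x - mu) ** 2 * p for x, p in pmf.items())     # V(X) = 35/12
print(mu, var)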

Higher Moments of a Random Variable:

The moments of a random variable are expected values of powers or related functions
of the random variable. The expected value is called the first moment of a random
variable, and the variance is called the second central moment, or second moment about
the mean, of a random variable.

 The mean is a measure of the “center” or “location” of a distribution.

Binomial Distribution

Let,

 n = number of identical trials

 P(S), P(F) = the two possible outcomes of each trial, i.e. success or failure
o P(S) = p; P(F) = q = 1 − p
 Trials are independent
 x is the number of successes in n trials

1. pmf of the Binomial Distribution: P(X = x) = C(n, x) p^x q^(n − x), for x = 0, 1, 2, …, n

2. cdf of the Binomial Distribution: F(x) = P(X ≤ x) = Σ C(n, k) p^k q^(n − k), summed over k = 0 to x

 For example, if in a class 40% of the students are male, what is the probability
that 6 of the first 10 students walking in will be male? (This is worked out in the
sketch below.)
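
A minimal Python sketch of the binomial pmf applied to this example, assuming the 10
students are independent 'trials' with success probability p = 0.4:

from math import comb

def binom_pmf(x, n, p):
    # P(X = x) = C(n, x) * p**x * (1 - p)**(n - x)
    return comb(n, x) * p**x * (1 - p)**(n - x)

print(round(binom_pmf(6, 10, 0.4), 4))   # approx 0.1115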

Poisson distribution

The Poisson distribution arises for events which do not occur as the outcome of a definite
number of trials of an experiment but which occur at random points of time and space.
It evaluates the probability of a number of occurrences out of many opportunities.

 For example, the number of deaths from diseases such as heart attack or cancer,
or due to snake bite
 For example, the number of suicides reported on a particular day

This distribution is used in situations where ‘events’ happen at certain points in time. It
approximates the binomial distribution when the number of trials is large and the
probability of success is small.
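
A minimal Python sketch; the Poisson pmf P(X = x) = e^(−λ) λ^x / x! is standard, and the
particular values n = 1000, p = 0.002 are assumed only to illustrate how closely the
Poisson (with λ = np) tracks the binomial:

from math import exp, factorial, comb

def poisson_pmf(x, lam):
    return exp(-lam) * lam**x / factorial(x)

# binomial with large n and small p vs. Poisson with lam = n * p
n, p = 1000, 0.002
lam = n * p
x = 3
binom = comb(n, x) * p**x * (1 - p)**(n - x)
print(round(binom, 5), round(poisson_pmf(x, lam), 5))   # approx 0.18064 vs 0.18045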

Normal Distribution

It is a relative frequency distribution of errors and is symmetrical about its mean. A
continuous random variable X is said to have a normal distribution with parameters µ
and σ > 0 if the pdf of X is

f(x) = (1 / (σ √(2π))) e^(−(x − µ)² / (2σ²)), for −∞ < x < ∞

The normal distribution with parameter values µ = 0 and σ = 1 is called the standard
normal distribution and is denoted by Z.

Exponential Distribution

X is said to have an exponential distribution with rate parameter λ > 0 if the pdf of X is

f(x) = λ e^(−λx) for x ≥ 0, and f(x) = 0 otherwise.
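
A short Python sketch evaluating the two pdfs above at a point; the rate λ = 1.5 and the
evaluation points are arbitrary assumed values:

from math import exp, pi, sqrt

def normal_pdf(x, mu=0.0, sigma=1.0):
    return exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * sqrt(2 * pi))

def exponential_pdf(x, lam):
    return lam * exp(-lam * x) if x >= 0 else 0.0

print(round(normal_pdf(0.0), 4))            # approx 0.3989, peak of the standard normal
print(round(exponential_pdf(1.0, 1.5), 4))  # 1.5 * e^(-1.5) approx 0.3347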

Joint distribution of two random variables

If X and Y are two random variables, the probability distribution that defines their
simultaneous behavior is called a joint probability distribution

Let X and Y be two discrete random variables, and let S denote the two-dimensional
support of X and Y. Then, the function f(x, y) = P(X = x, Y = y) is a joint probability mass
function (abbreviated joint pmf) if it satisfies the following three conditions:

1. 0 ≤ f(x, y) ≤ 1 for every pair (x, y)
2. Σ f(x, y) = 1, summed over all (x, y) in S
3. P[(X, Y) ∈ A] = Σ f(x, y), summed over all (x, y) in the event A

In the case of only two random variables, this is called a bivariate distribution; the idea
extends to any number of random variables, in which case it is called a multivariate
distribution.

Two discrete random variables X and Y are independent if the joint probability mass
function satisfies

P(X = x and Y = y) = P(X = x) * P(Y = y) for all x and y.
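
A brief Python sketch checking this independence condition from a joint pmf; the joint
table below (two independent fair-coin indicators) is assumed for illustration:

from fractions import Fraction

# joint pmf f(x, y) for two fair coin indicators X and Y
joint = {(x, y): Fraction(1, 4) for x in (0, 1) for y in (0, 1)}

def marginal_x(x):
    return sum(p for (a, _), p in joint.items() if a == x)

def marginal_y(y):
    return sum(p for (_, b), p in joint.items() if b == y)

independent = all(joint[(x, y)] == marginal_x(x) * marginal_y(y)
                  for (x, y) in joint)
print(independent)   # True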

Now that you know the details of Probability for SSC CGL Tier II, you can click on the
links given below and start your preparation for the upcoming exams.

SSC CGL Recruitment 2017

Detailed SSC CGL Tier II Syllabus

How to Prepare Statistics for SSC CGL Tier II

