Unit 2: MA202


Lesson Plan/Lecture Handouts

Department of Mathematics

Course code: MA202, Course Name: Probability & Statistics

Course Objectives: The course aims to familiarize students with the concepts of probability and statistical techniques, and to develop the skills needed to solve engineering problems using the tools of probability and statistics.

UNIT II: RANDOM VARIABLES AND PROBABILITY DISTRIBUTIONS

Discrete & continuous random variables and their properties, mass function, density function,

distribution functions. Expectation, moment generating function, Binomial, Poisson, Exponential

& Normal distributions and their applications.

RANDOM VARIABLES

Intuitively, by a random variable (r.v.) we mean a real number X connected with the outcome of a random experiment E. For example, if E consists of two tosses of a coin, we may consider the random variable X = the number of heads (0, 1 or 2).

Outcome: HH HT TH TT

Value of X: 2 1 1 0

Thus to each outcome a, there corresponds a real number X (a).
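The outcome-to-value mapping above can be sketched in Python (a small illustrative snippet, not part of the original handout):

```python
from itertools import product

# Sample space of two coin tosses; the random variable
# X(outcome) = number of heads in that outcome.
outcomes = ["".join(t) for t in product("HT", repeat=2)]
X = {outcome: outcome.count("H") for outcome in outcomes}

print(X)  # {'HH': 2, 'HT': 1, 'TH': 1, 'TT': 0}
```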

 A quantitative variable x is a random variable if the value it assumes, corresponding to the outcome of an experiment, is a chance or random event.

 Random variables can be discrete or continuous.

Examples:

 x = Test score for a randomly selected student

 x = number of people in a room at a randomly selected time of day

 x = number on the upper face of a randomly tossed die

DISCRETE RANDOM VARIABLES:

If a random variable takes at most a countable number of values, it is called a discrete random variable. In other words, a real-valued function defined on a discrete sample space is called a discrete random variable.

PROBABILITY MASS FUNCTION:

Suppose X is a discrete random variable taking at most a countably infinite number of values x1, x2, .... With each possible value xi we associate a number pi = P(X = xi) = p(xi), called the probability of xi. The numbers p(xi), i = 1, 2, ..., must satisfy the following conditions:

(i) p(xi) ≥ 0 for all i

(ii) ∑ p(xi) = 1, the sum being taken over all i

This function p is called the probability mass function of the random variable X, and the set {xi, p(xi)} is called the probability distribution (p.d.) of the r.v. X.

Example:

A random variable X has the following probability distribution:

x:     0    1    2     3     4     5      6      7
p(x):  0    k    2k    2k    3k    k²     2k²    7k² + k

(i) Find k.

(ii) Evaluate P(X < 6) and P(X ≥ 6).

Solution: (i) Since ∑ p(xi) = 1, we have

k + 2k + 2k + 3k + k² + 2k² + 7k² + k = 1

⇒ 10k² + 9k − 1 = 0

⇒ k = 1/10 or k = −1 (not possible, since probability cannot be negative)

(ii) P(X < 6) = P(X = 0) + P(X = 1) + P(X = 2) + P(X = 3) + P(X = 4) + P(X = 5)

= 1/10 + 2/10 + 2/10 + 3/10 + 1/100 = 81/100

and P(X ≥ 6) = 1 − 81/100 = 19/100
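The value of k and the two probabilities can be verified with a short Python check (an illustrative sketch using exact fractions, not part of the handout):

```python
from fractions import Fraction

k = Fraction(1, 10)
# p(x) for x = 0..7, taken from the distribution table with k = 1/10
p = [0, k, 2*k, 2*k, 3*k, k**2, 2*k**2, 7*k**2 + k]

assert sum(p) == 1          # the pmf sums to 1, so k = 1/10 is valid
P_lt_6 = sum(p[:6])         # P(X < 6)
print(P_lt_6, 1 - P_lt_6)   # 81/100 19/100
```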


CONTINUOUS RANDOM VARIABLE:

A random variable X is said to be continuous if it can take all possible values between certain limits. In other words, a random variable is said to be continuous when its different values cannot be put in 1-1 correspondence with a set of positive integers.

A continuous random variable is a random variable that (at least conceptually) can be measured to any desired degree of accuracy. Examples of continuous random variables are age, height, weight, etc.

PROBABILITY DENSITY FUNCTION: The probability density function (p.d.f.) of a continuous random variable (r.v.) X, usually denoted by f(x), has the following properties:

(i) f(x) ≥ 0, −∞ < x < ∞

(ii) ∫ f(x) dx = 1, the integral being taken over (−∞, ∞)

(iii) The probability of an event E is given by P(E) = ∫ f(x) dx, the integral being taken over E

Example: A continuous random variable X has p.d.f.

f(x) = 3x²; 0 ≤ x ≤ 1

Find a and b such that:

(i) P(X ≤ a) = P(X > a)

(ii) P(X > b) = 0.05

Solution: For 0 ≤ t ≤ 1, P(X ≤ t) = ∫ 3x² dx over [0, t] = t³.

(i) P(X ≤ a) = P(X > a) means a³ = 1 − a³, i.e. a³ = 1/2.

(ii) P(X > b) = 1 − b³ = 0.05, i.e. b³ = 19/20.

Hence a = (1/2)^(1/3) and b = (19/20)^(1/3).
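A quick numerical check of these two values (an illustrative sketch, not part of the handout):

```python
# For f(x) = 3x^2 on [0, 1], the cdf is P(X <= t) = t**3,
# so a**3 = 1/2 and b**3 = 19/20.
a = 0.5 ** (1 / 3)
b = (19 / 20) ** (1 / 3)

assert abs(a**3 - 0.5) < 1e-12           # P(X <= a) = P(X > a) = 1/2
assert abs((1 - b**3) - 0.05) < 1e-12    # P(X > b) = 0.05
```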

BINOMIAL DISTRIBUTION: The binomial distribution was discovered by James Bernoulli (1654-1705) around the year 1700 and was first published posthumously in 1713, eight years after his death. Let a random experiment be performed repeatedly, and let the occurrence of an event in a trial be called a success and its non-occurrence a failure.

Consider a set of n independent Bernoulli trials, in which the probability p of success in any trial is constant for each trial. Then q = 1 − p is the probability of failure in any trial.

The probability of x successes, and consequently (n − x) failures, in n independent trials in a specified order, say SSFSFFFS...FSF (where S represents success and F failure), is given by the compound probability theorem as p^x q^(n−x).

But x successes in n trials can occur in C(n, x) ways, and the probability for each of these ways is p^x q^(n−x). Hence the probability of x successes in n trials is given by the expression

C(n, x) p^x q^(n−x)
Definition: A random variable X is said to follow the binomial distribution if it assumes only non-negative values and its probability mass function is given by

P(X = x) = C(n, x) p^x q^(n−x);  x = 0, 1, 2, ..., n;  q = 1 − p

The two independent constants n and p in the distribution are known as the parameters of the distribution.

The binomial distribution is a discrete distribution, as X can take only the integral values 0, 1, 2, ..., n. Any variable which follows the binomial distribution is known as a binomial variate.

PHYSICAL CONDITIONS FOR BINOMIAL DISTRIBUTION: We get the binomial

distribution under the following conditions.

(i) Each trial results in two mutually disjoint outcomes, termed success and failure.

(ii) The number of trials n is finite.

(iii) The trials are independent of each other.

(iv) The probability of success p is constant for each trial.

Problems relating to tossing a coin, throwing dice, or drawing cards from a pack with replacement lead to the binomial probability distribution.

Example: Ten coins are thrown simultaneously. Find the probability of getting at least seven

heads.
Solution: p = probability of getting a head = 1/2

q = probability of not getting a head = 1/2

For n = 10 tosses, P(X = x) = C(10, x) (1/2)^x (1/2)^(10−x) = C(10, x) (1/2)^10.

The probability of getting at least seven heads is given by

P(X ≥ 7) = p(7) + p(8) + p(9) + p(10)

= (1/2)^10 [C(10, 7) + C(10, 8) + C(10, 9) + C(10, 10)]

= (120 + 45 + 10 + 1)/1024 = 176/1024
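This result can be checked with a short Python sketch using exact fractions (illustrative, standard library only):

```python
from math import comb
from fractions import Fraction

n, p = 10, Fraction(1, 2)
# P(X >= 7) for X ~ Binomial(10, 1/2)
prob = sum(comb(n, x) * p**x * (1 - p)**(n - x) for x in range(7, n + 1))
print(prob)  # 11/64, i.e. 176/1024
```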

Example: In a precision bombing attack there is a 50% chance that any one bomb will strike the

target. Two direct hits are required to destroy the target completely. How many bombs must be

dropped to give a 99% chance or better of completely destroying the target?
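One way to answer this, sketched in Python under the stated assumptions (hit probability p = 1/2, at least two hits needed), is to search for the smallest n with P(X ≥ 2) ≥ 0.99; for p = 1/2, P(X ≥ 2) = 1 − P(X = 0) − P(X = 1) = 1 − (n + 1)/2^n:

```python
# Smallest number of bombs n such that the probability of at
# least two hits is >= 0.99, with hit probability p = 1/2.
# P(X >= 2) = 1 - P(X = 0) - P(X = 1) = 1 - (n + 1) / 2**n
n = 2
while 1 - (n + 1) / 2**n < 0.99:
    n += 1
print(n)  # 11
```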

Question 1: In 100 sets of ten tosses of an unbiased coin, in how many cases should we expect

(i) seven heads and three tails,

(ii) at least seven heads?

Ans. (i) 12, (ii) 17

Question 2:

a) In a book of 520 pages, 390 typographical errors occur. Assuming the Poisson law for the number of errors per page, find the probability that a random sample of 5 pages will contain no error.

b) Six coins are tossed 6,400 times. Using the Poisson distribution, find the approximate probability of getting six heads x times.

Ans. a) e^(−3.75)  b) (e^(−100) · 100^x)/x!

Question 3:

If the mean of a random variable X is 3 and its variance is 4, check whether X follows a binomial distribution.

Question 4:

Sixteen coins are thrown simultaneously. Find the probability of getting at least 7 heads.

THE POISSON DISTRIBUTION:

A random variable X is said to follow a Poisson distribution if it assumes only non-negative values and its probability mass function is given by

p(x, λ) = P(X = x) = e^(−λ) λ^x / x!;  x = 0, 1, 2, ...;  λ > 0

Here λ is known as the parameter of the distribution.

We shall use the notation X ~ P(λ) to denote that X is a Poisson variate with parameter λ.
Following are some instances where the Poisson distribution may be successfully employed:

(1) Number of deaths from a disease (not in the form of an epidemic), such as heart attack or cancer, or due to snake bite.

(2) Number of suicides reported in a particular city.

(3) Number of defective items in a packing manufactured by a good concern.

(4) Number of faulty blades in a packet of 100.

(5) Number of air accidents in some unit of time.

MOMENTS OF THE POISSON DISTRIBUTION:

μ′₁ = λ (mean)

μ′₂ = λ² + λ

μ′₃ = λ³ + 3λ² + λ

μ′₄ = λ⁴ + 6λ³ + 7λ² + λ

Thus the mean and variance of the Poisson distribution are each equal to λ.

Example: Six coins are tossed 6,400 times. Using the Poisson distribution, find the approximate

probability of getting six heads x times.

Solution: The probability of obtaining six heads in one throw of six coins (a single trial) is p = (1/2)⁶, assuming that head and tail are equally probable.

λ = np = 6400 × (1/2)⁶ = 100

Hence, using the Poisson distribution, the probability of getting six heads x times is

P(X = x) = e^(−100) 100^x / x!;  x = 0, 1, 2, ...
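The computation of λ and the resulting probabilities can be sketched in Python (illustrative only; the value printed for x = 100 is the probability at the mean):

```python
from math import exp, factorial

# λ = np for the Poisson approximation to Binomial(6400, (1/2)^6)
lam = 6400 * (1 / 2) ** 6
assert lam == 100

def poisson_pmf(x, lam=lam):
    # P(X = x) = e^(-λ) λ^x / x!
    return exp(-lam) * lam**x / factorial(x)

# Probability that six heads appear exactly 100 times (x = λ):
print(poisson_pmf(100))  # ≈ 0.0399
```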
Question: In a book of 520 pages, 390 typographical errors occur. Assuming the Poisson law for the number of errors per page, find the probability that a random sample of 5 pages will contain no error.

THE EXPONENTIAL DISTRIBUTION:

A random variable X is said to be exponentially distributed with parameter λ > 0, denoted by X ~ Exp(λ), if its p.d.f. is

f_X(x) = λ e^(−λx) for x ≥ 0, and 0 otherwise.

The corresponding distribution function is

F_X(x) = 1 − e^(−λx) for x ≥ 0, and 0 otherwise.

An exponential distribution is a suitable model in many situations, such as the time until the next earthquake.
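As an illustrative sketch (the rate λ = 0.5 is an assumed value, not from the handout), the distribution function above can be coded and used to check the well-known memoryless property P(X > s + t | X > s) = P(X > t):

```python
from math import exp

lam = 0.5  # assumed rate parameter, for illustration only

def exp_cdf(x, lam=lam):
    # F(x) = 1 - e^(-λx) for x >= 0, and 0 otherwise
    return 1 - exp(-lam * x) if x >= 0 else 0.0

# Memoryless property: P(X > s + t | X > s) = P(X > t)
s, t = 2.0, 3.0
lhs = (1 - exp_cdf(s + t)) / (1 - exp_cdf(s))
rhs = 1 - exp_cdf(t)
assert abs(lhs - rhs) < 1e-12
```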

NORMAL DISTRIBUTION:

The normal distribution was first discovered in 1733 by the French-born mathematician De Moivre, who obtained this continuous distribution as a limiting case of the binomial distribution and applied it to problems arising in games of chance.

Definition:

A random variable X is said to have a normal distribution with parameters μ (called the "mean") and σ² (called the "variance") if its density function is given by the probability law:

f(x; μ, σ) = (1/(σ√(2π))) exp{ −(1/2) ((x − μ)/σ)² };  −∞ < x < ∞;  −∞ < μ < ∞;  σ > 0

 A random variable X with mean μ and variance σ² following the normal law is expressed by X ~ N(μ, σ²)

 Normal distribution arises as a limiting form of the binomial distribution

 Normal distribution is symmetrical

 For the normal distribution, Mean = Median = Mode


IMPORTANCE OF NORMAL DISTRIBUTION: The normal distribution plays a very important role in statistical theory because of the following reasons:

(i) Most of the distributions occurring in practice, e.g., the binomial and Poisson distributions, can be approximated by the normal distribution.

(ii) Even if a variable is not normally distributed, it can sometimes be brought to normal form by a simple transformation of the variable.
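The density given above can be sketched in a few lines of Python (illustrative; the standard-normal defaults μ = 0, σ = 1 are assumptions for the demo), including a check of the symmetry that underlies Mean = Median = Mode:

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu=0.0, sigma=1.0):
    # f(x; μ, σ) = (1/(σ√(2π))) exp(-(1/2)((x - μ)/σ)²)
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

# Symmetry about the mean: f(μ + c) = f(μ - c)
assert abs(normal_pdf(1.5, mu=1.0) - normal_pdf(0.5, mu=1.0)) < 1e-15

print(normal_pdf(0.0))  # peak value 1/√(2π) ≈ 0.3989
```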
