
Probability and Statistics, Chapter 7


Eng. Dana Hisham Qadan


⦁ Population parameters: the objective is to obtain information about population parameters based on a random sample drawn from the population under study.
ESTIMATION
▪ Statistic: a numerical measure obtained from the sample; for example, the sample mean $\bar{X}$ is a statistic.
▪ Parameter: a numerical measure obtained from the population; for example, the population mean $\mu$ is a parameter.
Estimation → use of sample data (statistics) to
estimate population parameters

Types of Estimation:
1. Point Estimation
2. Interval Estimation
Point Estimation
A point estimate is a single value of a statistic used
to estimate a population parameter.
For example, the sample mean
$$\bar{X} = \frac{X_1 + X_2 + \cdots + X_n}{n} = \frac{\sum_{i=1}^{n} X_i}{n}$$
is an estimator for the population mean $\mu$.
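As a quick illustration (a minimal sketch; the data values are made up, not from the notes):

```python
# Minimal sketch: computing X-bar as a point estimate of mu.
# The data values below are hypothetical.
sample = [4.2, 3.9, 5.1, 4.7, 4.4]

x_bar = sum(sample) / len(sample)  # X-bar = (X1 + ... + Xn) / n
print(f"Point estimate of mu: {x_bar:.2f}")
```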
⦁ Statistical inference is the process by which we infer population properties from sample properties.

⦁ There are two types of statistical inference:


1. Estimation
2. Hypotheses Testing
⦁ The concepts involved are actually very similar, as we will see in due course. Below, we provide a basic introduction to estimation.
⦁ Estimation: the objective of estimation is to approximate the value of a population parameter on the basis of a sample statistic.
⦁ For example, the sample mean $\bar{X}$ is used to estimate the population mean $\mu$.
⦁ There are two types of estimators:

1. Point Estimator
2. Interval Estimator
⦁ A point estimator draws inferences about a population by estimating the value of an unknown parameter using a single value or point.
⦁ An interval estimator draws inferences about
a population by estimating the value of an
unknown parameter using an interval. Here,
we try to construct an interval that “covers”
the true population parameter with a
specified probability.
⦁ As an example, suppose we are trying to
estimate the mean summer income of
students. Then, an interval estimate might
say that the (unknown) mean income is
between $380 and $420 with probability
0.95.
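A minimal sketch of how such an interval might be computed, assuming a roughly normal sampling distribution and the standard 1.96 critical value for 95% confidence (the income figures below are made up):

```python
import math

# Hypothetical summer incomes of sampled students (made-up numbers).
incomes = [380, 420, 405, 390, 415, 398, 402, 410, 388, 412]

n = len(incomes)
x_bar = sum(incomes) / n                                         # sample mean
s = math.sqrt(sum((x - x_bar) ** 2 for x in incomes) / (n - 1))  # sample S.D.
se = s / math.sqrt(n)                                            # standard error

z = 1.96  # standard normal critical value for 95% confidence
lower, upper = x_bar - z * se, x_bar + z * se
print(f"95% interval estimate for mu: ({lower:.2f}, {upper:.2f})")
```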
• How good is the estimator $\bar{X}$?
Look at its probability distribution!
Important properties of the distribution of the sample mean $\bar{X}$:
1. Mean
2. Variance or Standard Deviation
3. Shape
Properties of Estimators
1. Unbiasedness
Example: A statistics class has six students, whose ages are:
18, 18, 19, 20, 20, 21
The population mean is
$$\mu = \frac{18 + 18 + 19 + 20 + 20 + 21}{6} = \frac{116}{6} = \frac{58}{3} \approx 19.33$$
Select a sample of size 2 ($n = 2$). There are $\binom{6}{2} = 15$ possible samples.
Find the mean of every possible sample of size $n = 2$:
Sampling distribution of the sample mean $\bar{X}$:

$\bar{x}$:      18     18.5   19     19.5   20     20.5
$P(\bar{x})$:   1/15   2/15   4/15   4/15   2/15   2/15

The mean (expected value) of the sample mean $\bar{X}$:
$$E(\bar{X}) = \mu_{\bar{X}} = 18(1/15) + 18.5(2/15) + 19(4/15) + 19.5(4/15) + 20(2/15) + 20.5(2/15) = 19.33$$
The mean value of $\bar{X}$ is the same as the population mean $\mu$:
$$E(\bar{X}) = \mu$$
$\bar{X}$ is called an unbiased estimator for $\mu$.
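This example can be verified by brute force: enumerate all 15 samples of size 2 and average their means (a sketch using only Python's standard library):

```python
from itertools import combinations
from statistics import mean

ages = [18, 18, 19, 20, 20, 21]   # the population of six students
mu = mean(ages)                   # population mean, 19.33

# All C(6, 2) = 15 possible samples of size n = 2 and their means.
sample_means = [mean(s) for s in combinations(ages, 2)]

# The mean of the sampling distribution equals mu: E(X-bar) = mu,
# which is exactly the unbiasedness of X-bar.
print(f"mu = {mu:.4f}, E(X-bar) = {mean(sample_means):.4f}")
```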
An estimator is unbiased if the mean of the sampling distribution of the estimator is equal to the parameter. An estimator $\hat{\theta}$ is unbiased for a parameter $\theta$ if
$$E(\hat{\theta}) = \theta$$
If this property is not satisfied, $\hat{\theta}$ is a biased estimator for $\theta$.

Result:
Let $X_1, X_2, \ldots, X_n$ be a random sample from a population with mean $\mu$ and variance $\sigma^2$. Consider the sample mean
$$\bar{X} = \frac{\sum_{i=1}^{n} X_i}{n}$$
and the sample variance
$$S^2 = \frac{\sum_{i=1}^{n} (X_i - \bar{X})^2}{n - 1}$$
Then:

1. $\bar{X}$ is an unbiased estimator for $\mu$:
$$E(\bar{X}) = \mu$$
Proof:
$$E(\bar{X}) = E\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right) = \frac{1}{n}\sum_{i=1}^{n} E(X_i) = \frac{1}{n}\sum_{i=1}^{n} \mu = \frac{1}{n}\, n\mu = \mu$$
2. $S^2$ is an unbiased estimator for $\sigma^2$:
$$E(S^2) = E\left(\frac{\sum_{i=1}^{n}(X_i - \bar{X})^2}{n - 1}\right) = \sigma^2$$

It can be shown that
$$E\left(\sum_{i=1}^{n}(X_i - \bar{X})^2\right) = (n - 1)\,\sigma^2,$$
then
$$E(S^2) = E\left(\frac{\sum_{i=1}^{n}(X_i - \bar{X})^2}{n - 1}\right) = \frac{(n - 1)\,\sigma^2}{n - 1} = \sigma^2.$$

But
$$E\left(\frac{\sum_{i=1}^{n}(X_i - \bar{X})^2}{n}\right) = \frac{n - 1}{n}\,\sigma^2 \ne \sigma^2,$$
so
$$\frac{1}{n}\sum_{i=1}^{n}(X_i - \bar{X})^2$$
is a biased estimator for $\sigma^2$.
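This bias is easy to see in simulation. A sketch (assuming NumPy is available; its ddof argument switches between the divide-by-n and divide-by-(n-1) estimators):

```python
import numpy as np

rng = np.random.default_rng(0)
sigma2, n = 4.0, 5                      # assumed true variance and sample size

# 100,000 simulated samples of size n from N(0, sigma^2).
x = rng.normal(0.0, np.sqrt(sigma2), size=(100_000, n))

biased = np.var(x, axis=1, ddof=0)      # divides by n      -> biased
unbiased = np.var(x, axis=1, ddof=1)    # divides by n - 1  -> S^2, unbiased

print(f"E[divide-by-n] ~ {biased.mean():.3f} "
      f"(theory: (n-1)/n * sigma^2 = {(n - 1) / n * sigma2:.3f})")
print(f"E[S^2]         ~ {unbiased.mean():.3f} (theory: sigma^2 = {sigma2:.3f})")
```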


2. Consistent Estimators
An unbiased estimator $\hat{\theta}$ is consistent if
$$\hat{\theta} - \theta \to 0, \text{ as } n \to \infty$$
To check this condition for an unbiased estimator, we need to check the limiting behavior of $\mathrm{Var}(\hat{\theta})$: if $\mathrm{Var}(\hat{\theta}) \to 0$ as $n \to \infty$, the estimator is consistent.
A consistent estimator is one for which the probability that the estimate is close to the value of the population parameter increases as the sample size increases.
Result:
$$\mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n}$$
Proof:
$$\mathrm{Var}(\bar{X}) = \mathrm{Var}\left(\frac{1}{n}\sum_{i=1}^{n} X_i\right) = \frac{1}{n^2}\sum_{i=1}^{n} \mathrm{Var}(X_i) = \frac{1}{n^2}\sum_{i=1}^{n} \sigma^2 = \frac{n\sigma^2}{n^2} = \frac{\sigma^2}{n}$$
$$\mathrm{Var}(\bar{X}) = \frac{\sigma^2}{n} \to 0, \text{ as } n \to \infty$$
The sample mean $\bar{X}$ is an unbiased estimator for $\mu$ with $\mathrm{Var}(\bar{X}) \to 0$, so $\bar{X}$ is a consistent estimator for $\mu$.
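A simulation sketch of this result, with assumed values for $\mu$ and $\sigma$: as $n$ grows, the simulated variance of $\bar{X}$ tracks $\sigma^2/n$ and shrinks toward zero.

```python
import numpy as np

rng = np.random.default_rng(1)
mu, sigma = 10.0, 3.0                    # assumed population parameters

for n in [10, 100, 1000]:
    # 10,000 simulated samples of size n; record each sample mean.
    means = rng.normal(mu, sigma, size=(10_000, n)).mean(axis=1)
    print(f"n = {n:>4}: Var(X-bar) ~ {means.var():.5f}, "
          f"sigma^2/n = {sigma**2 / n:.5f}")
```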
Standard deviation (S.D.) of the estimator $\bar{X}$:
$$\mathrm{S.D.}(\bar{X}) = \frac{\sigma}{\sqrt{n}}$$
To estimate $\sigma^2$:
$$\hat{\sigma}^2 = S^2 = \frac{\sum_{i=1}^{n}(X_i - \bar{X})^2}{n - 1}$$
Estimate $\sigma$ by $S$ (the sample S.D.):
$$\hat{\sigma} = S = \sqrt{S^2}$$
Standard error (S.E.) of the estimator $\bar{X}$:
$$\mathrm{S.E.}(\bar{X}) = \frac{S}{\sqrt{n}}$$
The smaller the sampling variability (i.e., the S.E.) is, the better an estimate will be. As the sample size increases to infinity, the sampling distribution concentrates around the population mean.
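Putting the last few formulas together, a minimal sketch that estimates $\sigma$ and the standard error from a single (made-up) sample:

```python
import math

sample = [12.1, 9.8, 11.4, 10.6, 10.9, 12.3, 9.5, 11.0]  # made-up data
n = len(sample)

x_bar = sum(sample) / n
s2 = sum((x - x_bar) ** 2 for x in sample) / (n - 1)  # S^2, unbiased for sigma^2
s = math.sqrt(s2)                                     # S estimates sigma

se = s / math.sqrt(n)                                 # S.E.(X-bar) = S / sqrt(n)
print(f"X-bar = {x_bar:.3f}, S = {s:.3f}, S.E.(X-bar) = {se:.3f}")
```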
Result: Let $X_1, X_2, \ldots, X_n$ be a random sample from a normal population with mean $\mu$ and variance $\sigma^2$, $N(\mu, \sigma^2)$. Then:

• $\bar{X}$ is an unbiased estimator for $\mu$.
• $\bar{X}$ is a consistent estimator for $\mu$.
• $\bar{X} \sim N\left(\mu, \frac{\sigma^2}{n}\right)$.
• $\bar{X}$ is the BEST estimator for $\mu$: it has the smallest variance among the class of all unbiased estimators.
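To illustrate the "smallest variance" claim, one can compare $\bar{X}$ with another (approximately) unbiased estimator of $\mu$ for normal data, the sample median; the simulation sketch below (with assumed parameters) shows the median's larger variance.

```python
import numpy as np

rng = np.random.default_rng(2)
mu, sigma, n = 5.0, 2.0, 25              # assumed parameters and sample size

samples = rng.normal(mu, sigma, size=(50_000, n))
means = samples.mean(axis=1)
medians = np.median(samples, axis=1)

# Both are (approximately) unbiased for mu, but X-bar has smaller variance;
# asymptotically Var(median) ~ (pi/2) * sigma^2 / n for normal data.
print(f"Var(X-bar)  ~ {means.var():.4f} (theory sigma^2/n = {sigma**2 / n:.4f})")
print(f"Var(median) ~ {medians.var():.4f} (larger, as the result predicts)")
```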

