
Chapter 7:
Point Estimation of Parameters and Sampling Distributions
Chapter outline
7.1 Point Estimation.
7.2 Sampling Distributions and the Central Limit Theorem.
Learning Objectives
After careful study of this chapter, you should be able to do the following:
1. Explain the general concepts of estimating the parameters of a population or a probability distribution
2. Explain the important role of the normal distribution as a sampling distribution
3. Understand the central limit theorem
Learning Objectives
4. Explain important properties of point estimators, including bias, variance, and mean square error
5. Know how to construct point estimators using the method of moments and the method of maximum likelihood
6. Know how to compute and explain the precision with which a parameter is estimated
7. Know how to construct a point estimator using the Bayesian approach
7.1 Point Estimation.
7.1.1 Introduction
The field of statistical inference consists of those methods used to make decisions or to draw conclusions about a population. These methods utilize the information contained in a sample from the population in drawing conclusions.
Statistical inference may be divided into two major areas:
• Parameter estimation
• Hypothesis testing
7.1 Point Estimation.
7.1.2 Point Estimation.
Suppose that we want to obtain a point estimate (a reasonable value) of a population parameter. We know that before the data are collected, the observations are considered to be random variables, say X1, X2, ..., Xn. Therefore, any function of the observations, or any statistic, is also a random variable. For example, the sample mean X̄ and the sample variance S² are statistics and random variables.
7.1 Point Estimation.
7.1.2 Point Estimation.
Definition
A point estimate of some population parameter θ is a single numerical value θ̂ of a statistic Θ̂.
Example: Suppose that the random variable X is normally distributed with an unknown mean μ. The sample mean X̄ is a point estimator of the unknown population mean μ; that is, μ̂ = X̄. After the sample has been selected, the numerical value x̄ is the point estimate of μ.
7.1 Point Estimation.
Thus, if x1 = 25, x2 = 30, x3 = 29, x4 = 31, the point estimate of μ is
x̄ = (25 + 30 + 29 + 31) / 4 = 28.75
Similarly, if the population variance σ² is also unknown, a point estimator for σ² is the sample variance S², and the numerical value s² = 6.9 calculated from the sample data is called the point estimate of σ².
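As an aside not in the original slides, here is a minimal Python sketch (assuming NumPy is available) that reproduces the two point estimates above.

# Minimal sketch: point estimates of the mean and variance from the four
# observations given on the slide.
import numpy as np

sample = np.array([25, 30, 29, 31])

x_bar = sample.mean()        # point estimate of the mean mu
s2 = sample.var(ddof=1)      # sample variance, with n - 1 in the denominator

print(x_bar)                 # 28.75
print(round(s2, 1))          # 6.9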
7.1 Point Estimation.
7.1.2 Point Estimation.
Definition
The statistic Θ̂ is called the point estimator.
Estimation problems occur frequently in engineering. We often need to estimate:
The mean μ of a single population.
The variance σ² (or standard deviation σ) of a single population.
7.1 Point Estimation.
7.1.2 Point Estimation.
The proportion p of items in a population that belong to a class of interest.
The difference in means of two populations, μ1 - μ2.
The difference in two population proportions, p1 - p2.
7.1 Point Estimation.
7.1.2 Point Estimation.
Reasonable point estimates of these parameters are as follows:
For μ, the estimate is μ̂ = x̄, the sample mean.
For σ², the estimate is σ̂² = s², the sample variance.
For p, the estimate is p̂ = x/n, the sample proportion, where x is the number of items in a random sample of size n that belong to the class of interest.
7.1 Point Estimation.
7.1.2 Point Estimation.
For μ1 - μ2, the estimate is μ̂1 - μ̂2 = x̄1 - x̄2, the difference between the sample means of two independent random samples.
For p1 - p2, the estimate is p̂1 - p̂2, the difference between two sample proportions computed from two independent random samples.
7.1 Point Estimation.
Example 1:
Two ponds contain many fish. A random sample of 20 fish was selected from each pond and their weights (in kg) were recorded. The results are as follows:
S1: 1.2, 3.0, 2.3, 1.0, 1.9, 2.1, 1.4, 2.2, 0.7, 1.3, 0.5, 0.8, 2.3, 3.3, 4.1, 3.5, 2.7, 1.3, 3.0, 1.4.
S2: 1.0, 2.3, 1.3, 1.5, 0.3, 1.6, 2.3, 2.6, 3.3, 4.2, 0.8, 2.8, 3.7, 0.5, 4.1, 3.3, 2.1, 3.6, 1.8, 2.1.
a/ Estimate the average weight, variance, standard deviation, and the proportion of fish weighing more than 2 kg in each pond.
b/ Compare the average weight, variance, and standard deviation of the fish in the two ponds, and compare the proportions of fish weighing more than 2 kg in the two ponds.
7.1 Point Estimation.
Solution:
2
a/ X 1 2; s 1.042 ; s1 1.0208
1
10
standard rate of fish (weight > 2 kg): 0.5
20

2
X 2 2.26; s 1.392 ; s2 1.18
2

12
standard rate of fish (weight > 2 kg): 0.6
20
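The following sketch is not part of the original slides; assuming NumPy, it reproduces the estimates for both ponds. The array names s1 and s2 simply mirror the sample labels above.

# Point estimates for the two fish samples.
import numpy as np

s1 = np.array([1.2, 3.0, 2.3, 1.0, 1.9, 2.1, 1.4, 2.2, 0.7, 1.3,
               0.5, 0.8, 2.3, 3.3, 4.1, 3.5, 2.7, 1.3, 3.0, 1.4])
s2 = np.array([1.0, 2.3, 1.3, 1.5, 0.3, 1.6, 2.3, 2.6, 3.3, 4.2,
               0.8, 2.8, 3.7, 0.5, 4.1, 3.3, 2.1, 3.6, 1.8, 2.1])

for name, data in [("Pond 1", s1), ("Pond 2", s2)]:
    mean = data.mean()           # sample mean
    var = data.var(ddof=1)       # sample variance
    std = data.std(ddof=1)       # sample standard deviation
    prop = (data > 2).mean()     # proportion of fish heavier than 2 kg
    print(name, mean, round(var, 3), round(std, 4), prop)

# Pond 1: 2.0, 1.042, 1.0208, 0.5; Pond 2: 2.26, 1.392, 1.1798 (about 1.18), 0.6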
7.1 Point Estimation.
Example 2 (HW): Like hurricanes and earthquakes, geomagnetic storms are natural hazards with possible severe impact on the Earth. Severe storms can cause communication and utility breakdowns, leading to possible blackouts. The National Oceanic and Atmospheric Administration beams electron and proton flux data in various energy ranges to various stations on the Earth to help forecast possible disturbances. The following are 25 readings of proton flux in the 47-68 kEV range (units are in p/(cm²·sec·ster·MeV)) on the evening of December 28, 2011:
7.1 Point Estimation.
2310, 2320, 2010, 10800, 2190, 3360, 5640, 2540, 3360, 11800, 2010, 3430, 10600, 7370, 2160, 3200, 2020, 2850, 3500, 10200, 8550, 9500, 2260, 7730, 2250.
a/ Find a point estimate of the mean proton flux in
this time period.
b/ Find a point estimate of the standard deviation of
the proton flux in this time period.
c/ Find an estimate of the standard error of the
estimate in part (a).
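Since this is a homework exercise, the slides do not give numerical answers. The sketch below (assuming NumPy) only shows how parts a, b, and c could be computed, using the estimated standard error s/√n for part c.

# Homework sketch: mean, standard deviation, and standard error of the mean.
import numpy as np

flux = np.array([2310, 2320, 2010, 10800, 2190, 3360, 5640, 2540, 3360, 11800,
                 2010, 3430, 10600, 7370, 2160, 3200, 2020, 2850, 3500, 10200,
                 8550, 9500, 2260, 7730, 2250])

x_bar = flux.mean()              # a/ point estimate of the mean proton flux
s = flux.std(ddof=1)             # b/ point estimate of the standard deviation
se = s / np.sqrt(len(flux))      # c/ estimated standard error of x_bar

print(x_bar, s, se)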
7.2 Sampling Distributions and the
Central Limit Theorem.
Statistical inference is concerned with making
decisions about a population based on the
information contained in a random sample from
that population.
7.2 Sampling Distributions and
the Central Limit Theorem.
7.2.1 Definitions:
The random variables X1, X2, ..., Xn are a random sample of size n if (a) the Xi's are independent random variables and (b) every Xi has the same probability distribution.
A statistic is any function of the observations in a random sample.
The probability distribution of a statistic is called a sampling distribution.
7.2 Sampling Distributions and
the Central Limit Theorem.
7.2.2 Central Limit Theorem
[Slide diagram: a population with mean μ and variance σ², from which a random sample x1, x2, ... is drawn and the sample mean X̄ is computed. What are the mean and variance of X̄?]
7.2 Sampling Distributions and
the Central Limit Theorem.
7.2.2 Central Limit Theorem
If we are sampling from a population that has an unknown probability distribution, the sampling distribution of the sample mean X̄ will still be approximately normal with mean μ and variance σ²/n if the sample size n is large. This is one of the most useful theorems in statistics, called the central limit theorem. The statement is as follows:
7.2 Sampling Distributions and
the Central Limit Theorem.
7.2.2 Central Limit Theorem
If X1, X2, ..., Xn is a random sample of size n taken from a population (either finite or infinite) with mean μ and finite variance σ², and if X̄ is the sample mean, then the limiting form of the distribution of
Z = (X̄ - μ) / (σ/√n)
as n → ∞ is the standard normal distribution.
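As an illustration not in the original slides, the short simulation below (assuming NumPy) shows the theorem at work: sample means drawn from a strongly skewed exponential population still have approximately the mean μ and variance σ²/n predicted above. The choice of an exponential population and the values of n and reps are arbitrary.

# Illustrative simulation: sample means of an exponential(1) population,
# which is strongly skewed, still behave approximately like N(mu, sigma^2/n).
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, n, reps = 1.0, 1.0, 40, 10_000   # for exponential(1), mu = sigma = 1

sample_means = rng.exponential(scale=mu, size=(reps, n)).mean(axis=1)

print(sample_means.mean())   # close to mu = 1
print(sample_means.var())    # close to sigma^2 / n = 1/40 = 0.025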
7.2 Sampling Distributions and
the Central Limit Theorem.
Conclusion:
The distribution of the sample mean X̄ will, as the sample size increases, approach a normal distribution.
The mean of the sample mean is μ_X̄ = μ.
The standard deviation of the sample mean is σ_X̄ = σ/√n.
If the original population is normally distributed, then for any sample size n the sample mean will be normally distributed.
7.2 Sampling Distributions and
the Central Limit Theorem.
Conclusion:
For samples of size n ≥ 30, the distribution of the sample means can be approximated reasonably well by a normal distribution.
The approximation becomes closer to a normal distribution as the sample size n becomes larger.
7.2 Sampling Distributions and
the Central Limit Theorem.
Note:
- If the underlying distribution is symmetric and unimodal
(not too far from normal), the central limit theorem will
apply for small values of n, say 4 or 5.
- As a general guideline, if n > 30, the central limit
theorem will almost always apply.
7.2 Sampling Distributions and
the Central Limit Theorem.
Example 1:
An electronics company manufactures resistors that have a mean resistance of 100 ohms and a standard deviation of 10 ohms. The distribution of resistance is normal.
Find the probability that a random sample of n = 25 resistors will have an average resistance of less than 95 ohms.
7.2 Sampling Distributions and
the Central Limit Theorem.
Solution:
The sampling distribution of X̄ is normal with mean μ_X̄ = 100 ohms and standard deviation
σ_X̄ = σ/√n = 10/√25 = 2
Therefore, X̄ ~ N(100, 2²).
Standardizing,
Z = (X̄ - μ_X̄)/σ_X̄ = (95 - 100)/2 = -2.5
so P(X̄ < 95) = P(Z < -2.5) = 0.0062.
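For reference, and not part of the slides, the probability can be checked numerically assuming SciPy is available.

# Numerical check of P(X_bar < 95) for X_bar ~ N(100, 2^2).
from math import sqrt
from scipy.stats import norm

mu, sigma, n = 100, 10, 25
sigma_xbar = sigma / sqrt(n)                 # standard error = 2

p = norm.cdf(95, loc=mu, scale=sigma_xbar)
print(round(p, 4))                           # 0.0062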
7.2 Sampling Distributions and
the Central Limit Theorem.
Practical Conclusion:
This example shows that if the distribution of
resistance is normal with mean 100 ohms and
standard deviation of 10 ohms, finding a random
sample of resistors with a sample mean less than
95 ohms is a rare event. If this actually happens,
it casts doubt as to whether the true mean is really
100 ohms or if the true standard deviation is really
10 ohms.
7.2 Sampling Distributions and
the Central Limit Theorem.
Example 2:
Suppose that a random variable X has a continuous uniform distribution
f(x) = 1/2 for 4 ≤ x ≤ 6, and f(x) = 0 otherwise.
Find the distribution of the sample mean of a random sample of size n = 40.
7.2 Sampling Distributions and
the Central Limit Theorem.
Solution: The mean and variance of X are
a b 4 6
  5
2 2
b  a  6  4  1
2 2

 
2
 
12 12 3
The central limit theorem indicates that the
distribution of X is approximately normal with mean
So,  X
  5;  2
X
 2
/ n 1/ 120
 1 
X ~ N  5, 
 120 
7.2 Sampling Distributions and
the Central Limit Theorem.
Example 3:
A normal population has mean 100 and variance 25. How large must the random sample be if we want the standard error of the sample mean to be 1.5?
7.2 Sampling Distributions and
the Central Limit Theorem.
Solution:
 100
 25
 X 1.5
n ?
We have

 X 1.5   n 11.11
n
Then round up and n=12
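A small check of the same computation, not in the slides, using only the Python standard library.

# Required sample size for a target standard error: n = (sigma / target_se)^2, rounded up.
from math import ceil, sqrt

sigma = sqrt(25)        # population standard deviation
target_se = 1.5

n_exact = (sigma / target_se) ** 2
print(n_exact)          # about 11.11
print(ceil(n_exact))    # 12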
7.2 Sampling Distributions and
the Central Limit Theorem.
Example 4:
Suppose that samples of size n = 25 are selected at random from a normal population with mean 100 and standard deviation 10.
What is the probability that the sample mean falls in the interval from μ_X̄ - 1.8σ_X̄ to μ_X̄ + 1.0σ_X̄?
7.2 Sampling Distributions and
the Central Limit Theorem.
Solution: Let X ~ N(100, 10²) and n = 25. Then
μ_X̄ = μ = 100 and σ_X̄ = σ/√n = 10/√25 = 2, so X̄ ~ N(100, 2²).
μ_X̄ - 1.8σ_X̄ = 100 - 1.8(2) = 96.4
μ_X̄ + 1.0σ_X̄ = 100 + 1.0(2) = 102
So, P(96.4 < X̄ < 102) = 0.8054.
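A numerical check of this probability, not in the slides and assuming SciPy.

# P(96.4 < X_bar < 102) for X_bar ~ N(100, 2^2).
from scipy.stats import norm

p = norm.cdf(102, loc=100, scale=2) - norm.cdf(96.4, loc=100, scale=2)
print(round(p, 4))    # 0.8054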
7.2 Sampling Distributions and
the Central Limit Theorem.
Now consider the case in which we have two independent populations.
Let the first population have mean μ1 and variance σ1²; for a random sample of size n1, the sample mean X̄1 has mean μ_X̄1 = μ1 and variance σ²_X̄1 = σ1²/n1.
Let the second population have mean μ2 and variance σ2²; for a random sample of size n2, the sample mean X̄2 has mean μ_X̄2 = μ2 and variance σ²_X̄2 = σ2²/n2.
The sampling distribution of X̄1 - X̄2 is normal with mean μ_X̄1 - μ_X̄2 = μ1 - μ2 and variance σ²_X̄1 + σ²_X̄2 = σ1²/n1 + σ2²/n2.
7.2 Sampling Distributions and
the Central Limit Theorem.
Approximate Sampling Distribution of a
Difference in Sample Means
If we have two independent populations with means μ1 and μ2 and variances σ1² and σ2², and if X̄1 and X̄2 are the sample means of two independent random samples of sizes n1 and n2 from these populations, then the sampling distribution of
Z = (X̄1 - X̄2 - (μ1 - μ2)) / √(σ1²/n1 + σ2²/n2)
is approximately standard normal if the conditions of the central limit theorem apply. If the two populations are normal, the sampling distribution of Z is exactly standard normal.
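As an illustration not in the original slides, the simulation below (assuming NumPy, with arbitrary example values for the means, standard deviations, and sample sizes) confirms the stated mean and variance of X̄1 - X̄2.

# The difference of two independent sample means has mean mu1 - mu2 and
# variance sigma1^2/n1 + sigma2^2/n2.
import numpy as np

rng = np.random.default_rng(2)
mu1, sigma1, n1 = 10.0, 2.0, 30      # example values, not from the slides
mu2, sigma2, n2 = 8.0, 3.0, 50
reps = 20_000

xbar1 = rng.normal(mu1, sigma1, size=(reps, n1)).mean(axis=1)
xbar2 = rng.normal(mu2, sigma2, size=(reps, n2)).mean(axis=1)
diff = xbar1 - xbar2

print(diff.mean())    # close to mu1 - mu2 = 2
print(diff.var())     # close to 4/30 + 9/50, about 0.3133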
7.2 Sampling Distributions and
the Central Limit Theorem.
Example: The effective life of a component used in a jet-turbine aircraft engine is a random variable with mean 5000 hours and standard deviation 40 hours. The distribution of effective life is fairly close to a normal distribution. The engine manufacturer introduces an improvement into the manufacturing process for this component that increases the mean life to 5050 hours and decreases the standard deviation to 30 hours. Suppose that a random sample of n1 = 16 components is selected from the "old" process and a random sample of n2 = 25 components is selected from the "improved" process.
What is the probability that the difference in the two sample means X̄2 - X̄1 is at least 25 hours?
7.2 Sampling Distributions and
the Central Limit Theorem.
Solution:
1 5000; 1 40; n1 16
2 5050; 2 30; n2 25

P X 2  X 1 25 ? 

We have: X 5000;  2
X
10 2
1 1

 X 5050;  2
X2
6 2
2

Then  50; 2
136
X 2  X1 X 2  X1
7.2 Sampling Distributions and
the Central Limit Theorem.
Solution:
X̄2 - X̄1 ~ N(50, 136)
So P(X̄2 - X̄1 ≥ 25) = 0.9838.
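A numerical check, not in the slides and assuming SciPy. The exact value is about 0.984; the slides report 0.9838 from the rounded table value z = -2.14.

# P(X2_bar - X1_bar >= 25), where the difference is N(50, 136).
from math import sqrt
from scipy.stats import norm

mean_diff = 5050 - 5000                    # 50 hours
var_diff = 40**2 / 16 + 30**2 / 25         # 100 + 36 = 136

p = norm.sf(25, loc=mean_diff, scale=sqrt(var_diff))   # P(difference >= 25)
print(round(p, 4))                         # about 0.984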
