
Business Statistics:

A Decision-Making Approach
7th Edition

Lecture 9
Introduction to Linear Regression

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-1


Chapter Goals

After completing this chapter, you should be able to:
 Calculate and interpret the simple correlation between
two variables
 Determine whether the correlation is significant
 Calculate and interpret the simple linear regression
equation for a set of data
 Understand the assumptions behind regression
analysis

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-2


Scatter Plots and Correlation

 A scatter plot (or scatter diagram) is used to show the relationship between two variables
 Correlation analysis is used to measure strength
of the association (linear relationship) between
two variables
 Only concerned with strength of the
relationship
 No causal effect is implied

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-3


Scatter Plot Examples
[Four scatter plot panels: two linear relationships (left) and two curvilinear relationships (right)]
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-4
Scatter Plot Examples
(continued)
[Four scatter plot panels: two strong relationships (left) and two weak relationships (right)]
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-5
Scatter Plot Examples
(continued)
[Scatter plot panels showing no relationship]
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-6
Correlation Coefficient
(continued)

 Correlation measures the strength of the linear association between two variables
 The sample correlation coefficient r is a measure of the strength of the linear relationship between two variables, based on sample observations

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-7


Features of r
 Unit free
 Range between -1 and 1
 The closer to -1, the stronger the negative
linear relationship
 The closer to 1, the stronger the positive
linear relationship
 The closer to 0, the weaker the linear
relationship

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-8


Examples of Approximate
r Values
[Five scatter plots illustrating approximate r values: r = -1, r = -.6, r = 0, r = +.3, r = +1]
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-9
Calculating the
Correlation Coefficient
Sample correlation coefficient:

r = \frac{\sum (x - \bar{x})(y - \bar{y})}{\sqrt{\left[\sum (x - \bar{x})^2\right]\left[\sum (y - \bar{y})^2\right]}}

or the algebraic equivalent:

r = \frac{n\sum xy - \sum x \sum y}{\sqrt{\left[n\sum x^2 - (\sum x)^2\right]\left[n\sum y^2 - (\sum y)^2\right]}}

where:
r = Sample correlation coefficient
n = Sample size
x = Value of the independent variable
y = Value of the dependent variable
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-10
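As a rough illustration (not part of the original slides), the algebraic form of r can be computed directly; the function name and coding style below are one possible sketch in Python and are reused in later examples:

```python
from math import sqrt

def sample_correlation(x, y):
    """Sample correlation coefficient r via the algebraic form:
    r = [n*Sxy - Sx*Sy] / sqrt([n*Sxx - (Sx)^2] * [n*Syy - (Sy)^2])."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    syy = sum(yi * yi for yi in y)
    return (n * sxy - sx * sy) / sqrt((n * sxx - sx ** 2) * (n * syy - sy ** 2))
```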
Calculation Example
Tree Height (y)   Trunk Diameter (x)   xy     y²      x²
35                8                    280    1225    64
49                9                    441    2401    81
27                7                    189    729     49
33                6                    198    1089    36
60                13                   780    3600    169
21                7                    147    441     49
45                11                   495    2025    121
51                12                   612    2601    144
Σy = 321          Σx = 73              Σxy = 3142    Σy² = 14111    Σx² = 713
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-11


Calculation Example
(continued)

Tree n xy   x  y
Height, r
y 70 [n(  x 2 )  (  x)2 ][n(  y 2 )  (  y)2 ]
60

8(3142)  (73)(321)
50 
40
[8(713)  (73)2 ][8(14111)  (321) 2 ]
30

 0.886
20

10

0
r = 0.886 → relatively strong positive
0 2 4 6 8 10 12 14
linear association between x and y
Trunk Diameter, x

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-12
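A quick numerical check of this slide's result, reusing the sample_correlation sketch above (variable names are illustrative):

```python
diameter = [8, 9, 7, 6, 13, 7, 11, 12]        # trunk diameter, x
height   = [35, 49, 27, 33, 60, 21, 45, 51]   # tree height, y

print(round(sample_correlation(diameter, height), 3))  # ≈ 0.886, as on the slide
```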


Excel Output

Excel Correlation Output


Tools / data analysis / correlation…

                 Tree Height   Trunk Diameter
Tree Height      1
Trunk Diameter   0.886231      1

Correlation between
Tree Height and Trunk Diameter

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-13


Introduction to
Regression Analysis
 Regression analysis is used to:
 Predict the value of a dependent variable based on
the value of at least one independent variable
 Explain the impact of changes in an independent
variable on the dependent variable
Dependent variable: the variable we wish to
explain
Independent variable: the variable used to
explain the dependent variable

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-14


Simple Linear Regression Model

 Only one independent variable, x


 Relationship between x and y is
described by a linear function
 Changes in y are assumed to be caused
by changes in x

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-15


Types of Regression Models
[Four example plots: Positive Linear Relationship, Negative Linear Relationship, Relationship NOT Linear, No Relationship]

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-16


Population Linear Regression

The population regression model:

y = \beta_0 + \beta_1 x + \varepsilon

where:
y  = Dependent variable
β0 = Population y-intercept
β1 = Population slope coefficient
x  = Independent variable
ε  = Random error term (residual)

β0 + β1x is the linear component; ε is the random error component.

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-17


Linear Regression Assumptions

 Error values (ε) are statistically independent


 Error values are normally distributed for any
given value of x
 The probability distribution of the errors is
normal
 The distributions of possible ε values have
equal variances for all values of x
 The underlying relationship between the x
variable and the y variable is linear

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-18


Population Linear Regression
(continued)

y = \beta_0 + \beta_1 x + \varepsilon

[Graph: for a given x_i, the observed value of y, the predicted value of y, and the random error ε_i for that x value; slope = β1, intercept = β0]
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-19
Estimated Regression Model
The sample regression line provides an estimate of the population regression line:

\hat{y}_i = b_0 + b_1 x

where:
ŷi = Estimated (or predicted) y value
b0 = Estimate of the regression intercept
b1 = Estimate of the regression slope
x  = Independent variable

The individual random error terms ei have a mean of zero
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-20


Least Squares Criterion

 b0 and b1 are obtained by finding the values of b0 and b1 that minimize the sum of the squared residuals:

\sum e^2 = \sum (y - \hat{y})^2 = \sum \left(y - (b_0 + b_1 x)\right)^2
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-21


The Least Squares Equation
 The formulas for b1 and b0 are:

b_1 = \frac{\sum (x - \bar{x})(y - \bar{y})}{\sum (x - \bar{x})^2}

or the algebraic equivalents for b1:

b_1 = \frac{\sum xy - \frac{\sum x \sum y}{n}}{\sum x^2 - \frac{(\sum x)^2}{n}} = \frac{n\sum xy - \sum x \sum y}{n\sum x^2 - (\sum x)^2}

and

b_0 = \bar{y} - b_1 \bar{x}
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-22
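A minimal sketch (not from the slides) of how these formulas translate into Python; the function name is illustrative and is reused in later examples:

```python
def least_squares_fit(x, y):
    """Return (b0, b1) for the fitted line y-hat = b0 + b1*x, using
    b1 = [n*Sxy - Sx*Sy] / [n*Sxx - (Sx)^2] and b0 = y-bar - b1*x-bar."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    b1 = (n * sxy - sx * sy) / (n * sxx - sx ** 2)
    b0 = sy / n - b1 * sx / n
    return b0, b1
```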
Interpretation of the
Slope and the Intercept

 b0 is the estimated average value of y when the value of x is zero

 b1 is the estimated change in the average value of y as a result of a one-unit change in x

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-23


Finding the
Least Squares Equation

 The coefficients b0 and b1 will usually be


found using computer software, such as
Excel or Minitab

 Other regression measures will also be


computed as part of computer-based
regression analysis

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-24


Simple Linear Regression
Example
 A real estate agent wishes to examine the
relationship between the selling price of a home
and its size (measured in square feet)

 A random sample of 10 houses is selected


 Dependent variable (y) = house price in $1000s

 Independent variable (x) = square feet

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-25


Sample Data for
House Price Model
House Price in $1000s (y)   Square Feet (x)
245                         1400
312                         1600
279                         1700
308                         1875
199                         1100
219                         1550
405                         2350
324                         2450
319                         1425
255                         1700

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-26


Regression Using Excel
 Data / Data Analysis / Regression

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-27


Excel Output
Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error       41.33032
Observations         10

The regression equation is:
house price = 98.24833 + 0.10977 (square feet)

ANOVA
             df   SS           MS           F         Significance F
Regression   1    18934.9348   18934.9348   11.0848   0.01039
Residual     8    13665.5652   1708.1957
Total        9    32600.5000

              Coefficients   Standard Error   t Stat    P-value   Lower 95%   Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720   232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374     0.18580

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-28
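The intercept and slope in this output can be reproduced with the least_squares_fit sketch above; this is only an illustrative check, with the data keyed in from the sample table:

```python
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]  # x
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]            # y, in $1000s

b0, b1 = least_squares_fit(sqft, price)
print(round(b0, 5), round(b1, 5))  # ≈ 98.24833 and 0.10977, matching the Excel output
```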


Graphical Presentation
 House price model: scatter plot and regression line

[Scatter plot of House Price ($1000s) versus Square Feet with the fitted regression line; intercept = 98.248, slope = 0.10977]

house price = 98.24833 + 0.10977 (square feet)


Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-29
Interpretation of the
Intercept, b0

house price = 98.24833 + 0.10977 (square feet)

 b0 is the estimated average value of Y when the value of X is zero (if x = 0 is in the range of observed x values)
 Here, no houses had 0 square feet, so b0 = 98.24833 just indicates that, for houses within the range of sizes observed, $98,248.33 is the portion of the house price not explained by square feet

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-30


Interpretation of the
Slope Coefficient, b1

house price = 98.24833 + 0.10977 (square feet)

 b1 measures the estimated change in the average value of Y as a result of a one-unit change in X
 Here, b1 = .10977 tells us that the average value of a house increases by .10977($1000) = $109.77, on average, for each additional one square foot of size

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-31


Least Squares Regression
Properties
 The sum of the residuals from the least squares regression line is 0: \sum (y - \hat{y}) = 0
 The sum of the squared residuals is a minimum: \sum (y - \hat{y})^2 is minimized
 The simple regression line always passes through the mean of the y variable and the mean of the x variable
 The least squares coefficients are unbiased estimates of β0 and β1

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-32


Explained and Unexplained
Variation
 Total variation is made up of two parts:

SST = SSE + SSR

SST = Total sum of squares       = \sum (y - \bar{y})^2
SSE = Sum of squares error       = \sum (y - \hat{y})^2
SSR = Sum of squares regression  = \sum (\hat{y} - \bar{y})^2

where:
ȳ = Average value of the dependent variable
y = Observed values of the dependent variable
ŷ = Estimated value of y for the given x value
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-33
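As an illustrative check (reusing the sqft, price and least_squares_fit sketches above), the three sums of squares for the house price data can be computed directly and compared with the ANOVA table:

```python
b0, b1 = least_squares_fit(sqft, price)
y_bar  = sum(price) / len(price)
y_hat  = [b0 + b1 * x for x in sqft]

sst = sum((y - y_bar) ** 2 for y in price)               # total variation
sse = sum((y - yh) ** 2 for y, yh in zip(price, y_hat))  # unexplained variation
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)             # explained variation
print(round(sst, 1), round(sse, 1), round(ssr, 1))       # ≈ 32600.5, 13665.6, 18934.9
```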
Explained and Unexplained
Variation
(continued)

 SST = total sum of squares
 Measures the variation of the yi values around their mean ȳ
 SSE = error sum of squares
 Variation attributable to factors other than the
relationship between x and y
 SSR = regression sum of squares
 Explained variation attributable to the relationship
between x and y

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-34


Explained and Unexplained
Variation
(continued)
[Graph: for an observation y_i at x_i, SSE = Σ(y_i − ŷ_i)², SST = Σ(y_i − ȳ)², and SSR = Σ(ŷ_i − ȳ)² shown as squared vertical distances from the regression line and from the mean ȳ]
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-35
Coefficient of Determination, R2
 The coefficient of determination is the portion
of the total variation in the dependent variable
that is explained by variation in the
independent variable

 The coefficient of determination is also called R-squared and is denoted as R²

R^2 = \frac{SSR}{SST}   where   0 \le R^2 \le 1
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-36
Coefficient of Determination, R2
(continued)
Coefficient of determination:

R^2 = \frac{SSR}{SST} = \frac{\text{sum of squares explained by regression}}{\text{total sum of squares}}

Note: In the single independent variable case, the coefficient of determination is

R^2 = r^2

where:
R² = Coefficient of determination
r = Simple correlation coefficient
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-37
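A small illustrative check of the R² = r² identity for the house price data, reusing the helpers and the ssr/sst values sketched earlier:

```python
r = sample_correlation(sqft, price)
print(round(r ** 2, 5), round(ssr / sst, 5))  # both ≈ 0.58082
```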
Examples of Approximate
R2 Values
R² = 1

[Two scatter plots in which every point lies exactly on the regression line]

Perfect linear relationship between x and y: 100% of the variation in y is explained by variation in x

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-38


Examples of Approximate
R2 Values
(continued)

0 < R² < 1

[Two scatter plots with points scattered around the regression line]

Weaker linear relationship between x and y: some but not all of the variation in y is explained by variation in x
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-39
Examples of Approximate
R2 Values
(continued)

R² = 0

[Scatter plot with a horizontal regression line]

No linear relationship between x and y: the value of y does not depend on x (none of the variation in y is explained by variation in x)

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-40


Excel Output
R^2 = \frac{SSR}{SST} = \frac{18934.9348}{32600.5000} = 0.58082

58.08% of the variation in house prices is explained by variation in square feet

Regression Statistics
Multiple R           0.76211
R Square             0.58082
Adjusted R Square    0.52842
Standard Error       41.33032
Observations         10

ANOVA
             df   SS           MS           F         Significance F
Regression   1    18934.9348   18934.9348   11.0848   0.01039
Residual     8    13665.5652   1708.1957
Total        9    32600.5000

              Coefficients   Standard Error   t Stat    P-value   Lower 95%   Upper 95%
Intercept     98.24833       58.03348         1.69296   0.12892   -35.57720   232.07386
Square Feet   0.10977        0.03297          3.32938   0.01039   0.03374     0.18580

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-41


Example: House Prices

Estimated Regression Equation:

house price = 98.25 + 0.1098 (sq.ft.)

House Price in $1000s (y)   Square Feet (x)
245                         1400
312                         1600
279                         1700
308                         1875
199                         1100
219                         1550
405                         2350
324                         2450
319                         1425
255                         1700

Predict the price for a house with 2000 square feet

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-42


Example: House Prices
(continued)
Predict the price for a house with 2000 square feet:

house price = 98.25 + 0.1098 (sq.ft.)
            = 98.25 + 0.1098 (2000)
            = 317.85

The predicted price for a house with 2000 square feet is 317.85 ($1,000s) = $317,850
Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-43
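A tiny illustrative helper (not from the slides) that applies the rounded equation used on this slide:

```python
def predict_price(square_feet, b0=98.25, b1=0.1098):
    """Predicted house price in $1000s from the fitted line y-hat = b0 + b1*x."""
    return b0 + b1 * square_feet

print(round(predict_price(2000), 2))  # 317.85 -> $317,850
```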
Types of Estimation

 An interpolated estimate is an estimate made within the given data range.
 An extrapolated estimate is an estimate made outside the given data range.
 An interpolated estimate is generally more reliable than an extrapolated estimate.
Using the regression equation for house price, compute Y when X = 1500 and X = 3000.

The regression equation is:
Y' = 98.25 + 0.1098 X

House Price = 98.25 + 0.1098 (square feet)
            = 98.25 + 0.1098 (1500)
            = 262.95 ($'000) = $262,950

House Price = 98.25 + 0.1098 (square feet)
            = 98.25 + 0.1098 (3000)
            = 427.65 ($'000) = $427,650
Types of Estimation
Below is the interpolated estimate:
House Price = 98.25 + 0.1098 (square feet)
            = 98.25 + 0.1098 (1500)
            = 262.95 ($'000) = $262,950

Below is the extrapolated estimate:
House Price = 98.25 + 0.1098 (square feet)
            = 98.25 + 0.1098 (3000)
            = 427.65 ($'000) = $427,650
Types of Estimation

 Why is the estimate at X = 3000 not reliable, whereas the estimate at X = 1500 is more reliable?

 The minimum value of X is 1100 square feet and the maximum value of X is 2450 square feet.

 X = 3000 is outside the given data range, while X = 1500 is within the given data range.
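An illustrative helper (assumed, not from the slides) that classifies a prediction as interpolation or extrapolation relative to the observed range of x, reusing the sqft list from the earlier sketch:

```python
def estimate_type(x_new, x_observed):
    """Classify a prediction as interpolated or extrapolated
    relative to the observed range of x."""
    return "interpolated" if min(x_observed) <= x_new <= max(x_observed) else "extrapolated"

print(estimate_type(1500, sqft))  # interpolated (1100 <= 1500 <= 2450)
print(estimate_type(3000, sqft))  # extrapolated (3000 > 2450)
```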
Chapter Summary

 Introduced correlation analysis
 Discussed correlation to measure the strength of a linear association
 Introduced simple linear regression analysis
 Calculated the coefficients for the simple linear regression equation
 Described measures of variation (R²)
 Discussed types of estimation

Business Statistics: A Decision-Making Approach, 7e © 2008 Prentice-Hall, Inc. Chap 14-48


Class Exercise 1
The Road Transport Department conducted a study to measure
the emergency stopping distances of cars travelling at various
speeds. The results were recorded as follows.
Travelling Speed Stopping Distances
(km per hour) (km)
70 0.12
90 0.17
50 0.06
110 0.23
100 0.21
90 0.19
60 0.08
80 0.15
70 0.14
100 0.19
Class Exercise 1

a) If the Road Transport Department wishes to


influence the stopping distance by controlling the
travelling speed, derive the regression line that
relates the travelling speeds to stopping distances.
b) Compute and interpret the coefficient of
correlation.
c) Compute and interpret the coefficient of
determination.
d) Predict the stopping distances of cars travelling at
120 km per hour. Is this estimate reliable? Why?
Class Exercise 2
A cost accountant derived the following data on the weekly output of standard size boxes from a factory.

Week   Output (in thousands)   Total Cost (in thousands)
1      20                      60
2      2                       25
3      4                       26
4      23                      66
5      18                      49
6      14                      48
7      10                      35
8      8                       18
9      13                      40
10     8                       33
Class Exercise 2

a) Calculate the Product Moment Correlation


Coefficient between the output and total cost.
b) Determine the least square regression line of
total cost on output.
c) What is the fixed cost of the factory?
d) Estimate the total cost for a particular week in
which it is planned to produce 25,000 standard
size boxes. Is this estimate reliable? Why?
