Chapter 4
∑Yi X1i = β̂0 ∑X1i + [β̂1 + 2β̂2] ∑X1i²
The number of β's to be estimated is greater than
the number of independent equations.
So, if two or more X's are perfectly correlated, it is
not possible to find the estimates for all β's.
i.e., we cannot find β̂1 & β̂2 separately, but only β̂1 + 2β̂2.
α̂ = β̂1 + 2β̂2 = (∑Yi X1i − nX̄1Ȳ) / (∑X1i² − nX̄1²)  &
β̂0 = Ȳ − [β̂1 + 2β̂2] X̄1
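The point that only the combination β̂1 + 2β̂2 is estimable under perfect multicollinearity can be illustrated numerically. A minimal sketch (the variable names and the choice X2 = 2X1 with β1 = 3, β2 = 1 are illustrative assumptions, not from the text):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1000
x1 = rng.normal(size=n)
x2 = 2 * x1                                   # perfect multicollinearity: X2 = 2*X1
y = 1.0 + 3.0 * x1 + 1.0 * x2 + rng.normal(scale=0.1, size=n)
# true alpha = beta1 + 2*beta2 = 3 + 2*1 = 5

X = np.column_stack([np.ones(n), x1, x2])
# X'X is singular: the design matrix has rank 2, not 3,
# so no unique OLS solution for (beta0, beta1, beta2) exists
print(np.linalg.matrix_rank(X))               # 2

# lstsq picks the minimum-norm solution among the infinitely many;
# beta1-hat and beta2-hat individually are arbitrary, but the
# estimable combination beta1 + 2*beta2 is pinned down by the data
b, *_ = np.linalg.lstsq(X, y, rcond=None)
alpha_hat = b[1] + 2 * b[2]
print(alpha_hat)                              # close to 5
```

Any solution of the normal equations yields the same fitted values, so b[1] + 2*b[2] is the same α̂ for all of them, matching the formula above.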
4.3 Multicollinearity
var(β̂j) = σ² / [(1 − Rj²) ∑xji²] = VIFj · (σ² / ∑xji²),
where Rj² is the R² from regressing Xj on all other X's, and VIFj = 1/(1 − Rj²).
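The VIF can be computed directly from these auxiliary regressions. A small sketch (the `vif` helper and the simulated near-collinear data are illustrative assumptions):

```python
import numpy as np

def vif(X):
    """VIF_j = 1/(1 - R_j^2), where R_j^2 is the R^2 from
    regressing column j of X on all the other columns."""
    n, k = X.shape
    out = np.empty(k)
    for j in range(k):
        xj = X[:, j]
        others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
        coef, *_ = np.linalg.lstsq(others, xj, rcond=None)
        resid = xj - others @ coef
        r2 = 1 - resid.var() / xj.var()       # R^2 of the auxiliary regression
        out[j] = 1 / (1 - r2)
    return out

rng = np.random.default_rng(1)
x1 = rng.normal(size=500)
x2 = x1 + rng.normal(scale=0.1, size=500)     # nearly collinear with x1
x3 = rng.normal(size=500)                     # unrelated to x1, x2
v = vif(np.column_stack([x1, x2, x3]))
print(v)   # first two VIFs large, third near 1
```

High VIFs for X1 and X2 flag the inflated variances of β̂1 and β̂2, while the VIF for the uncorrelated X3 stays near 1.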
4.5.1 Heteroskedasticity
B. A Formal Test:
The most commonly used test for heteroskedasticity is the Breusch-Pagan (BP) test.
H0: homoskedasticity vs. Ha: heteroskedasticity
Regress ũ² on Ŷ, or ũ² on the original X's, their squares (X²'s) and, if there are enough data, the cross-products of the X's.
H0 will be rejected for large values of the test statistic [nR² ~ χ²q] or for small p-values.
Here n & R² are obtained from the auxiliary regression of ũ² on the q predictors.
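The steps of the BP test can be carried out by hand. A sketch under assumed simulated data (the heteroskedasticity pattern, sample size, and variable names are illustrative, not from the text):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 500
x = rng.uniform(1, 5, size=n)
u = rng.normal(scale=x)                       # error variance grows with x
y = 2.0 + 0.5 * x + u

# Step 1: OLS of y on x; keep the squared residuals u-tilde^2
X = np.column_stack([np.ones(n), x])
b, *_ = np.linalg.lstsq(X, y, rcond=None)
u2 = (y - X @ b) ** 2

# Step 2: auxiliary regression of u-tilde^2 on the q = 1 predictor x
g, *_ = np.linalg.lstsq(X, u2, rcond=None)
fitted = X @ g
r2 = 1 - ((u2 - fitted) ** 2).sum() / ((u2 - u2.mean()) ** 2).sum()

# Step 3: BP statistic n*R^2, chi-square with q = 1 d.o.f. under H0
bp = n * r2
p = stats.chi2.sf(bp, df=1)
print(bp, p)   # large statistic, small p-value -> reject homoskedasticity
```

With homoskedastic errors the same statistic would typically fall below the χ²₁ critical value and H0 would not be rejected.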
This assumes the conditional mean, E(Y|X) = β0 + β1Xi, was correctly specified.