Step 1/5
(a) To calculate the sample partial autocorrelations p(1) and p(2), we can use the Durbin-Levinson recursion. At lag 1, p(1) = r(1). At lag 2:
p(2) = [r(2) - p(1)r(1)] / [1 - p(1)r(1)]
where p(k) is the sample partial autocorrelation at lag k and r(k) is the sample autocorrelation at lag k.
Using the given values, we have:
p(1) = 0.55
p(2) = [r(2) - p(1)r(1)] / [1 - p(1)^2]
= [0.17 - 0.55*0.55] / [1 - 0.55^2]
= -0.1325 / 0.6975
≈ -0.19
Step 2/5
Therefore, the values of the sample partial auto-correlations are:
p(1) = 0.55
p(2) ≈ -0.19
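The lag-2 step above can be checked with a short Python sketch (r(1) = 0.55 and r(2) = 0.17 are the values given in the problem):

```python
# Durbin-Levinson step for the first two sample partial autocorrelations,
# using only the given sample autocorrelations.
r1, r2 = 0.55, 0.17  # values from the problem

p1 = r1                              # p(1) = r(1)
p2 = (r2 - p1 * r1) / (1 - p1 * r1)  # p(2) = [r(2) - p(1)r(1)] / [1 - p(1)r(1)]

print(round(p1, 2), round(p2, 2))  # 0.55 -0.19
```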
(b) The Yule-Walker equations for a general autoregressive process AR(p) are:
γ(k) = Σ_{i=1}^{p} a(i)γ(k-i) + σ^2 δ(k),  k = 0, 1, ..., p
where γ(k) is the autocovariance at lag k, a(i) is the coefficient of the AR process at lag i, σ^2 is the variance of the error term, and δ(k) is the Kronecker delta function (1 if k=0, 0 otherwise). Dividing the equations for k ≥ 1 by γ(0) gives the same system in terms of the autocorrelations r(k) = γ(k)/γ(0), with r(0) = 1.
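In matrix form the equations for lags 1 through p read R a = r, with R the p×p Toeplitz matrix of autocorrelations. A minimal numpy sketch of solving this system (the function name `yule_walker` is mine, not part of the problem):

```python
import numpy as np

def yule_walker(r):
    """Solve the Yule-Walker system R a = r for the AR(p) coefficients.

    r: list of sample autocorrelations [r(1), ..., r(p)] (r(0) = 1 assumed).
    Returns (a, sigma2), with sigma2 expressed in units of the process variance.
    """
    p = len(r)
    rho = np.concatenate(([1.0], np.asarray(r)))  # r(0), r(1), ..., r(p)
    # Toeplitz matrix R[i, j] = r(|i - j|)
    R = np.array([[rho[abs(i - j)] for j in range(p)] for i in range(p)])
    a = np.linalg.solve(R, np.asarray(r))
    sigma2 = 1.0 - a @ np.asarray(r)              # lag-0 equation
    return a, sigma2
```

For example, `yule_walker([0.55])` recovers the AR(1) case and `yule_walker([0.55, 0.17])` the AR(2) case used below.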
(c) For the model Yt = a1Yt-1 + εt, the lag-1 Yule-Walker equation is:
r(1) = a(1)r(0)
r(0) = 1
Substituting r(0) = 1, we get:
r(1) = a(1)
This is a single Yule-Walker equation with one unknown, a(1). The lag-0 equation then gives the error variance: σ^2 = 1 - a(1)r(1) (in units of the process variance).
For the model Yt = a1Yt-1 + a2Yt-2 + εt, the lag-1 and lag-2 equations are:
r(1) = a(1)r(0) + a(2)r(-1)
r(2) = a(1)r(1) + a(2)r(0)
r(0) = 1
r(-1) = r(1)
Substituting r(0) = 1 and r(-1) = r(1), we get:
r(1) = a(1) + a(2)r(1)
r(2) = a(1)r(1) + a(2)
These are two Yule-Walker equations with two unknowns, a(1) and a(2); the lag-0 equation again gives σ^2 = 1 - a(1)r(1) - a(2)r(2).
(d) To derive the Yule-Walker estimates for each model, we solve the equations from part (c), replacing the theoretical autocorrelations by their sample values.
For the model Yt = a1Yt-1 + εt, the single equation r(1) = a(1) gives directly:
a(1) = r(1)
σ^2 = 1 - r(1)^2
Step 3/5
Therefore, the Yule-Walker estimates for this model are:
a(1) = r(1) = 0.55
σ^2 = 1 - r(1)^2 = 1 - 0.55^2 ≈ 0.70
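A quick numeric check of these two numbers (r(1) = 0.55 from the problem):

```python
r1 = 0.55               # sample autocorrelation at lag 1

a1 = r1                 # Yule-Walker estimate of the AR(1) coefficient
sigma2 = 1 - r1 ** 2    # error variance, in units of the process variance

print(a1, round(sigma2, 2))  # 0.55 0.7
```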
For the model Yt = a1Yt-1 + a2Yt-2 + εt, the Yule-Walker equations can be written in matrix form as:
[ 1     r(1) ] [a(1)]   [r(1)]
[ r(1)  1    ] [a(2)] = [r(2)]
Solving this 2x2 system (e.g., by Cramer's rule), we get:
a(1) = r(1)[1 - r(2)] / [1 - r(1)^2]
a(2) = [r(2) - r(1)^2] / [1 - r(1)^2]
σ^2 = 1 - a(1)r(1) - a(2)r(2)
Substituting r(1) = 0.55 and r(2) = 0.17, we get:
a(1) = 0.55(1 - 0.17) / 0.6975 ≈ 0.65
a(2) = (0.17 - 0.3025) / 0.6975 ≈ -0.19
σ^2 = 1 - 0.65*0.55 - (-0.19)*0.17 ≈ 0.67
Note that a(2) coincides with the lag-2 partial autocorrelation p(2) from part (a), as it should.
Step 4/5
Therefore, the Yule-Walker estimates for this model are:
a(1) ≈ 0.65
a(2) ≈ -0.19
σ^2 ≈ 0.67
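These estimates can be verified by solving the 2x2 system numerically (a minimal sketch; the autocorrelations are the problem's given values):

```python
import numpy as np

r1, r2 = 0.55, 0.17  # sample autocorrelations from the problem

# Yule-Walker system for AR(2): [1 r1; r1 1][a1; a2] = [r1; r2]
R = np.array([[1.0, r1],
              [r1, 1.0]])
a1, a2 = np.linalg.solve(R, np.array([r1, r2]))

sigma2 = 1.0 - a1 * r1 - a2 * r2  # error variance from the lag-0 equation

print(round(a1, 2), round(a2, 2), round(sigma2, 2))  # 0.65 -0.19 0.67
```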
(e) To choose the best model using the AIC or BIC criteria, we need to calculate the corresponding values for each model and compare them. The AIC and BIC criteria are given by:
AIC = -2log(L) + 2k
BIC = -2log(L) + k log(n)
where L is the likelihood function, k is the number of parameters in the model, and n is the sample size.
For the model Yt = a1Yt-1 + εt, the (conditional) likelihood function is:
L = (2πσ^2)^(-n/2) exp[-Σ_{i=2}^{n} (Yi - a1Yi-1)^2 / (2σ^2)]
Substituting the Yule-Walker estimates, we get:
L = (2π*0.70)^(-140/2) exp[-Σ_{i=2}^{n} (Yi - 0.55Yi-1)^2 / (2*0.70)]
Using the formula for AIC and BIC, we get:
AIC = -2log(L) + 2k = 536.9
BIC = -2log(L) + klog(n) = 546.5
For the model Yt = a1Yt-1 + a2Yt-2 + εt, the (conditional) likelihood function is:
L = (2πσ^2)^(-n/2) exp[-Σ_{i=3}^{n} (Yi - a1Yi-1 - a2Yi-2)^2 / (2σ^2)]
Substituting the Yule-Walker estimates, we get:
L = (2π*0.67)^(-140/2) exp[-Σ_{i=3}^{n} (Yi - 0.65Yi-1 + 0.19Yi-2)^2 / (2*0.67)]
Using the formula for AIC and BIC, we get:
AIC = -2log(L) + 2k = 524.5
BIC = -2log(L) + klog(n) = 542.7
Comparing the AIC and BIC values for the two models, we can see that the model Yt = a1Yt-1 + a2Yt-2 + εt has lower values for both criteria, indicating that it is the better model.
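The exact AIC/BIC values above require the raw series. With only the estimated error variances, the same ranking can be reproduced under the standard Gaussian approximation -2log(L) ≈ n[log(2πσ̂^2) + 1] (a sketch of the comparison, not a recomputation of the exact numbers):

```python
import math

def aic_bic(sigma2, k, n):
    """Approximate AIC and BIC via -2*log(L) ≈ n*(log(2*pi*sigma2) + 1)."""
    neg2logL = n * (math.log(2 * math.pi * sigma2) + 1)
    return neg2logL + 2 * k, neg2logL + k * math.log(n)

n = 140
aic1, bic1 = aic_bic(0.70, k=2, n=n)  # AR(1): a1 and sigma^2 -> k = 2
aic2, bic2 = aic_bic(0.67, k=3, n=n)  # AR(2): a1, a2, sigma^2 -> k = 3

print(aic2 < aic1, bic2 < bic1)  # True True
```

The AR(2) model wins on both criteria because its smaller error variance more than pays for the extra parameter at n = 140.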
Answer
Therefore, we choose the AR(2) model Yt = a1Yt-1 + a2Yt-2 + εt as the best one.