Models For Stationary Time Series
The current value of the series, $Y_t$, is a linear combination of the $p$ most recent past values of itself, plus an 'innovation' term $Z_t$ that incorporates everything new in the series at time $t$ that is not explained by the past values.
Thus, for every $t$, we assume that $Z_t$ is independent of $Y_{t-1}, Y_{t-2}, \dots$
Alternatively, the AR($p$) process can be written as
$$Y_t = \alpha_1 Y_{t-1} + \alpha_2 Y_{t-2} + \dots + \alpha_p Y_{t-p} + Z_t$$
where $Z_t \sim N(0, \sigma_z^2)$ and is uncorrelated with $Y_{t-1}, Y_{t-2}, \dots$
5.2.1 AR(1) Model
Assume that the series is stationary, and that a current value of the series is linearly dependent upon its previous value, with some error. This linear relationship is
$$Y_t = \alpha Y_{t-1} + Z_t \qquad (5.2.2)$$
where $Z_t$ is white noise; that is, the $Z_t$'s are a sequence of uncorrelated random variables, possibly but not necessarily normally distributed, with mean 0 and variance $\sigma_z^2$.
This model is called an autoregressive model of order 1, or AR(1). Here, we assume that the process mean has been subtracted out so that the series mean is zero.
(a) Mean: $E(Y_t) = E(\alpha Y_{t-1} + Z_t) = 0$
(b) Variance: taking the variance of both sides of (5.2.2), and using the fact that $Z_t$ is uncorrelated with $Y_{t-1}$,
$$\gamma_0 = \alpha^2 \gamma_0 + \sigma_z^2$$
Solving for $\gamma_0$:
$$\gamma_0 = \frac{\sigma_z^2}{1-\alpha^2} \qquad (5.2.3)$$
Multiplying (5.2.2) by $Y_{t-k}$ and taking expectations gives $\gamma_k = \alpha\gamma_{k-1}$, so:
when $k = 1$: $\gamma_1 = \alpha\gamma_0 = \dfrac{\alpha\sigma_z^2}{1-\alpha^2}$
when $k = 2$: $\gamma_2 = \alpha\gamma_1 = \dfrac{\alpha^2\sigma_z^2}{1-\alpha^2}$
etc.
In general:
$$\gamma_k = \alpha^k \frac{\sigma_z^2}{1-\alpha^2}$$
$$\rho_k = \frac{\gamma_k}{\gamma_0} = \alpha^k \quad \text{for } k = 1, 2, 3, \dots$$
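The closed form $\rho_k = \alpha^k$ can be checked numerically in R. The base function ARMAacf() computes the theoretical ACF of an ARMA model; $\alpha = 0.9$ below is an arbitrary illustrative choice.

```r
# Check the AR(1) result rho_k = alpha^k against the theoretical ACF
# computed by stats::ARMAacf(); alpha = 0.9 is an arbitrary choice.
alpha <- 0.9
rho <- ARMAacf(ar = alpha, lag.max = 5)       # rho_0, rho_1, ..., rho_5
isTRUE(all.equal(unname(rho), alpha^(0:5)))   # TRUE
```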
Remarks
Since $|\alpha| < 1$, the magnitude of the ACF decreases exponentially as the number of lags $k$ increases.
Several autocorrelation functions are displayed below. Notice that for $\alpha$ near $\pm 1$ the exponential decay is quite slow ($0.9^6 \approx 0.53$), while for smaller $\alpha$ the decay is quite rapid ($0.4^6 \approx 0.0041$). With $|\alpha|$ near 1, strong correlation will extend over many lags, producing a relatively smooth series if $\alpha$ is positive and a very jagged series if $\alpha$ is negative.
[Figure: theoretical ACFs of AR(1) processes with α = 0.9, α = 0.4, α = −0.8, and α = −0.5]
The diagrams below show a plot of a simulated AR(1) process with $\alpha = 0.9$. Notice how infrequently the series crosses its theoretical mean of zero. Also note how it 'hangs together', remaining on the same side of the mean for extended periods. One might claim that the series has several trends, but in fact the theoretical mean is zero for all time points.
[Figure: time plot of the simulated AR(1) series, and lag plots of Yt versus Yt−1, Yt−2, and Yt−3]
The smoothness of the series and the strong autocorrelation at lag 1 are seen in the lag plot of $Y_t$ versus $Y_{t-1}$.
R-code
library(TSA)
set.seed(12345)
ar1.s <- arima.sim(model = list(ar = c(0.9)), n = 100)
par(mfrow = c(2, 2))
plot(ar1.s, ylab = expression(Y[t]), xlab = "Time", col = "blue", type = 'o')
abline(h = 0, lty = 2)
plot(y = ar1.s, x = zlag(ar1.s, d = 1), ylab = expression(Y[t]), xlab = expression(Y[t-1]), type = 'p', col = 2)
plot(y = ar1.s, x = zlag(ar1.s, d = 2), ylab = expression(Y[t]), xlab = expression(Y[t-2]), type = 'p', col = 3)
plot(y = ar1.s, x = zlag(ar1.s, d = 3), ylab = expression(Y[t]), xlab = expression(Y[t-3]), type = 'p', col = 4)
Worked Illustration 1
Consider the AR(1) process
$$Y_t = \alpha Y_{t-1} + Z_t \qquad (5.2.5)$$
i.e. $\phi(B) Y_t = Z_t$ where $\phi(B) = 1 - \alpha B$.
Stationarity:
$$p(\lambda) = \lambda - \alpha = 0 \Rightarrow \lambda = \alpha, \quad \text{i.e. } |\alpha| < 1$$
$$\Rightarrow \rho_k = A \cdot \alpha^k \quad \text{(solution of the LDE)}$$
When $k = 1$, the first Yule-Walker equation becomes
$$\rho_1 = \alpha \rho_0 \Rightarrow \rho_1 = \alpha$$
Wold equations:
$$\rho_k - \alpha \rho_{k+1} = a_k \frac{\sigma_z^2}{\sigma_y^2} \quad \text{for } k = 0, 1, 2, \dots$$
1st Wold equation ($k = 0$, with $a_0 = 1$):
$$1 - \alpha \rho_1 = \frac{\sigma_z^2}{\sigma_y^2} \;\;\therefore\;\; \gamma_0 = \sigma_y^2 = \frac{\sigma_z^2}{1-\alpha^2} \quad \text{using } \rho_1 = \alpha$$
Wold's Theorem
$$Y_t = \sum_{i=0}^{\infty} b_i Z_{t-i}, \quad \text{where } Z_t \sim N(0, \sigma_z^2) \text{ and } \sum_{i=0}^{\infty} b_i^2 < \infty$$
The correct 'model' for any covariance-stationary series is some infinite distributed lag of white noise (the Wold representation).
5.2.2 AR(2) Model
Consider the series
$$Y_t = \alpha_1 Y_{t-1} + \alpha_2 Y_{t-2} + Z_t \qquad (5.2.6)$$
i.e. $\phi(B) Y_t = Z_t$ where $\phi(B) = 1 - \alpha_1 B - \alpha_2 B^2$.
The AR characteristic equation is $p(\lambda) = \lambda^2 - \alpha_1\lambda - \alpha_2 = 0$, with roots
$$\lambda = \frac{\alpha_1 \pm \sqrt{\alpha_1^2 + 4\alpha_2}}{2}$$
For stationarity, we require $|\lambda_i| < 1$; equivalently, the roots of $\phi(B) = 0$ must lie outside the unit circle, that is, exceed 1 in absolute value. This gives the triangular stationarity region shown below.
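As a sketch, this stationarity condition can be checked in R by finding the roots of the characteristic polynomial with polyroot(); the parameter values below are arbitrary illustrations.

```r
# Stationarity check for an AR(2): all roots of the characteristic
# polynomial lambda^2 - a1*lambda - a2 = 0 must lie inside the unit circle.
# polyroot(c(c0, c1, c2)) solves c0 + c1*x + c2*x^2 = 0.
ar2_stationary <- function(a1, a2) {
  all(Mod(polyroot(c(-a2, -a1, 1))) < 1)
}
ar2_stationary(0.5, 0.3)   # TRUE: inside the triangular region
ar2_stationary(0.5, 0.6)   # FALSE: alpha1 + alpha2 > 1
```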
[Figure: triangular stationarity region in the $(\alpha_1, \alpha_2)$-plane; the roots are real if $\alpha_1^2 + 4\alpha_2 > 0$ and complex if $\alpha_1^2 + 4\alpha_2 < 0$]
Note: $\phi(B) = (1 - \lambda_1 B)(1 - \lambda_2 B)$, so the roots of $\phi(B) = 0$ are $B = \dfrac{1}{\lambda_i}$.
To derive the autocorrelation function, multiply both sides of equation (5.2.6) by $Y_{t-k}$ and take expectations, giving $\gamma_k = \alpha_1\gamma_{k-1} + \alpha_2\gamma_{k-2}$; dividing through by $\gamma_0$:
$$\rho_k = \alpha_1\rho_{k-1} + \alpha_2\rho_{k-2} \quad \text{for } k = 1, 2, 3, \dots \qquad (5.2.8)$$
As a general linear process, we can get the $a_i$'s either from
$$(1 - \alpha_1 B - \alpha_2 B^2)^{-1} = 1 + \alpha_1 B + (\alpha_1^2 + \alpha_2) B^2 + \dots$$
or from
$$(1 - \lambda_1 B)^{-1}(1 - \lambda_2 B)^{-1} = 1 + (\lambda_1 + \lambda_2) B + (\lambda_1^2 + \lambda_1\lambda_2 + \lambda_2^2) B^2 + \dots$$
Yule-Walker equations
$$\rho_k = \alpha_1\rho_{k-1} + \alpha_2\rho_{k-2} \quad \text{for } k = 1, 2, 3, \dots$$
$$\Rightarrow \rho_k = A_1\lambda_1^k + A_2\lambda_2^k$$
So:
$$\rho_1 = \frac{\alpha_1}{1-\alpha_2} \quad \text{and} \quad \rho_2 = \frac{\alpha_1^2 + \alpha_2(1-\alpha_2)}{1-\alpha_2}$$
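These two formulas can be confirmed numerically with ARMAacf(); $\alpha_1 = 0.5$, $\alpha_2 = 0.3$ below are arbitrary stationary parameter values.

```r
# Check rho_1 = a1/(1 - a2) and the rho_2 formula against the theoretical
# ACF from stats::ARMAacf(); a1 = 0.5, a2 = 0.3 are arbitrary choices.
a1 <- 0.5; a2 <- 0.3
rho <- ARMAacf(ar = c(a1, a2), lag.max = 2)
rho1 <- a1 / (1 - a2)
rho2 <- (a1^2 + a2 * (1 - a2)) / (1 - a2)
isTRUE(all.equal(unname(rho[c("1", "2")]), c(rho1, rho2)))   # TRUE
```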
Alternative:
Take the variance of both sides of the AR(2) process in equation (5.2.6):
$$\mathrm{Var}(Y_t) = \mathrm{Var}(\alpha_1 Y_{t-1}) + \mathrm{Var}(\alpha_2 Y_{t-2}) + 2\,\mathrm{Cov}(\alpha_1 Y_{t-1}, \alpha_2 Y_{t-2}) + \mathrm{Var}(Z_t)$$
Below is a plot of a simulated AR(2) process.
[Figure: time plot of a simulated AR(2) process]
Notice the periodic behaviour of the process; this will also be seen in the
correlogram.
5.2.3 AR(p) Model
The general autoregressive process of order $p$ is given by
$$Y_t = \alpha_1 Y_{t-1} + \dots + \alpha_p Y_{t-p} + Z_t, \quad \text{or} \quad Y_t = \sum_{i=1}^{p} \alpha_i Y_{t-i} + Z_t \qquad (5.2.10)$$
Theoretical Properties
(i). Stationarity
Unlike the MA model, there are restrictions on the parameters $\alpha_i$, $i = 1, \dots, p$, for the process to be stationary.
The AR characteristic equation is
$$\lambda^p - \alpha_1\lambda^{p-1} - \alpha_2\lambda^{p-2} - \dots - \alpha_{p-1}\lambda - \alpha_p = 0$$
Complementary solution: $C_t = \sum_{i=1}^{p} A_i \lambda_i^t$
Particular solution (inverted form): $I_t = \phi^{-1}(B) Z_t = \sum_{i=0}^{\infty} a_i Z_{t-i}$ with $a_0 = 1$.
Recall: the $\lambda_i$'s are the roots of $p(\lambda) = 0$ and are the reciprocals of the roots of $\phi(B) = 0$, since $p(\lambda) = \lambda^p \phi\!\left(\tfrac{1}{\lambda}\right)$. So the roots of $\phi(B) = 0$ lying outside the unit circle, i.e. $|\lambda_i| < 1$, also means that $\sum a_i^2 < \infty$ in the inverted general linear process form, which is required for stationarity.
(ii). Inverted form: $Y_t = \phi^{-1}(B) Z_t$
Suppose $Y_t = a_0 + \alpha Y_{t-1} + Z_t$. Then
$$\begin{aligned}
Y_t &= a_0 + \alpha(a_0 + \alpha Y_{t-2} + Z_{t-1}) + Z_t \\
&= a_0(1+\alpha) + \alpha^2 Y_{t-2} + \alpha Z_{t-1} + Z_t \\
&= a_0(1+\alpha) + \alpha^2(a_0 + \alpha Y_{t-3} + Z_{t-2}) + \alpha Z_{t-1} + Z_t \\
&\;\;\vdots \\
&\Rightarrow Y_t = \frac{a_0}{1-\alpha} + Z_t + \alpha Z_{t-1} + \alpha^2 Z_{t-2} + \dots
\end{aligned}$$
Alternative
$$Y_t - \alpha Y_{t-1} = a_0 + Z_t$$
$$(1 - \alpha B) Y_t = a_0 + Z_t$$
$$Y_t = \frac{a_0}{1-\alpha} + (1 - \alpha B)^{-1} Z_t$$
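For an AR(1), the inverted-form weights are $a_i = \alpha^i$; the base R function ARMAtoMA() computes these weights directly from the AR polynomial, which provides a quick check.

```r
# The inverted (general linear process) form of an AR(1) has weights
# a_i = alpha^i on Z_{t-i}; stats::ARMAtoMA() computes these weights.
psi <- ARMAtoMA(ar = 0.9, lag.max = 5)   # a_1, ..., a_5
isTRUE(all.equal(psi, 0.9^(1:5)))        # TRUE
```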
(iii). Mean
$E(Y_t) = 0$, since we subtracted the mean from the series.
We could simply introduce a non-zero mean: $\phi(B)(Y_t - \mu) = Z_t$.
Equivalently, $\phi(B) Y_t = v + Z_t$, where
$$v = \phi(B)\mu = \phi(1)\mu = \left(1 - \sum_{i=1}^{p} \alpha_i\right)\mu$$
that is, the constant is $a_0 = \mu(1 - \alpha_1 - \alpha_2 - \dots - \alpha_p)$.
(iv). Variance
Multiply equation (5.2.10) by $Y_{t+k}$, $k = 0, 1, 2, \dots$, and take expectations:
$$\gamma_k - \sum_{i=1}^{p} \alpha_i \gamma_{k+i} = E(Y_{t+k} Z_t)$$
For the RHS we use the inverted form for $Y_{t+k}$, which gives $E(Y_{t+k} Z_t) = a_k \sigma_z^2$:
$$\gamma_k - \sum_{i=1}^{p} \alpha_i \gamma_{k+i} = a_k \sigma_z^2, \quad a_k \text{ a function of the } \alpha_i\text{'s}$$
Divide by $\gamma_0$:
$$\rho_k - \sum_{i=1}^{p} \alpha_i \rho_{k+i} = a_k \frac{\sigma_z^2}{\sigma_y^2} \quad \text{for } k = 0, 1, 2, \dots$$
When $k = 0$ (with $a_0 = 1$):
$$\rho_0 - \sum_{i=1}^{p} \alpha_i \rho_i = \frac{\sigma_z^2}{\sigma_y^2} \;\;\therefore\;\; \sigma_y^2 = \frac{\sigma_z^2}{1 - \sum_{i=1}^{p} \alpha_i \rho_i}$$
(v). Autocovariance and Autocorrelation
We cannot consider the variance separately. Why? Because $Y_t$ is dependent on $Z_t, Z_{t-1}, \dots$ but independent of $Z_{t+1}, Z_{t+2}, \dots$
To obtain $\gamma_k$ and $\rho_k$, multiply both sides of equation (5.2.10) by $Y_{t-k}$ for $k = 1, 2, 3, \dots$ and take expectations:
$$\gamma_k = \sum_{i=1}^{p} \alpha_i \gamma_{k-i} \quad \text{for } k = 1, 2, 3, \dots$$
Divide by $\gamma_0$:
$$\rho_k = \sum_{i=1}^{p} \alpha_i \rho_{k-i} \quad \text{for } k = 1, 2, 3, \dots$$
These are the Yule-Walker equations and are the homogeneous form of the LDE of the process, with general solution
$$\rho_k = \sum_{i=1}^{p} A_i \lambda_i^k$$
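The Yule-Walker equations can also be run in reverse: given the ACF, solving the linear system recovers the AR coefficients. A minimal sketch for $p = 2$, with arbitrary assumed coefficients:

```r
# Recover AR(2) coefficients from the ACF by solving the Yule-Walker
# equations rho_k = a1*rho_{k-1} + a2*rho_{k-2}, k = 1, 2, in matrix form.
a <- c(0.5, 0.3)                      # assumed true coefficients
rho <- ARMAacf(ar = a, lag.max = 2)   # theoretical rho_0, rho_1, rho_2
R <- matrix(c(rho["0"], rho["1"],
              rho["1"], rho["0"]), nrow = 2)   # Toeplitz matrix of rho's
a_hat <- solve(R, rho[c("1", "2")])
isTRUE(all.equal(unname(a_hat), a))   # TRUE
```

In practice the same system is solved with sample autocorrelations in place of the theoretical ones, which gives the Yule-Walker estimates of the $\alpha_i$'s.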
(vi). Identification
The PACF 'cuts off' after lag $p$; that is, the number of non-zero partial autocorrelations gives the order of the AR model. (The 'order of the model' means the most extreme lag of the series that is used as a predictor.)
The ACF does not 'cut off', but instead tapers toward zero.
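This identification pattern can be sketched with a simulation: for an AR(2) series the sample PACF should be negligible beyond lag 2. The parameter values below are arbitrary.

```r
# Sample PACF of a simulated AR(2): the first two partial autocorrelations
# are large, and those beyond lag p = 2 are near zero.
set.seed(123)
y <- arima.sim(model = list(ar = c(0.5, 0.3)), n = 500)
pp <- pacf(y, lag.max = 20, plot = FALSE)$acf
round(pp[1:4], 2)   # first two lags clearly non-zero, the rest near zero
```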
[Figure: sample ACF (tailing off) and sample PACF (cutting off) of a simulated AR process, for lags 1 to 20]
5.2.4 Random Walk Model
The random walk hypothesis is a financial theory stating that stock market prices evolve according to a random walk and thus cannot be predicted. The random walk model is
$$Y_t = Y_{t-1} + Z_t, \quad Z_t \sim N(0, \sigma_z^2)$$
The random walk is not weakly stationary; we therefore call it a unit-root nonstationary time series.
The random walk model is the model behind the naïve forecasting method.
It has long periods of apparent trends up and down, with sudden and unpredictable changes in direction.
If we introduce a constant, then the model becomes a random walk with drift:
$$Y_t = \mu + Y_{t-1} + Z_t$$
The term $\mu$ is the time trend of the series and is referred to as the drift of the model. For example:
$$Y_1 = \mu + Y_0 + Z_1$$
$$Y_2 = \mu + Y_1 + Z_2 = 2\mu + Y_0 + Z_2 + Z_1$$
$$\vdots$$
$$Y_t = t\mu + Y_0 + Z_t + Z_{t-1} + \dots + Z_1$$
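The expansion above can be verified by simulation: building the series recursively from $Y_t = \mu + Y_{t-1} + Z_t$ gives the same path as the closed form $Y_t = t\mu + Y_0 + Z_t + \dots + Z_1$. Here $\mu = 0.2$ and $Y_0 = 0$ are arbitrary choices.

```r
# A random walk with drift built two ways: recursively, and directly from
# the expansion Y_t = t*mu + Y_0 + Z_t + ... + Z_1 (with Y_0 = 0).
set.seed(1)
n <- 100; mu <- 0.2
z <- rnorm(n)                        # Z_1, ..., Z_n
y_direct <- mu * (1:n) + cumsum(z)   # closed-form expansion
y_rec <- numeric(n); prev <- 0
for (t in 1:n) {
  prev <- mu + prev + z[t]           # Y_t = mu + Y_{t-1} + Z_t
  y_rec[t] <- prev
}
isTRUE(all.equal(y_direct, y_rec))   # TRUE
```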
Example 5.2.1
(i). Show that the AR(2) process $Y_t = -0.5Y_{t-1} + 0.14Y_{t-2} + Z_t$ is stationary.
(ii). Show that its autocorrelation function is
$$\rho_k = \frac{17}{129}(0.2)^k + \frac{112}{129}(-0.7)^k, \quad k = 0, 1, 2, \dots$$
Solution 5.2.1
(i) The AR characteristic equation is
$$\lambda^2 + 0.5\lambda - 0.14 = 0, \quad \text{i.e.} \quad 100\lambda^2 + 50\lambda - 14 = 0$$
with roots $\lambda = \tfrac{1}{5}$ and $\lambda = -\tfrac{7}{10}$. Both roots are less than 1 in absolute value, so the process is stationary.
(ii) Hence
$$\rho_k = A\left(\tfrac{1}{5}\right)^k + B\left(-\tfrac{7}{10}\right)^k \qquad (2)$$
The first Yule-Walker equation, $\rho_1 = -0.5 + 0.14\rho_1$, gives
$$\rho_1(1 - 0.14) = -0.5 \;\;\therefore\;\; \rho_1 = \frac{-25}{43}$$
From (2):
$k = 0$: $\rho_0 = A + B \Rightarrow A + B = 1$
$k = 1$: $\tfrac{1}{5}A - \tfrac{7}{10}B = -\tfrac{25}{43}$
Solving: $A = \dfrac{17}{129}$, $B = \dfrac{112}{129}$
$$\therefore \rho_k = \frac{17}{129}\left(\frac{1}{5}\right)^k + \frac{112}{129}\left(-\frac{7}{10}\right)^k$$
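The derived ACF can be checked against the theoretical ACF computed by ARMAacf() for the same AR(2) coefficients:

```r
# Check the derived ACF rho_k = (17/129)(0.2)^k + (112/129)(-0.7)^k
# against the theoretical ACF of Y_t = -0.5*Y_{t-1} + 0.14*Y_{t-2} + Z_t.
k <- 0:5
rho_formula <- (17/129) * 0.2^k + (112/129) * (-0.7)^k
rho_numeric <- ARMAacf(ar = c(-0.5, 0.14), lag.max = 5)
isTRUE(all.equal(unname(rho_numeric), rho_formula))   # TRUE
```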
[Figure: plot of $\rho_k$ for $k = 0, \dots, 10$]
Example 5.2.2
Calculate the mean and variance for the following processes, where $Z_t \sim N(0, \sigma_z^2)$:
(a) $Y_t = 4.5224 + 0.6909Y_{t-1} + Z_t$
(b) $Y_t = 68 - 0.5Y_{t-2} + Z_t$
Solution 5.2.2
(a) E (Yt ) = 4.5224 + 0.6909 E (Yt −1 )
µ (1 − 0.6909 ) = 4.5224
4.5224
µ= = 14.6309
0.3091
γ 0 = 0.69092 γ 0 + σ z2
σ z2
γ0 = = 1.9133σ z2
0.52267
E (Yt ) = 68 − E ( 0.5Yt −2 )
µ = 68 − 0.5µ
23
68 136
µ= = = 45.3333
(1 + 0.5) 3
Var (Yt ) = Var ( 68 − 0.5Yt −2 + Zt )
σ z24
γ0 = = σ z2 = 1.3333σ z2
0.75 3
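The arithmetic of this solution is quickly reproduced in R (variances expressed in units of $\sigma_z^2$):

```r
# Reproduce the arithmetic of Solution 5.2.2; the variances are reported
# as gamma_0 / sigma_z^2.
mu_a <- 4.5224 / (1 - 0.6909)   # mean of process (a)
g0_a <- 1 / (1 - 0.6909^2)      # gamma_0 / sigma_z^2 for (a)
mu_b <- 68 / (1 + 0.5)          # mean of process (b)
g0_b <- 1 / (1 - 0.5^2)         # gamma_0 / sigma_z^2 for (b)
round(c(mu_a, g0_a, mu_b, g0_b), 4)   # 14.6309 1.9133 45.3333 1.3333
```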
Example 5.2.3
Consider the output from R below. Write down the estimated model.
Coefficients:
         ar1     mean
      0.6909  14.6309
s.e.  0.1094   0.5840
Solution 5.2.3
The model can be written in the form $(Y_t - \mu) = \alpha(Y_{t-1} - \mu) + Z_t$, i.e. $Y_t = \mu(1 - \alpha) + \alpha Y_{t-1} + Z_t$. Substituting $\hat{\mu} = 14.6309$ and $\hat{\alpha} = 0.6909$:
$$Y_t = 4.5224 + 0.6909Y_{t-1} + Z_t$$
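A sketch of how output of this kind is produced: simulate an AR(1) series (the true values $\alpha = 0.9$ and mean 14.6 below are arbitrary) and fit it with arima(). The estimates carry standard errors, so they only approximate the true values.

```r
# Fit an AR(1) with a non-zero mean to simulated data; arima() reports
# the AR coefficient and the intercept (the process mean).
set.seed(12345)
y <- arima.sim(model = list(ar = 0.9), n = 200) + 14.6
fit <- arima(y, order = c(1, 0, 0))
coef(fit)   # ar1 near 0.9, intercept near 14.6
```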