STA222
Course Outline
i. An introduction to time series in the time domain and the spectral domain.
ii. Estimation of trends and seasonal effects.
iii. Autoregressive moving average models.
iv. Forecasting.
v. Indicators
vi. Harmonic analysis
vii. Spectra.
The methods of time series analysis pre-date those for general stochastic processes and Markov Chains. The aims
of time series analysis are to describe and summarise time series data, fit low-dimensional models, and make
forecasts.
We write our real-valued series of observations as ...,X−2,X−1,X0,X1,X2,..., a doubly infinite sequence of real-
valued random variables indexed by Z.
The classical decomposition describes the series in terms of four elements: Trend (Tt) — long-term change in the mean level; Seasonal effects (It) — cyclical fluctuations related to the calendar; Cycles (Ct) — other cyclical fluctuations (such as business cycles); Residuals (Et) — other random or systematic fluctuations.
The idea is to create separate models for these four elements and then combine them, either additively
Xt = Tt + It + Ct + Et
or multiplicatively
Xt = Tt · It · Ct · Et .
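As a rough illustration of the additive form, the Python sketch below estimates Tt with a symmetric moving average and It as per-season averages of the detrended values; the toy series, the window length and the period of 12 are arbitrary choices made for this example and are not part of the notes.

import numpy as np

def additive_decomposition(x, period):
    """Rough additive decomposition X_t = T_t + I_t + E_t.

    Trend: symmetric moving average over period+1 points (a simple stand-in
    for the usual centred average). Seasonal effect: mean of the detrended
    series within each season, re-centred to sum to zero over one period.
    Cycles C_t are not separated out in this simple sketch.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)

    # Symmetric moving-average trend (NaN at the ends where the window does not fit).
    trend = np.full(n, np.nan)
    half = period // 2
    for t in range(half, n - half):
        trend[t] = x[t - half:t + half + 1].mean()

    detrended = x - trend
    seasonal = np.array([np.nanmean(detrended[s::period]) for s in range(period)])
    seasonal -= seasonal.mean()              # centre the seasonal effects
    seasonal_full = np.resize(seasonal, n)   # repeat cyclically over the whole series

    residual = x - trend - seasonal_full
    return trend, seasonal_full, residual

# Toy monthly-style series: linear trend + seasonal cycle + noise.
rng = np.random.default_rng(0)
t = np.arange(120)
x = 0.5 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 2, size=t.size)
trend, seasonal, resid = additive_decomposition(x, period=12)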
A process is said to be strongly (strictly) stationary if, for all k, all time points t1,...,tk and every shift h,
(Xt1,...,Xtk) =D (Xt1+h,...,Xtk+h),
that is, the joint distributions are unchanged by shifting the time origin.
Remarks.
2. If the process is Gaussian, that is, if (Xt1,...,Xtk) is multivariate normal for all t1,...,tk, then weak stationarity implies strong stationarity.
The autoregressive process of order p, denoted AR(p), is defined by
Xt = Σ_{r=1}^{p} φr Xt−r + ǫt   (1.1)
where φ1,...,φp are fixed constants and {ǫt} is a sequence of independent (or uncorrelated) random variables with mean 0 and variance σ².
The AR(1) process is defined by
Xt = φ1Xt−1 + ǫt .   (1.2)
Successive substitution gives Xt = ǫt + φ1ǫt−1 + φ1²ǫt−2 + ··· (for |φ1| < 1). The fact that {Xt} is second order stationary follows from the observation that E(Xt) = 0 and that the autocovariances
γk = σ²φ1^k/(1 − φ1²),   k = 0, 1, 2, ...,
depend only on the lag k.
There is an easier way to obtain these results. Multiply equation (1.2) by Xt−k and take the expected value, to give γk = φ1γk−1 for k ≥ 1, since E(ǫtXt−k) = 0. Squaring (1.2) and taking expectations gives γ0 = φ1²γ0 + σ², and so γ0 = σ²/(1 − φ1²).
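A quick simulation illustrates these formulas for the AR(1) case; the values of φ1, σ and the sample size below are arbitrary illustrative choices, not part of the notes.

import numpy as np

# Simulate an AR(1) process X_t = phi1 * X_{t-1} + eps_t and compare the
# sample variance and autocorrelations with the theoretical values
# gamma_0 = sigma^2 / (1 - phi1^2) and rho_k = phi1^k.
rng = np.random.default_rng(1)
phi1, sigma, n = 0.7, 1.0, 100_000

eps = rng.normal(0.0, sigma, size=n)
x = np.empty(n)
x[0] = eps[0]
for t in range(1, n):
    x[t] = phi1 * x[t - 1] + eps[t]

x = x - x.mean()
gamma0_hat = np.mean(x * x)
print("gamma_0:", gamma0_hat, "theory:", sigma**2 / (1 - phi1**2))

for k in range(1, 4):
    rho_k_hat = np.mean(x[k:] * x[:-k]) / gamma0_hat
    print(f"rho_{k}:", rho_k_hat, "theory:", phi1**k)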
More generally, write the AR(p) process (1.1) as
Xt = φ1Xt−1 + φ2Xt−2 + ··· + φpXt−p + ǫt .   (1.3)
Again, the autocorrelation function can be found by multiplying (1.3) by Xt−k, taking the expected value and dividing by γ0, thus producing the Yule-Walker equations
ρk = φ1ρk−1 + φ2ρk−2 + ··· + φpρk−p ,   k = 1, 2, ....
These are linear recurrence relations, with general solution of the form
ρk = C1ω1^|k| + ··· + Cpωp^|k| ,
where ω1,...,ωp are the roots of
ω^p = φ1ω^{p−1} + φ2ω^{p−2} + ··· + φp ,
and C1,...,Cp are determined by ρ0 = 1 and the equations for k = 1,...,p − 1. It is natural to require γk → 0 as k → ∞, in which case the roots must lie inside the unit circle, that is, |ωi| < 1. Thus there is a restriction on the values of φ1,...,φp that can be chosen.
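A small numerical sketch of these conditions: the AR(2) coefficients below are arbitrary illustrative values, the roots ωi are found from the characteristic equation, and the ρk are generated by the Yule-Walker recursion.

import numpy as np

# For an AR(p) with coefficients phi_1,...,phi_p, the roots omega_1,...,omega_p
# of  omega^p = phi_1 omega^{p-1} + ... + phi_p  must all satisfy |omega_i| < 1
# for the autocorrelations to die out.  (Coefficients below are arbitrary.)
phi = [0.5, 0.3]                       # hypothetical AR(2) coefficients
p = len(phi)

# Roots of omega^p - phi_1 omega^{p-1} - ... - phi_p = 0.
roots = np.roots([1.0] + [-c for c in phi])
print("roots:", roots, "stationary:", np.all(np.abs(roots) < 1))

# Autocorrelations from the Yule-Walker recursion rho_k = sum_r phi_r rho_{k-r},
# using rho_0 = 1 and rho_{-k} = rho_k.
rho = {0: 1.0}
rho[1] = phi[0] / (1.0 - phi[1])       # solves the k = 1 equation for an AR(2)
for k in range(2, 10):
    rho[k] = sum(phi[r - 1] * rho[abs(k - r)] for r in range(1, p + 1))
print([round(rho[k], 4) for k in range(10)])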
The moving average process of order q, denoted MA(q), is defined by
Xt = Σ_{s=0}^{q} θs ǫt−s   (1.4)
where θ1,...,θq are fixed constants, θ0 = 1, and {ǫt} is a sequence of independent (or uncorrelated) random variables with mean 0 and variance σ².
It is clear from the definition that this is second order stationary and that
γk = σ² Σ_{s=0}^{q−|k|} θs θs+|k|  for |k| ≤ q,  and  γk = 0 for |k| > q.
We remark that two moving average processes can have the same autocorrelation function. For example,
Xt = ǫt + θǫt−1  and  Xt = ǫt + (1/θ)ǫt−1
both have ρ1 = θ/(1 + θ²), ρk = 0, |k| > 1. However, the first gives
ǫt = Xt − θǫt−1 = Xt − θXt−1 + θ²Xt−2 − θ³Xt−3 + ··· .
This is only valid for |θ| < 1, a so-called invertible process. No two invertible processes have the same autocorrelation function.
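The sketch below illustrates this non-uniqueness numerically for an MA(1); the value of θ and the sample size are arbitrary choices. Both θ and 1/θ give (approximately) the same sample lag-1 autocorrelation, but only |θ| < 1 corresponds to the invertible representation.

import numpy as np

def ma1_acf1(theta, n=100_000, seed=2):
    """Sample lag-1 autocorrelation of the MA(1) process X_t = eps_t + theta * eps_{t-1}."""
    rng = np.random.default_rng(seed)
    eps = rng.normal(size=n + 1)
    x = eps[1:] + theta * eps[:-1]
    x = x - x.mean()
    return np.mean(x[1:] * x[:-1]) / np.mean(x * x)

theta = 0.5
# theta and 1/theta both give rho_1 close to theta / (1 + theta^2) = 0.4.
print(ma1_acf1(theta), ma1_acf1(1.0 / theta), theta / (1 + theta**2))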
However, a very simple diagnostic is the turning point test, which examines a series {Xt} to
test whether it is purely random. The idea is that if {Xt} is purely random then three successive
values are equally likely to occur in any of the six possible orders.
In four cases there is a turning point in the middle. Thus in a series of n points we might expect
(2/3)(n − 2) turning points.
In fact, it can be shown that for large n, the number of turning points should be distributed as
about N(2n/3,8n/45). We reject (at the 5% level) the hypothesis that the series is unsystematic if the
number of turning points lies outside the range 2n/3 ± 1.96√(8n/45).
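A minimal implementation of the test, using the N(2n/3, 8n/45) approximation; the two series used to exercise it below are arbitrary examples, not data from the notes.

import numpy as np

def turning_point_test(x, z_crit=1.96):
    """Turning point test for pure randomness.

    Counts points that are a local maximum or minimum of three successive
    values and compares the count with the large-sample N(2n/3, 8n/45)
    approximation; rejects (two-sided, 5% level) if |z| > z_crit.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    mid, left, right = x[1:-1], x[:-2], x[2:]
    turning_points = np.sum((mid > left) & (mid > right) |
                            (mid < left) & (mid < right))
    mean, var = 2 * n / 3, 8 * n / 45
    z = (turning_points - mean) / np.sqrt(var)
    return turning_points, z, abs(z) > z_crit

rng = np.random.default_rng(3)
print(turning_point_test(rng.normal(size=500)))       # purely random: usually not rejected
print(turning_point_test(np.sin(np.arange(500) / 5))) # smooth series: far too few turning points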