
PSTAT 174/274
LECTURE NOTES 2
THE AUTOCOVARIANCE AND THE AUTOCORRELATION FUNCTIONS

• For a stationary process {Y_t}, the autocovariance between Y_t and Y_{t-k} is

\gamma_k = \mathrm{Cov}(Y_t, Y_{t-k}) = E[(Y_t - \mu)(Y_{t-k} - \mu)]

and the autocorrelation function (ACF) is

\rho_k = \mathrm{Corr}(Y_t, Y_{t-k}) = \frac{\gamma_k}{\gamma_0}.
PROPERTIES:
1. \gamma_0 = \mathrm{Var}(Y_t); \rho_0 = 1.
2. |\gamma_k| \le \gamma_0; |\rho_k| \le 1.
3. \gamma_k = \gamma_{-k} and \rho_k = \rho_{-k} for all k.
4. (necessary condition) \gamma_k and \rho_k are positive semi-definite:

\sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j \gamma_{|t_i - t_j|} \ge 0
\sum_{i=1}^{n} \sum_{j=1}^{n} \alpha_i \alpha_j \rho_{|t_i - t_j|} \ge 0

for any set of time points t_1, t_2, \ldots, t_n and any real numbers \alpha_1, \alpha_2, \ldots, \alpha_n.
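As a worked illustration of these definitions (not on the slides), take Y_t = a_t + \theta a_{t-1}, where {a_t} is zero-mean white noise with variance \sigma_a^2 (white noise is defined formally below). Direct computation from the definition gives

\gamma_0 = (1 + \theta^2)\sigma_a^2, \quad \gamma_1 = \theta \sigma_a^2, \quad \gamma_k = 0 \text{ for } |k| \ge 2,

so that

\rho_1 = \frac{\theta}{1 + \theta^2}, \quad \rho_k = 0 \text{ for } |k| \ge 2,

which also exhibits the symmetry \gamma_k = \gamma_{-k} and the bound |\rho_k| \le 1.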
THE PARTIAL AUTOCORRELATION FUNCTION (PACF)

• The PACF is the correlation between Y_t and Y_{t-k} after their mutual linear dependency on the intervening variables Y_{t-1}, Y_{t-2}, \ldots, Y_{t-k+1} has been removed.
• The conditional correlation

\phi_{kk} = \mathrm{Corr}(Y_t, Y_{t-k} \mid Y_{t-1}, Y_{t-2}, \ldots, Y_{t-k+1})

is usually referred to as the partial autocorrelation in time series.

e.g., \phi_{11} = \mathrm{Corr}(Y_t, Y_{t-1}) = \rho_1
\phi_{22} = \mathrm{Corr}(Y_t, Y_{t-2} \mid Y_{t-1})
WHITE NOISE (WN) PROCESS

• A process {a_t} is called a white noise (WN) process if it is a sequence of uncorrelated random variables from a fixed distribution with constant mean E(a_t) = \mu, constant variance \mathrm{Var}(a_t) = \sigma_a^2, and \mathrm{Cov}(a_t, a_{t-k}) = 0 for all k \neq 0.

The simplest stationary process is pure white noise:

Y_t = a_t
• It is a stationary process with autocovariance function

\gamma_k = \begin{cases} \sigma_a^2, & k = 0 \\ 0, & k \neq 0 \end{cases}

ACF:
\rho_k = \begin{cases} 1, & k = 0 \\ 0, & k \neq 0 \end{cases}

PACF:
\phi_{kk} = \begin{cases} 1, & k = 0 \\ 0, & k \neq 0 \end{cases}

Basic phenomenon: ACF = PACF = 0 for all k \neq 0.
• White noise (the name comes from spectral analysis): by analogy with white light, all frequencies (i.e., colors) are present in equal amounts.
• It is a memoryless process.
• It is the building block from which we can construct more complicated models.
• It plays a role analogous to that of an orthogonal basis in general vector and function analysis.
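A quick numerical illustration (a minimal NumPy sketch; the seed, sample size, and unit variance are arbitrary choices): simulate Gaussian white noise and check that the sample ACF is near zero at every non-zero lag.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
a = rng.normal(loc=0.0, scale=1.0, size=n)   # Gaussian WN: mu = 0, sigma_a = 1

dev = a - a.mean()
acf = np.array([np.sum(dev[: n - k] * dev[k:]) for k in range(11)])
acf = acf / acf[0]                            # rho_hat_0 = 1 by construction

print(acf.round(3))     # lags 1..10 hover near 0
print(2 / np.sqrt(n))   # ~0.089; WN sample ACF should stay inside +/- this band
```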
ESTIMATION OF THE MEAN, AUTOCOVARIANCE AND AUTOCORRELATION

• THE SAMPLE MEAN:

\bar{Y} = \frac{1}{n} \sum_{t=1}^{n} Y_t

with

E(\bar{Y}) = \mu \quad \text{and} \quad \mathrm{Var}(\bar{Y}) = \frac{\gamma_0}{n} \sum_{k=-(n-1)}^{n-1} \left(1 - \frac{|k|}{n}\right) \rho_k.

Because \mathrm{Var}(\bar{Y}) \to 0 as n \to \infty, \bar{Y} is a consistent estimator (CE) for \mu:

\lim_{n \to \infty} \bar{Y} = \mu \quad \text{in mean square.}

If this holds, the process is ergodic for the mean.
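A small simulation sketch (the AR(1) form with \phi = 0.7 and \mu = 2 is an arbitrary illustrative choice, as is the burn-in length) showing the time average settling toward \mu as n grows:

```python
import numpy as np

rng = np.random.default_rng(1)
phi, mu, sigma_a = 0.7, 2.0, 1.0   # assumed parameters, for illustration only

def simulate_ar1(n, burn=200):
    """Simulate Y_t - mu = phi*(Y_{t-1} - mu) + a_t, dropping a burn-in."""
    y = np.empty(n + burn)
    y[0] = mu
    a = rng.normal(0.0, sigma_a, size=n + burn)
    for t in range(1, n + burn):
        y[t] = mu + phi * (y[t - 1] - mu) + a[t]
    return y[burn:]   # burn-in discarded so the path is approximately stationary

for n in (100, 1000, 10000):
    print(n, round(simulate_ar1(n).mean(), 3))   # sample mean approaches mu = 2
```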
ERGODICITY

• Kolmogorov's law of large numbers (LLN) tells us that if Y_i \sim \text{i.i.d.}(\mu, \sigma^2) for i = 1, \ldots, n, then we have the following limit for the ensemble average:

\bar{Y}_n = \frac{1}{n} \sum_{i=1}^{n} Y_i \to \mu.

• In time series, we have a time series average, not an ensemble average. Hence, the mean is computed by averaging over time. Does the time series average converge to the same limit as the ensemble average? The answer is yes, if Y_t is stationary and ergodic.
• A covariance stationary process is said to be ergodic for the mean if the time series average converges to the population mean.
• Similarly, if the sample average provides a consistent estimate of the second moment, then the process is said to be ergodic for the second moment.
• A sufficient condition for a covariance stationary process to be ergodic for the mean is that

\sum_{k=0}^{\infty} |\gamma_k| < \infty.

Further, if the process is Gaussian, then absolutely summable autocovariances also ensure that the process is ergodic for all moments.
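For example (a worked check, using the standard AR(1) autocovariances \gamma_k = \sigma_a^2 \phi^{|k|} / (1 - \phi^2) for |\phi| < 1, which are not derived on these slides):

\sum_{k=0}^{\infty} |\gamma_k| = \frac{\sigma_a^2}{1 - \phi^2} \sum_{k=0}^{\infty} |\phi|^k = \frac{\sigma_a^2}{(1 - \phi^2)(1 - |\phi|)} < \infty,

so a stationary AR(1) process is ergodic for the mean.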
THE SAMPLE AUTOCOVARIANCE FUNCTION

\hat{\gamma}_k = \frac{1}{n} \sum_{t=1}^{n-k} (Y_t - \bar{Y})(Y_{t+k} - \bar{Y})

or

\hat{\gamma}_k = \frac{1}{n-k} \sum_{t=1}^{n-k} (Y_t - \bar{Y})(Y_{t+k} - \bar{Y})
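Both estimators in a short NumPy sketch; the aside that the 1/n version keeps the estimated autocovariance sequence positive semi-definite is a standard textbook remark, not from the slides:

```python
import numpy as np

def sample_autocov(y, k, divisor="n"):
    """gamma_hat_k = (1/n or 1/(n-k)) * sum_{t=1}^{n-k} (Y_t - Ybar)(Y_{t+k} - Ybar)."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    dev = y - y.mean()
    s = np.sum(dev[: n - k] * dev[k:])
    return s / n if divisor == "n" else s / (n - k)

y = np.random.default_rng(2).normal(size=200)
print([round(sample_autocov(y, k), 3) for k in range(4)])                 # 1/n
print([round(sample_autocov(y, k, divisor="n-k"), 3) for k in range(4)])  # 1/(n-k)
```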
THE SAMPLE AUTOCORRELATION FUNCTION

\hat{\rho}_k = r_k = \frac{\sum_{t=1}^{n-k} (Y_t - \bar{Y})(Y_{t+k} - \bar{Y})}{\sum_{t=1}^{n} (Y_t - \bar{Y})^2}, \quad k = 0, 1, 2, \ldots

• A plot of \hat{\rho}_k versus k is a sample correlogram.
• For large sample sizes, \hat{\rho}_k is normally distributed with mean \rho_k and variance approximated by Bartlett's approximation for processes in which \rho_k = 0 for k > m.
\mathrm{Var}(\hat{\rho}_k) \approx \frac{1}{n} \left(1 + 2\rho_1^2 + 2\rho_2^2 + \cdots + 2\rho_m^2\right)

• In practice, the \rho_i's are unknown and are replaced by their sample estimates \hat{\rho}_i. Hence, we have the following large-lag standard error of \hat{\rho}_k:

s_{\hat{\rho}_k} = \sqrt{\frac{1}{n} \left(1 + 2\hat{\rho}_1^2 + 2\hat{\rho}_2^2 + \cdots + 2\hat{\rho}_m^2\right)}
• For a WN process, we have

s_{\hat{\rho}_k} = \frac{1}{\sqrt{n}}.

• The approximate 95% confidence interval for \rho_k is

\hat{\rho}_k \pm \frac{2}{\sqrt{n}}.

For a WN process, \hat{\rho}_k must be close to zero.

• Hence, to test whether the process is WN or not, draw \pm 2/\sqrt{n} lines on the sample correlogram. If all \hat{\rho}_k are inside the limits, the process could be WN (we need to check the sample PACF, too).
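A minimal NumPy sketch of the whole procedure (the data are simulated WN, and taking m = k - 1 in Bartlett's formula is an illustrative convention, not a prescription from the slides):

```python
import numpy as np

def sample_acf(y, max_lag):
    """Sample ACF: rho_hat_k with the 1/n autocovariance convention."""
    dev = np.asarray(y, dtype=float) - np.mean(y)
    denom = np.sum(dev ** 2)
    return np.array([np.sum(dev[: len(dev) - k] * dev[k:]) / denom
                     for k in range(max_lag + 1)])

def bartlett_se(rho_hat, n):
    """Large-lag SE for lags k = 1..max_lag with m = k - 1:
    s_k = sqrt((1 + 2*(rho_hat_1^2 + ... + rho_hat_{k-1}^2)) / n)."""
    sums = np.concatenate(([0.0], np.cumsum(rho_hat[1:-1] ** 2)))
    return np.sqrt((1.0 + 2.0 * sums) / n)

y = np.random.default_rng(3).normal(size=400)   # simulated WN data
rho = sample_acf(y, max_lag=10)
print(bartlett_se(rho, len(y)).round(3))        # close to 1/sqrt(400) = 0.05
print(np.abs(rho[1:]) < 2 / np.sqrt(len(y)))    # all True suggests WN
```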
BACKSHIFT (OR LAG) OPERATORS

• The backshift operator B is defined as

B^j Y_t = Y_{t-j}, \quad j \ge 0, \quad \text{with } B^0 = 1.

B Y_t = Y_{t-1}
B^2 Y_t = Y_{t-2}
B^{12} Y_t = Y_{t-12}

e.g., Random Walk Process:

Y_t = Y_{t-1} + a_t
Y_t - Y_{t-1} = a_t
Y_t - B Y_t = a_t
(1 - B) Y_t = a_t
MOVING AVERAGE REPRESENTATION OF A TIME SERIES

• Also known as the Random Shock Form or Wold (1938) representation.
• Let {Y_t} be a time series. For a stationary process {Y_t}, we can write Y_t as a linear combination of a sequence of uncorrelated (WN) r.v.s.

A GENERAL LINEAR PROCESS:

Y_t = \mu + a_t + \psi_1 a_{t-1} + \psi_2 a_{t-2} + \cdots = \mu + \sum_{j=0}^{\infty} \psi_j a_{t-j}

where \psi_0 = 1, {a_t} is a zero-mean WN process, and \sum_{j=0}^{\infty} \psi_j^2 < \infty.
In backshift notation:

Y_t = \mu + a_t + \psi_1 B a_t + \psi_2 B^2 a_t + \cdots = \mu + \sum_{j=0}^{\infty} \psi_j B^j a_t
    = \mu + \left(1 + \psi_1 B + \psi_2 B^2 + \cdots\right) a_t
    = \mu + \psi(B) a_t \quad \text{where } \psi(B) = 1 + \psi_1 B + \psi_2 B^2 + \cdots = \sum_{j=0}^{\infty} \psi_j B^j.
EXAMPLE

Y_t = \phi Y_{t-1} + a_t \quad \text{where } |\phi| < 1 \text{ and } a_t \sim \text{iid}(0, \sigma_a^2).

a) Write the above equation in random shock form.
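A sketch of the solution by repeated substitution (the slide leaves this as an exercise):

Y_t = \phi Y_{t-1} + a_t = \phi(\phi Y_{t-2} + a_{t-1}) + a_t = \phi^2 Y_{t-2} + \phi a_{t-1} + a_t = \cdots
    = a_t + \phi a_{t-1} + \phi^2 a_{t-2} + \cdots = \sum_{j=0}^{\infty} \phi^j a_{t-j},

since |\phi| < 1 makes \phi^k Y_{t-k} vanish (in mean square) as k \to \infty. Hence \psi_j = \phi^j, and \sum_{j=0}^{\infty} \psi_j^2 = 1/(1 - \phi^2) < \infty, as the random shock form requires.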
AUTOREGRESSIVE REPRESENTATION OF A TIME SERIES

• This representation is also known as the INVERTED FORM.
• Regress the value of Y_t at time t on its own past plus a random shock:

Y_t - \mu = \pi_1 (Y_{t-1} - \mu) + \pi_2 (Y_{t-2} - \mu) + \cdots + a_t

\left(1 - \pi_1 B - \pi_2 B^2 - \cdots\right)(Y_t - \mu) = a_t

\pi(B)(Y_t - \mu) = a_t \quad \text{where } \pi(B) = 1 - \sum_{j=1}^{\infty} \pi_j B^j \text{ and } 1 + \sum_{j=1}^{\infty} |\pi_j| < \infty.
INVERTIBILITY OF A RANDOM SHOCK FORM

• For a random shock form

Y_t = \mu + \psi(B) a_t

to be invertible, the roots of \psi(B) = 0, as a function of B, must lie outside the unit circle.
• If \beta is a root of \psi(B), then |\beta| > 1. For a real root, |\beta| is the absolute value of \beta; for a complex root \beta = c + id, |\beta| = \sqrt{c^2 + d^2}.
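For instance (a worked MA(1) illustration, using the sign convention of these notes):
for Y_t = \mu + (1 + \theta B) a_t, the root of 1 + \theta B = 0 is B = -1/\theta, and |-1/\theta| > 1 exactly when |\theta| < 1; so the MA(1) is invertible if and only if |\theta| < 1.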
STATIONARITY OF AN INVERTIBLE FORM

• An invertible form can be stationary if the process can be re-written in a RSF, i.e.,

\pi(B)(Y_t - \mu) = a_t

Y_t - \mu = \frac{1}{\pi(B)} a_t = \psi(B) a_t

where \psi(B)\pi(B) = 1 and \sum_{j=0}^{\infty} \psi_j^2 < \infty.
RANDOM SHOCK FORM AND INVERTED FORM

• The AR and MA representations are not the model forms themselves, because they contain an infinite number of parameters that are impossible to estimate from a finite number of observations.
TIME SERIES MODELS

• In the inverted form of a process, if only a finite number of \pi weights are non-zero, i.e.,

\pi_1 = \phi_1, \; \pi_2 = \phi_2, \; \ldots, \; \pi_p = \phi_p \quad \text{and} \quad \pi_k = 0, \; k > p,

the process is called an AR(p) process.
• In the random shock form of a process, if only a finite number of \psi weights are non-zero, i.e.,

\psi_1 = \theta_1, \; \psi_2 = \theta_2, \; \ldots, \; \psi_q = \theta_q \quad \text{and} \quad \psi_k = 0, \; k > q,

the process is called an MA(q) process.
• AR(p) Process:

Y_t - \mu = \phi_1 (Y_{t-1} - \mu) + \cdots + \phi_p (Y_{t-p} - \mu) + a_t

Y_t = c + \phi_1 Y_{t-1} + \cdots + \phi_p Y_{t-p} + a_t \quad \text{where } \mu = \frac{c}{1 - \phi_1 - \cdots - \phi_p}.

• MA(q) Process:

Y_t = \mu + a_t + \theta_1 a_{t-1} + \cdots + \theta_q a_{t-q}
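A minimal NumPy sketch (the parameter values \phi = 0.6, \theta = 0.4, \mu = 0 are arbitrary choices for illustration) simulating one AR(1) and one MA(1) path in the notation above:

```python
import numpy as np

rng = np.random.default_rng(5)
n, mu = 300, 0.0
a = rng.normal(size=n + 1)                    # the underlying WN shocks

# AR(1): Y_t = c + phi*Y_{t-1} + a_t, with c = mu*(1 - phi)
phi = 0.6
y_ar = np.zeros(n)
for t in range(1, n):
    y_ar[t] = mu * (1 - phi) + phi * y_ar[t - 1] + a[t]

# MA(1): Y_t = mu + a_t + theta*a_{t-1}
theta = 0.4
y_ma = mu + a[1:] + theta * a[:-1]

print(y_ar[:5].round(3), y_ma[:5].round(3))
```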
• For a fixed number of observations, the more parameters a model contains, the less efficient the estimation of the parameters becomes. Choose a simpler model to describe the phenomenon.
