Stat 565: Some Basic Time Series Models

This document provides an overview of some basic time series models:
- White noise, which has mean 0 and constant variance over time.
- Random walk with drift, where the next value depends on the previous value plus drift plus noise.
- MA(1), where the current value depends on the previous noise plus the current noise.
- AR(1), where the current value depends on the previous value plus the current noise.
It discusses properties such as stationarity and the autocorrelation functions of these models. Higher-order AR and MA models and combined ARMA models are introduced.

Uploaded by

masudul9islam
Copyright
© © All Rights Reserved
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
56 views

Stat 565: Some Basic Time Series Models

This document provides an overview of some basic time series models including: - White noise, which has a mean of 0 and constant variance over time. - Random walk with drift, where the next value depends on the previous value plus drift plus noise. - MA(1) model where the current value depends on previous noise plus current noise. - AR(1) model where the current value depends on previous value plus current noise. It discusses properties like stationarity and autocorrelation functions of these models. Higher order AR and MA models and combined ARMA models are introduced.

Uploaded by

masudul9islam
Copyright
© © All Rights Reserved
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 34

Stat 565

Some Basic Time Series Models


Jan 19 2016

Charlotte Wickham stat565.cwick.co.nz


Weak Stationarity

A time series {xt} is weakly stationary if

its mean function doesn't depend on
time, and its autocovariance function
only depends on the distance between
the two time points:
μt = E[xt] = μ, constant in t
γ(s, t) = Cov(xs, xt) = γ(t - s)
Often rewritten as γ(h) = Cov(xt, xt+h)

xt assumed to have finite variance


Autocorrelation

For a stationary process the


autocorrelation is:
Cor(xt, xt+h) = ρ(h) = γ(h) / γ(0)
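The sample version of this quantity can be computed directly from data. A minimal sketch in Python with NumPy (not part of the original slides): estimate γ(h) with the usual 1/n divisor and divide by γ(0).

```python
import numpy as np

def sample_acf(x, max_lag):
    """Sample ACF: rho_hat(h) = gamma_hat(h) / gamma_hat(0)."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    xbar = x.mean()
    # gamma_hat(h) = (1/n) * sum over t of (x_t - xbar)(x_{t+h} - xbar)
    gamma = np.array([np.sum((x[:n - h] - xbar) * (x[h:] - xbar)) / n
                      for h in range(max_lag + 1)])
    return gamma / gamma[0]

rho = sample_acf(np.arange(10.0), 3)
# rho[0] is always 1 by construction
```

This mirrors what R's `acf()` reports for a series.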
Some basic models

White noise
Random walk with drift
Moving average of order 1 MA(1)
Autoregressive of order 1 AR(1)

What is the mean function?


What is the autocovariance function?
Is the process weakly stationary?
White noise
{ wt } is a white noise process if the wt are
uncorrelated, identically distributed
random variables with
E[wt] = 0 and Var[wt] = σ² for all t.
If the wt are Normally (Gaussian)
distributed, the series is known as
Gaussian white noise.
White noise

Simulated
σ² = 1
White noise
What is the mean function?
μt = E[wt] = 0
What is the autocovariance function?
γ(h) = σ², h = 0
     = 0, otherwise
Is white noise stationary?
Yes.
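These two answers are easy to check by simulation. A quick numerical sketch in Python/NumPy (an addition, not from the slides), using Gaussian white noise with σ² = 1:

```python
import numpy as np

rng = np.random.default_rng(565)
w = rng.normal(loc=0.0, scale=1.0, size=100_000)  # Gaussian white noise, sigma^2 = 1

# Mean function: E[w_t] = 0, so the sample mean should be near 0
m = w.mean()

# Lag-1 autocovariance: gamma(1) = 0 for white noise
wbar = w.mean()
gamma1 = np.mean((w[:-1] - wbar) * (w[1:] - wbar))

print(m, gamma1)  # both should be close to 0
```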
Random walk with drift
xt = δ + xt-1 + wt
where δ is the drift, a constant,
{wt} is a white noise process, and
x0 = 0.

Can rewrite as:

xt = δt + Σ (j = 1 to t) wj
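The equivalence of the recursive and summed forms can be verified numerically. A short sketch in Python/NumPy (not from the original slides); the seed and parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(1)
delta, n = 0.1, 500
w = rng.normal(size=n)

# Recursive form: x_t = delta + x_{t-1} + w_t, with x_0 = 0
x_rec = np.zeros(n)
x_rec[0] = delta + w[0]
for t in range(1, n):
    x_rec[t] = delta + x_rec[t - 1] + w[t]

# Closed form: x_t = delta * t + sum over j = 1..t of w_j
t = np.arange(1, n + 1)
x_sum = delta * t + np.cumsum(w)

ok = np.allclose(x_rec, x_sum)  # the two forms agree
```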
Random walk (drift = 0)

Simulated
Random walk (drift = 0.1)

Simulated
Your turn
Random walk with drift: xt = δt + Σ (j = 1 to t) wj

What is the mean function?

μt = E[xt] = ?
What is the autocovariance function?
γ(t, t+h) = Cov(xt, xt+h) = ?

Is the random walk model stationary?


Moving average MA(1)

xt = θ1wt-1 + wt
where {wt} is a white noise process.

We'll see higher order MA processes later...


MA(1), θ1 = 1

Simulated
Your turn
MA(1): xt = θ1wt-1 + wt
What is the mean function?

What is the autocovariance function?

Is MA(1) stationary?
MA(1), θ1 = 1

ACF for simulated data
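The theoretical MA(1) autocorrelation, ρ(1) = θ1/(1 + θ1²) with ρ(h) = 0 for h ≥ 2, can be checked against simulated data. A sketch in Python/NumPy (an addition, not from the slides), with θ1 = 1 so that ρ(1) = 0.5:

```python
import numpy as np

rng = np.random.default_rng(42)
theta1, n = 1.0, 200_000
w = rng.normal(size=n + 1)
x = theta1 * w[:-1] + w[1:]   # x_t = theta1 * w_{t-1} + w_t

def rho_hat(x, h):
    """Sample autocorrelation at lag h."""
    xbar = x.mean()
    return np.mean((x[:len(x) - h] - xbar) * (x[h:] - xbar)) / np.var(x)

# Theory: rho(1) = theta1 / (1 + theta1^2) = 0.5 for theta1 = 1; rho(2) = 0
r1, r2 = rho_hat(x, 1), rho_hat(x, 2)
print(r1, r2)
```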


Autoregressive AR(1)

xt = φ1xt-1 + wt
where {wt} is a white noise process.

We'll see higher order AR processes later...


AR(1), φ1 = 0.9

Simulated
AR(1), φ1 = 0.5

Simulated
AR(1)

What is the mean function?

What is the autocovariance function?

Is AR(1) stationary?
AR(1), φ1 = 0.9

ACF for simulated data
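The geometric decay ρ(h) = φ1^h of the AR(1) ACF can likewise be checked by simulation. A sketch in Python/NumPy (an addition, not from the slides); the burn-in length is an arbitrary choice to let the series forget its x0 = 0 start:

```python
import numpy as np

rng = np.random.default_rng(7)
phi1, n = 0.9, 200_000
w = rng.normal(size=n)

# Simulate x_t = phi1 * x_{t-1} + w_t recursively
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi1 * x[t - 1] + w[t]
x = x[1000:]  # drop burn-in so the series is close to stationarity

xbar = x.mean()
def rho_hat(h):
    """Sample autocorrelation at lag h."""
    return np.mean((x[:len(x) - h] - xbar) * (x[h:] - xbar)) / np.var(x)

# Theory: rho(h) = phi1^h, a geometrically decaying ACF
for h in (1, 2, 5):
    print(h, rho_hat(h), phi1 ** h)
```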


Three stationary models

White noise:
  ρ(h) = 1, h = 0
       = 0, otherwise
  Only lag 0 shows non-zero ACF.

MA(1), any θ1:
  ρ(h) = 1, h = 0
       = θ1/(1 + θ1²), h = 1
       = 0, h ≥ 2
  Only lags 0 and 1 show non-zero ACF.

AR(1), |φ1| < 1:
  ρ(h) = 1, h = 0
       = φ1^h, h > 0
  Decreasing ACF.
Your turn
Which models might these simulated data
come from? (Five simulated series, panels 1-5.)
A General Linear Process
A linear process xt is defined to be a linear
combination of white noise variates, Zt,

xt = Σ (i = 0 to ∞) ψi Zt-i

with

Σ (i = 0 to ∞) |ψi| < ∞

This is enough to
ensure stationarity.
Autocovariance

One can show that the autocovariance

of a linear process is

γ(h) = σ² Σ (i = 0 to ∞) ψi+h ψi
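For AR(1), the linear-process coefficients are ψi = φ1^i, and this formula can be checked numerically against the closed-form AR(1) autocovariance γ(h) = σ² φ1^h / (1 - φ1²). A sketch in Python/NumPy (an addition, not from the slides), truncating the infinite sum at 200 terms, which is negligible error for φ1 = 0.5:

```python
import numpy as np

sigma2, phi1 = 1.0, 0.5
psi = phi1 ** np.arange(200)   # AR(1) as a linear process: psi_i = phi1^i

def gamma_linear(h):
    # gamma(h) = sigma^2 * sum over i of psi_{i+h} * psi_i (truncated)
    return sigma2 * np.sum(psi[h:] * psi[:len(psi) - h])

# Closed form for AR(1): gamma(h) = sigma^2 * phi1^h / (1 - phi1^2)
for h in range(4):
    closed = sigma2 * phi1 ** h / (1 - phi1 ** 2)
    print(h, gamma_linear(h), closed)
```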
Your turn
Write the MA(1) and AR(1) processes in
the form of linear processes.
I.e. what are the ψi?

xt = Σ (i = 0 to ∞) ψi Zt-i

Verify the autocovariance functions for
MA(1) and AR(1):

γ(h) = σ² Σ (i = 0 to ∞) ψi+h ψi

MA(1) we did
AR(1) you do
Backshift Operator
The backshift operator, B, is defined as
Bxt = xt-1
It can be extended to powers in the
obvious way:
B²xt = (BB)xt = B(Bxt) = Bxt-1 = xt-2
So, B^k xt = xt-k
MA(1): xt = θ1Zt-1 + Zt
AR(1): xt = φ1xt-1 + Zt
Your turn

Write the MA(1) and AR(1) models using


the backshift operator.
Difference Operator
The difference operator, ∇, is defined as
∇^d xt = (1 - B)^d xt
(e.g. ∇xt = (1 - B)xt = xt - xt-1)

(1 - B)^d can be expanded in the usual way,
e.g. (1 - B)² = (1 - B)(1 - B) = 1 - 2B + B²

Some non-stationary series can be made


stationary by differencing, see HW#2.
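A concrete instance: differencing a random walk with drift gives ∇xt = xt - xt-1 = δ + wt, which is stationary. A sketch in Python/NumPy (an addition, not from the slides); the parameter values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)
delta, n = 0.1, 100_000
w = rng.normal(size=n)

# Random walk with drift (non-stationary): x_t = delta * t + sum of w_j
x = delta * np.arange(1, n + 1) + np.cumsum(w)

# First difference: (1 - B) x_t = x_t - x_{t-1} = delta + w_t, stationary
dx = np.diff(x)
print(dx.mean())   # should be near delta = 0.1
print(dx.var())    # should be near Var[w_t] = 1
```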
Roadmap
Extend AR(1) to AR(p) and MA(1) to MA(q)
Combine them to form ARMA(p, q)
processes
Discover a few hiccups, and resolve them.
Then find the ACF (and PACF) functions for
ARMA(p, q) processes.
Figure out how to fit an ARMA(p, q) process to
real data.
