STA457 Week 4 Notes

This lecture covers time series analysis focusing on Autoregressive (AR), Moving Average (MA), and Autoregressive Moving Average (ARMA) models. Key topics include definitions, properties, and causality of these models, with specific emphasis on the AR(p) and MA(q) models. The session also discusses stationarity, explosiveness, and the backshift operator in relation to these time series models.


STA457: Time Series Analysis

Lecture 6

Lijia Wang

Department of Statistical Sciences


University of Toronto



Overview

Last Time:
1 Definitions of stationarity
2 Estimation of correlation
3 Large sample properties of sample statistics
Today:
1 Autoregressive (AR) process
2 Moving average (MA) process
3 Autoregressive moving average (ARMA)



Outline

1 Autoregressive (AR) process


AR(p) Model
Backshift Operator
Explosiveness and causality

2 Moving Average Models


MA(q) model
Invertibility and causality

3 Autoregressive Moving Average Models


ARMA(p,q)
Causality of an ARMA(p,q) process



Contents

Over the next few weeks, we will learn the following models:

1 Autoregressive (AR)
2 Moving average (MA)
3 Autoregressive moving average (ARMA)
4 Autoregressive integrated moving average (ARIMA)



Introduction to Autoregressive Models

Autoregressive models are based on the idea that the current value of the series, $x_t$, can be explained as a function of p past values, $x_{t-1}, x_{t-2}, \ldots, x_{t-p}$, where p determines the number of steps into the past needed to forecast the current value.

For example,
$$x_t = x_{t-1} - 0.9\, x_{t-2} + w_t,$$
where $w_t \sim N(0, 1)$. This is an autoregressive model of order 2.



AR(p) Model

Definition: An autoregressive model of order p, abbreviated AR(p), is of the form
$$x_t = \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + w_t,$$
where $x_t$ is stationary, $w_t \sim \mathrm{wn}(0, \sigma_w^2)$, and $\phi_1, \phi_2, \ldots, \phi_p$ are constants ($\phi_p \neq 0$). We have $E(x_t) = 0$.



AR(p) Model with nonzero mean

Definition: If the mean, $\mu$, of $x_t$ is not zero, replace $x_t$ by $x_t - \mu$. We can get
$$x_t - \mu = \phi_1 (x_{t-1} - \mu) + \phi_2 (x_{t-2} - \mu) + \cdots + \phi_p (x_{t-p} - \mu) + w_t,$$
or
$$x_t = \alpha + \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + w_t,$$
where
$$\alpha = \mu \left(1 - \phi_1 - \phi_2 - \cdots - \phi_p\right).$$



Backshift Operator

We define the backshift operator by
$$B x_t = x_{t-1}$$
and extend it to powers,
$$B^2 x_t = B(B x_t) = B x_{t-1} = x_{t-2},$$
and so on. Thus,
$$B^k x_t = x_{t-k}.$$



AR(p) Model: Backshift Operator

Using the backshift operator, we can write the AR(p) model as
$$\left(1 - \phi_1 B - \phi_2 B^2 - \cdots - \phi_p B^p\right) x_t = w_t
\quad\Longleftrightarrow\quad \phi(B)\, x_t = w_t,$$
where
$$\phi(B) = 1 - \phi_1 B - \phi_2 B^2 - \cdots - \phi_p B^p.$$



Example 1: AR(1)

Example 1: Let $x_t = \phi x_{t-1} + w_t$ be an AR(1) process, where $|\phi| < 1$. Show that (a derivation sketch follows this list):

1 $x_t = \sum_{j=0}^{\infty} \phi^j w_{t-j}$. That is, $x_t$ is a linear process.
2 the autocovariance function is
$$\gamma(h) = \frac{\sigma_w^2\, \phi^h}{1 - \phi^2}, \qquad h \ge 0;$$
3 the autocorrelation function is
$$\rho(h) = \phi^h, \qquad h \ge 0;$$
4 $x_t$ is stationary.
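A sketch of the standard backward-substitution argument for parts 1 and 2 (not spelled out on the slide):
$$x_t = \phi x_{t-1} + w_t = \phi(\phi x_{t-2} + w_{t-1}) + w_t = \phi^2 x_{t-2} + \phi w_{t-1} + w_t = \cdots = \phi^k x_{t-k} + \sum_{j=0}^{k-1} \phi^j w_{t-j}.$$
Since $|\phi| < 1$, the term $\phi^k x_{t-k}$ vanishes in mean square as $k \to \infty$, leaving $x_t = \sum_{j=0}^{\infty} \phi^j w_{t-j}$, a linear process with $\psi_j = \phi^j$. Because the $w_t$ are uncorrelated with variance $\sigma_w^2$, for $h \ge 0$,
$$\gamma(h) = \mathrm{cov}(x_{t+h}, x_t) = \sigma_w^2 \sum_{j=0}^{\infty} \phi^{j+h}\, \phi^j = \frac{\sigma_w^2\, \phi^h}{1 - \phi^2}, \qquad \rho(h) = \frac{\gamma(h)}{\gamma(0)} = \phi^h.$$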



More on the AR(1) Example

For any AR(1) process $x_t = \phi x_{t-1} + w_t$ with $|\phi| < 1$, one can show that
$$x_t = \sum_{j=0}^{\infty} \phi^j w_{t-j} = \psi(B)\, w_t.$$

1 This is called the stationary solution of the AR(1) model.
2 It expresses $x_t$ as a convergent infinite sum of shocks (the $w_i$'s), more specifically past shocks.
3 It also expresses $x_t$ as a linear process.



Linear Process

Definition: A linear process, $x_t$, is defined to be a linear combination of white noise variates $w_t$, and is given by
$$x_t = \mu + \sum_{j=-\infty}^{\infty} \psi_j w_{t-j}, \qquad \sum_{j=-\infty}^{\infty} |\psi_j| < \infty.$$

For the linear process, we may show that the autocovariance function is given by
$$\gamma_x(h) = \sigma_w^2 \sum_{j=-\infty}^{\infty} \psi_{j+h}\, \psi_j,$$
for $h \ge 0$.
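A one-line justification of this formula, assuming the series converges in mean square:
$$\gamma_x(h) = \mathrm{cov}(x_{t+h}, x_t) = \mathrm{cov}\Big(\sum_{j=-\infty}^{\infty} \psi_j w_{t+h-j},\ \sum_{k=-\infty}^{\infty} \psi_k w_{t-k}\Big) = \sigma_w^2 \sum_{k=-\infty}^{\infty} \psi_{k+h}\, \psi_k,$$
since only the terms with $t + h - j = t - k$, i.e. $j = k + h$, have nonzero covariance.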



Stationarity

The stationary solution of a time series is the steady-state (equilibrium) of


the original process {xt } that:
is consistent with the original stochastic equation governing the
process;
exhibits time-invariant statistical properties (mean, variance, and
autocovariance);
is independent of initial conditions (e.g., x0 ).

An intuitive understanding of “stationarity”: After a sufficiently long time,


the influence of the initial state of the series becomes negligible, and the
series reaches a “stable” status, where the statistical properties of the
process (like mean, variance, and autocovariance) do not change over time.



Example 2: AR simulation

Example 2: Generate an AR(1) process for $\phi = 0.9$ and for $\phi = -0.9$.

1 Plot the generated observations.
2 Draw ACF and partial ACF graphs for the generated series (a simulation sketch in code follows this list).
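The slides do not show the simulation code; below is a minimal sketch in Python (the course itself may use R), assuming numpy, matplotlib, and statsmodels are available:

import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(457)  # arbitrary seed, for reproducibility only

def simulate_ar1(phi, n=200, burn=100):
    """Simulate x_t = phi * x_{t-1} + w_t with w_t ~ N(0, 1)."""
    w = rng.standard_normal(n + burn)
    x = np.zeros(n + burn)
    for t in range(1, n + burn):
        x[t] = phi * x[t - 1] + w[t]
    return x[burn:]  # drop the burn-in so the start-up transient is negligible

for phi in (0.9, -0.9):
    x = simulate_ar1(phi)
    fig, axes = plt.subplots(1, 3, figsize=(12, 3))
    axes[0].plot(x)
    axes[0].set_title(f"AR(1), phi = {phi}")
    plot_acf(x, ax=axes[1], lags=30)   # ACF of an AR(1) decays geometrically
    plot_pacf(x, ax=axes[2], lags=30)  # PACF of an AR(1) cuts off after lag 1
    plt.tight_layout()
plt.show()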



[Example 2 figures: time plots of the simulated AR(1) series and their sample ACF/PACF]


Explosive AR Models and Causality

Consider the random walk model $x_t = x_{t-1} + w_t$, where $w_t \sim \mathrm{wn}(0, \sigma_w^2)$.
1 Show that the autocovariance function is
$$\gamma_x(s, t) = \min\{s, t\}\,\sigma_w^2$$
(a short derivation follows this list).
2 $x_t$ is not stationary.
3 Consider the AR(1) process $x_t = \phi x_{t-1} + w_t$ with $|\phi| > 1$. Such processes are called explosive because the values of the time series quickly become large in magnitude.
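A short derivation of parts 1 and 2, assuming the walk starts at $x_0 = 0$: since $x_t = \sum_{j=1}^{t} w_j$, for $s \le t$,
$$\gamma_x(s, t) = \mathrm{cov}(x_s, x_t) = \mathrm{cov}\Big(\sum_{j=1}^{s} w_j,\ \sum_{k=1}^{t} w_k\Big) = \sum_{j=1}^{s} \mathrm{var}(w_j) = s\,\sigma_w^2 = \min\{s, t\}\,\sigma_w^2.$$
Because $\mathrm{var}(x_t) = t\,\sigma_w^2$ grows with $t$, and the autocovariance depends on $s$ and $t$ through $\min\{s,t\}$ rather than only on $|s - t|$, the random walk is not stationary.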



AR Models: Explosive and Causality

For the explosive AR(1) process $x_t = \phi x_{t-1} + w_t$ with $|\phi| > 1$, we may rewrite it as
$$x_t = \frac{1}{\phi}\, x_{t+1} - \frac{1}{\phi}\, w_{t+1}.$$
Because $|\phi|^{-1} < 1$, this result suggests the stationary, future-dependent AR(1) model
$$x_t = -\sum_{j=1}^{\infty} \phi^{-j}\, w_{t+j}.$$

When a process does not depend on the future, such as the AR(1) with $|\phi| < 1$, we say the process is causal.
In the explosive case of this example, the process is stationary, but it is also future dependent and therefore not causal.



Causality

Definition: A time series $x_t$ is causal if it can be written as a convergent infinite series of past shocks (or innovations) in the following form:
$$x_t = \sum_{j=0}^{\infty} \psi_j w_{t-j} = \psi(B)\, w_t,$$
where $\sum_{j=0}^{\infty} |\psi_j| < \infty$.

An intuitive understanding of “causality”: the steady state of the series can be reached from past values (shocks).



Summary on stationarity and causality of AR(1) model

For an AR(1) series $x_t = \phi x_{t-1} + w_t$, we have:

If $|\phi| < 1$, the series is stationary and causal;
If $|\phi| = 1$, the series becomes a random walk and is not stationary;
If $|\phi| > 1$, the series is stationary but not causal.



Stationary Solution of AR(p)

The technique of iterating backward works well for AR(1), but not for larger p. A general technique is matching coefficients. Since we write the AR(p) model in the form
$$\phi(B)\, x_t = w_t,$$
and the stationary solution usually has the form
$$x_t = \sum_{j=0}^{\infty} \psi_j w_{t-j} = \psi(B)\, w_t,$$
it follows that
$$\phi(B)\,\psi(B)\, w_t = w_t,$$
and matching the coefficients of the powers of B yields the solution (worked out for AR(1) below).
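For the AR(1) case, $\phi(B) = 1 - \phi B$, and matching coefficients in $\phi(B)\psi(B) = 1$ recovers the weights found in Example 1:
$$(1 - \phi B)(\psi_0 + \psi_1 B + \psi_2 B^2 + \cdots) = 1.$$
Equating the coefficients of $B^0, B^1, B^2, \ldots$ on both sides gives
$$\psi_0 = 1, \qquad \psi_1 - \phi\psi_0 = 0, \qquad \psi_2 - \phi\psi_1 = 0, \qquad \ldots \qquad\Longrightarrow\qquad \psi_j = \phi^j.$$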



Stationary Solution of AR(p)

Another way to think about the operations:
$$x_t = \psi(B)\, w_t = \phi^{-1}(B)\, w_t.$$
Since, in the AR(1) case,
$$\psi(B) = \phi^{-1}(B) = 1 + \phi B + \phi^2 B^2 + \cdots + \phi^j B^j + \cdots,$$
treating the backshift operator as a complex number z and expanding the polynomial yields
$$\phi^{-1}(z) = \frac{1}{1 - \phi z} = 1 + \phi z + \phi^2 z^2 + \cdots + \phi^j z^j + \cdots, \qquad |z| \le 1.$$



Outline

1 Autoregressive (AR) process


AR(p) Model
Backshift Operator
Explosiveness and causality

2 Moving Average Models


MA(q) model
Invertibility and causality

3 Autoregressive Moving Average Models


ARMA(p,q)
Causality of an ARMA(p,q) process



Moving Average Models

Moving average models are based on the idea that the current value of the series, $x_t$, is a moving average of q past steps of white noise, $w_{t-1}, w_{t-2}, \ldots, w_{t-q}$, where q determines the number of steps into the past.

AR(p) model: the $\{x_t\}$ on the left-hand side of the defining equation are assumed to be combined linearly;
MA(q) model: the $\{w_t\}$ on the right-hand side of the defining equation are combined linearly.



MA(q) model

Definition: The moving average model of order q, or MA(q) model, is defined to be
$$x_t = w_t + \theta_1 w_{t-1} + \theta_2 w_{t-2} + \cdots + \theta_q w_{t-q},$$
where $w_t \sim \mathrm{wn}(0, \sigma_w^2)$, and $\theta_1, \theta_2, \ldots, \theta_q$ ($\theta_q \neq 0$) are parameters.



MA(q) Model: Backshift Operator

The MA(q) model can be written as
$$x_t = \theta(B)\, w_t,$$
where $\theta(B) = 1 + \theta_1 B + \theta_2 B^2 + \cdots + \theta_q B^q$ is called the moving average operator.



Example 3: MA(1)

Example 3: Consider the MA(1) model $x_t = w_t + \theta w_{t-1}$. Find (a worked sketch follows this list):

$E(x_t)$
the autocovariance function
the autocorrelation function
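A worked sketch of the answers (standard MA(1) calculations): since $E(w_t) = 0$, we have $E(x_t) = 0$. Because the $w_t$ are uncorrelated with variance $\sigma_w^2$,
$$\gamma(0) = \mathrm{var}(w_t + \theta w_{t-1}) = (1 + \theta^2)\,\sigma_w^2, \qquad \gamma(1) = \mathrm{cov}(w_{t+1} + \theta w_t,\ w_t + \theta w_{t-1}) = \theta\,\sigma_w^2,$$
and $\gamma(h) = 0$ for $h > 1$. Hence
$$\rho(1) = \frac{\theta}{1 + \theta^2}, \qquad \rho(h) = 0 \ \text{for } h > 1.$$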



Example 4: MA simulation

Generate an MA(1) process for $\theta = 0.5$ and for $\theta = -0.5$.

1 Plot the generated observations.
2 Draw ACF and partial ACF graphs for the generated series (a simulation sketch in code follows this list).
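As with Example 2, a minimal Python sketch (illustrative only; the course's own code is not shown in the notes):

import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

rng = np.random.default_rng(457)

def simulate_ma1(theta, n=200):
    """Simulate x_t = w_t + theta * w_{t-1} with w_t ~ N(0, 1)."""
    w = rng.standard_normal(n + 1)
    return w[1:] + theta * w[:-1]

for theta in (0.5, -0.5):
    x = simulate_ma1(theta)
    fig, axes = plt.subplots(1, 3, figsize=(12, 3))
    axes[0].plot(x)
    axes[0].set_title(f"MA(1), theta = {theta}")
    plot_acf(x, ax=axes[1], lags=30)   # ACF of an MA(1) cuts off after lag 1
    plot_pacf(x, ax=axes[2], lags=30)  # PACF of an MA(1) tails off
    plt.tight_layout()
plt.show()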



[Example 4 figures: time plots of the simulated MA(1) series and their sample ACF/PACF]


MA models: non-uniqueness

For the MA(1) model $x_t = w_t + \theta w_{t-1}$, notice that the following models share the same ACF:

$\theta = \tfrac{1}{5}$ and $\sigma_w = 5$, i.e.
$$x_t = w_t + \tfrac{1}{5} w_{t-1}, \qquad w_t \overset{\text{i.i.d.}}{\sim} N(0, 25);$$
$\theta = 5$ and $\sigma_v = 1$, i.e.
$$y_t = v_t + 5 v_{t-1}, \qquad v_t \overset{\text{i.i.d.}}{\sim} N(0, 1).$$

By mimicking the criterion of causality for AR models, we will choose the model with an infinite AR representation. Such a process is called an invertible process.
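A quick check that the two parameterizations are observationally equivalent, using the MA(1) autocovariances from Example 3 ($\gamma(0) = (1+\theta^2)\sigma_w^2$, $\gamma(1) = \theta\sigma_w^2$):
$$\theta = \tfrac{1}{5},\ \sigma_w^2 = 25:\quad \gamma(0) = 25\big(1 + \tfrac{1}{25}\big) = 26, \quad \gamma(1) = \tfrac{1}{5}\cdot 25 = 5;$$
$$\theta = 5,\ \sigma_v^2 = 1:\quad \gamma(0) = 1\cdot(1 + 25) = 26, \quad \gamma(1) = 5\cdot 1 = 5.$$
Both give $\rho(1) = 5/26$, so the two models cannot be told apart from their second-order properties.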



MA models: invertibility

To discover which model is the invertible model, we can reverse the roles of $x_t$ and $w_t$ (mimicking the AR case):
$$w_t = -\theta w_{t-1} + x_t.$$

If $|\theta| < 1$, this equation can be written as
$$w_t = \sum_{j=0}^{\infty} (-\theta)^j x_{t-j} = \pi(B)\, x_t,$$
which is the infinite AR representation of the model. Such a process is called an invertible process.



Polynomials of MA(q)

Similar to the AR(p) process, we write the MA(q) model in the form
$$x_t = \theta(B)\, w_t;$$
the equivalent (inverted) expression has the form
$$\pi(B)\, x_t = w_t,$$
and we may treat the backshift operator as a complex number z and solve for the polynomial $\pi(z) = \theta^{-1}(z)$.



Outline

1 Autoregressive (AR) process


AR(p) Model
Backshift Operator
Explosiveness and causality

2 Moving Average Models


MA(q) model
Invertibility and causality

3 Autoregressive Moving Average Models


ARMA(p,q)
Causality of an ARMA(p,q) process



ARMA model

We now proceed with the general development of autoregressive, moving average, and mixed autoregressive moving average (ARMA) models for stationary time series.



ARMA(p,q)

Definition: A time series $\{x_t;\ t = 0, \pm 1, \pm 2, \ldots\}$ is ARMA(p, q) if it is stationary and
$$x_t = \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + w_t + \theta_1 w_{t-1} + \theta_2 w_{t-2} + \cdots + \theta_q w_{t-q},$$
with $\phi_p \neq 0$, $\theta_q \neq 0$, and $\sigma_w^2 > 0$. The parameters p and q are called the autoregressive and the moving average orders, respectively.

The ARMA(p, q) model can then be written in concise form as
$$\phi(B)\, x_t = \theta(B)\, w_t.$$



ARMA(p,q) nonzero mean

If $x_t$ has a nonzero mean $\mu$, we set $\alpha = \mu \left(1 - \phi_1 - \phi_2 - \cdots - \phi_p\right)$ and write the model as
$$x_t = \alpha + \phi_1 x_{t-1} + \phi_2 x_{t-2} + \cdots + \phi_p x_{t-p} + w_t + \theta_1 w_{t-1} + \theta_2 w_{t-2} + \cdots + \theta_q w_{t-q},$$
where $w_t \sim \mathrm{wn}(0, \sigma_w^2)$.



Parameter Redundancy

Consider a white noise process $x_t = w_t$. If we multiply both sides of the equation by $\eta(B) = (1 - 0.5B)$, then the model becomes
$$(1 - 0.5B)\, x_t = (1 - 0.5B)\, w_t,$$
or
$$x_t = 0.5\, x_{t-1} - 0.5\, w_{t-1} + w_t.$$

This model looks like an ARMA(1,1) model.

In fact, $x_t$ is still white noise; the apparent ARMA structure is due to parameter redundancy, or over-parameterization.



Problems encountered

To summarize, we have seen the following problems:

(i) parameter redundant models,


(ii) stationary AR models that depend on the future, and
(iii) MA models that are not unique.



AR and MA Polynomials

Definitions: The AR and MA polynomials are defined as
$$\text{AR:}\quad \phi(z) = 1 - \phi_1 z - \phi_2 z^2 - \cdots - \phi_p z^p, \qquad \phi_p \neq 0,$$
and
$$\text{MA:}\quad \theta(z) = 1 + \theta_1 z + \theta_2 z^2 + \cdots + \theta_q z^q, \qquad \theta_q \neq 0,$$
respectively, where z is a complex number.



Problems encountered - Solutions

To summarize, we have seen the following problems:

(i) parameter redundant models,


Solution: we require that $\phi(z)$ and $\theta(z)$ have no common factors.
(ii) stationary AR models that depend on the future, and
(iii) MA models that are not unique.



Problems encountered - Solutions

To summarize, we have seen the following problems:

(i) parameter redundant models,


Solution: we require that $\phi(z)$ and $\theta(z)$ have no common factors.
(ii) stationary AR models that depend on the future, and
Solution: A formal definition of causality for ARMA models.
(iii) MA models that are not unique.



Definition for causal

Definition: An ARMA(p, q) model is said to be causal if the time series $\{x_t;\ t = 0, \pm 1, \pm 2, \ldots\}$ can be written as a one-sided linear process:
$$x_t = \sum_{j=0}^{\infty} \psi_j w_{t-j} = \psi(B)\, w_t,$$
where
$$\psi(B) = \sum_{j=0}^{\infty} \psi_j B^j, \qquad \sum_{j=0}^{\infty} |\psi_j| < \infty; \quad \text{we set } \psi_0 = 1.$$



Causality of ARMA(p,q) process

An ARMA(p, q) model is causal if and only if $\phi(z) \neq 0$ for $|z| \le 1$. The coefficients $\psi_j$ of the linear process can be determined by solving
$$\psi(z) = \sum_{j=0}^{\infty} \psi_j z^j = \frac{\theta(z)}{\phi(z)}, \qquad |z| \le 1.$$

Remark: An ARMA process is causal only when the roots of $\phi(z)$ lie outside the unit circle; that is, $\phi(z) = 0$ only when $|z| > 1$.
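A numerical way to apply this criterion, sketched in Python with numpy (an illustration, not part of the notes): represent $\phi(z)$ by its coefficients and check whether all of its roots lie outside the unit circle.

import numpy as np

def is_causal(phi):
    """Causality check via the AR polynomial phi(z) = 1 - phi_1 z - ... - phi_p z^p:
    the model is causal iff every root of phi(z) satisfies |z| > 1."""
    # np.roots expects coefficients ordered from the highest power down to the constant.
    coeffs = np.r_[-np.asarray(phi, dtype=float)[::-1], 1.0]
    roots = np.roots(coeffs)
    return bool(np.all(np.abs(roots) > 1.0))

print(is_causal([0.5]))   # AR(1), phi = 0.5: root z = 2, outside the unit circle -> True
print(is_causal([1.5]))   # AR(1), phi = 1.5: root z = 2/3, explosive case -> False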



Problems encountered - Solutions

To summarize, we have seen the following problems:

(i) parameter redundant models,


Solution: we require that $\phi(z)$ and $\theta(z)$ have no common factors.
(ii) stationary AR models that depend on the future, and
Solution: A formal definition of causality for ARMA models.
(iii) MA models that are not unique. Solution: A formal definition of invertibility, which requires an infinite autoregressive representation.



Definition for invertible

Definition: An ARMA(p, q) model is said to be invertible if the time series $\{x_t;\ t = 0, \pm 1, \pm 2, \ldots\}$ can be written as
$$\pi(B)\, x_t = \sum_{j=0}^{\infty} \pi_j x_{t-j} = w_t,$$
where $\pi(B) = \sum_{j=0}^{\infty} \pi_j B^j$ and $\sum_{j=0}^{\infty} |\pi_j| < \infty$; we set $\pi_0 = 1$.



ARMA(p,q): Invertible

An ARMA(p, q) model is invertible if and only if $\theta(z) \neq 0$ for $|z| \le 1$. The coefficients $\pi_j$ of $\pi(B)$ can be determined by solving
$$\pi(z) = \sum_{j=0}^{\infty} \pi_j z^j = \frac{\phi(z)}{\theta(z)}, \qquad |z| \le 1.$$

Remark: An ARMA process is invertible only when the roots of $\theta(z)$ lie outside the unit circle; that is, $\theta(z) = 0$ only when $|z| > 1$.



Example: ARMA(p,q)

Example 5: Consider the following time series model:
$$x_t = 0.4\, x_{t-1} + 0.45\, x_{t-2} + w_t + w_{t-1} + 0.25\, w_{t-2}. \qquad (1)$$

1 Identify the above model as an ARMA(p, q) model (watch out for parameter redundancy; a numerical root-checking sketch follows this list).
2 Determine whether the model is causal and/or invertible.
3 If the model is causal, write the model as a linear process.
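One way to begin Example 5 numerically, using numpy to inspect the roots of the AR and MA polynomials from model (1) (a sketch only, not the intended pencil-and-paper solution):

import numpy as np

# From model (1): phi(z) = 1 - 0.4 z - 0.45 z^2 and theta(z) = 1 + z + 0.25 z^2,
# with coefficients listed from the highest power down to the constant term.
ar_roots = np.roots([-0.45, -0.4, 1.0])
ma_roots = np.roots([0.25, 1.0, 1.0])

print("phi(z) roots:  ", ar_roots)
print("theta(z) roots:", ma_roots)
# A root shared by phi(z) and theta(z) signals a common factor (parameter
# redundancy) to cancel before reading off the ARMA orders p and q.
# After cancellation, roots outside the unit circle (|z| > 1) on the AR side
# imply causality and on the MA side imply invertibility.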

