Lecture 4

This document provides an overview of univariate time series modeling and forecasting using ARIMA models. It discusses key concepts such as stationarity, autoregressive (AR) processes, moving average (MA) processes, and autoregressive integrated moving average (ARIMA) processes. The document also describes the Box-Jenkins methodology, a four-step approach for identifying, estimating, diagnosing, and forecasting with ARIMA models.

UNIVARIATE TIME SERIES

MODELING AND FORECASTING


Introduction
• In this chapter we aim to exploit the
information a variable contains about itself.
• The analysis of a single time series is called
univariate time series analysis.
• The purpose of time series analysis is to
capture and examine the dynamics of the
data.
Introduction Cont.
• There are various aspects to time series
analysis.
• The common theme to all is to fully exploit the
dynamics or the temporal structure of the
data.
• This refers to extracting as much information
as possible from the past history of the series.
• The two principal types of time series analysis are:
• Time series forecasting
• Dynamic modelling
• Time series forecasting is concerned with building
efficient models which forecast well.
• This is done by exploiting the dynamic inter-relationship
which exists over time for any single variable.
• Dynamic modelling is concerned with understanding the
structure of the economy and testing hypotheses.
• The basic ‘workhorse’ of time series forecasting is
called the ARIMA model.
• This is the focus of this section.
ARIMA MODELS
• Box and Jenkins introduced ARIMA models,
the term deriving from:
• AR = autoregressive
• I = integrated
• MA = moving average
• These models will be discussed in full
Stationarity
• A key concept underlying time series processes is that of
stationarity
• A time series is covariance stationary when it has the
following three characteristics:
• Exhibits mean reversion, in that it fluctuates around a
constant long-run mean
• Has a finite variance that is time invariant (constant variance)
• Has a theoretical correlogram that diminishes as the lag
length increases (Cov(Yt, Yt+k) depends only on the lag k, not on t, for all k ≠ 0)
• These quantities will remain the same whether the observations
are from 1980–1990 or 1994–2013 (illustrated in the sketch below)
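
To make this concrete, the following minimal sketch (an illustration added here, not part of the lecture; the seed, sample size and coefficient are assumptions) simulates a stationary AR(1) series in Python and compares sample moments across two subsamples.

```python
# Illustrative sketch (assumed seed, length and coefficient): a covariance
# stationary AR(1) series has roughly the same mean and variance in any
# subsample, echoing the 1980-1990 vs 1994-2013 remark above.
import numpy as np

rng = np.random.default_rng(0)
n, phi = 500, 0.6                # |phi| < 1 keeps the process stationary
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + rng.normal()   # AR(1) driven by white noise

first, second = y[: n // 2], y[n // 2:]
print("means:    ", first.mean(), second.mean())   # both near 0
print("variances:", first.var(), second.var())     # both near 1 / (1 - phi**2)
```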
Stationarity Cont.
• Stationarity is important because if the series is
non-stationary, the standard results of classical
regression analysis are not valid.
• Regressions with non-stationary series may not
have a meaning (they may be spurious).
• Shocks to a stationary time series are temporary;
• over time the effects of the shock will dissipate
and the series will revert to its long-run mean
level.
Autoregressive Models
• A time series process is stationary if the
distribution of the variable does not depend
upon time.
• Assuming that a time series is stationary, there
are a number of ways in which it can be modeled.
• An autoregressive (AR) process. Here the value
of y at time t depends on its value in the previous
period and a random term, where the y values
are expressed as deviations from their mean
value.
An autoregressive (AR) process Cont.
• For example, a GDP series could be modeled as:

$Y_t = \alpha_1 Y_{t-1} + u_t$

• For simplicity we do not include a constant term.
• Here $|\alpha_1| < 1$ and $u_t$ is an uncorrelated random error term with zero mean and
constant variance $\sigma^2$ (white noise).
• The assumption behind the AR(1) model is that the time series behavior of $Y_t$ is
largely determined by its value in the preceding period.
• We say in this case that $Y_t$ follows a first-order autoregressive, or AR(1), stochastic
process.
• The forecast value of y at time t is simply some proportion of its value at time (t−1)
plus a random shock or disturbance at time t, with the y-values expressed as
deviations around their mean values.
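
As a hedged illustration of estimating an AR(1), the sketch below uses statsmodels' AutoReg on simulated data; the coefficient, seed and the choice of trend="n" (no constant, matching the deviations-from-mean form above) are assumptions for this example.

```python
# Sketch: estimate the AR(1) coefficient with statsmodels' AutoReg.
# trend="n" omits the constant, matching the deviations-from-mean form.
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(1)
n, alpha = 400, 0.7              # illustrative true coefficient
y = np.zeros(n)
for t in range(1, n):
    y[t] = alpha * y[t - 1] + rng.normal()

res = AutoReg(y, lags=1, trend="n").fit()
print(res.params)                # estimated alpha_1, close to 0.7
print(res.forecast(steps=1))     # one-step forecast: alpha_hat * y[-1]
```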
An autoregressive (AR) process Cont.
• An AR process can be generalized so that y depends on more
than one of its previous period values:
$Y_t = \alpha_1 Y_{t-1} + \alpha_2 Y_{t-2} + \cdots + \alpha_p Y_{t-p} + u_t$
• The current value of y can thus follow a second-order
autoregressive, or AR(2), process in which the value of y at time
t depends on its value in the previous two time periods.
• More generally, this can be extended to a pth-order
autoregressive, or AR(p), process.
• Note that only the current and previous values of y are used;
there are no other regressors. It can be said in this sense that
the ‘data speak for themselves’.
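
Choosing the order p is part of identification; as a sketch under assumed settings (simulated AR(2) data, AIC as the criterion, an arbitrary maximum lag of 10), statsmodels' ar_select_order can suggest p.

```python
# Sketch: pick the AR order p by information criterion on simulated AR(2)
# data; the coefficients and maxlag are assumptions for this example.
import numpy as np
from statsmodels.tsa.ar_model import ar_select_order

rng = np.random.default_rng(2)
n = 600
y = np.zeros(n)
for t in range(2, n):
    y[t] = 0.5 * y[t - 1] + 0.3 * y[t - 2] + rng.normal()   # AR(2)

sel = ar_select_order(y, maxlag=10, ic="aic", trend="n")
print(sel.ar_lags)               # typically [1, 2] for this process
```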
A moving average (MA) process.
• The AR process described above may not be
the only mechanism that may have generated
y.
• The series may follow a moving average, or
MA process where y at time t is given by a
constant plus a moving average of the current
and past error terms.
• A first-order moving average, or MA(1)
process, would be modeled as follows:
A moving average (MA) process Cont.

$Y_t = \mu + \beta_0 u_t + \beta_1 u_{t-1}$

• where μ is a constant and $u_t$ is, as before, the
white noise stochastic error term.
• An MA process may be generalized so that the
current value of y is given by a constant and a
moving average of more than one of the past
error terms:

$Y_t = \mu + \beta_0 u_t + \beta_1 u_{t-1} + \cdots + \beta_q u_{t-q}$
A moving average (MA) process Cont.
• The current value of y can thus follow a second-
order moving average, or MA(2) process in
which the value of y at time t depends on a
constant and a moving average of the two
previous error terms.
• More generally, this can be extended to a
qth‑order moving average, or MA(q) process.
• In short, an MA process is simply a linear
combination of white noise error terms.
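
To illustrate the ‘linear combination of white noise’ view, the sketch below (an added example with assumed coefficient values) builds an MA(1) series directly from white-noise errors and checks that its sample autocorrelation cuts off after lag 1.

```python
# Sketch: an MA(1) series is a constant plus a linear combination of the
# current and previous white-noise errors. Coefficient values are assumed.
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(3)
n = 1000
mu, b0, b1 = 2.0, 1.0, 0.6
u = rng.normal(size=n + 1)           # white-noise error terms
y = mu + b0 * u[1:] + b1 * u[:-1]    # MA(1)

print(acf(y, nlags=4))   # lag-1 autocorrelation nonzero, higher lags near 0
```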
An Autoregressive and Moving Average
(ARMA) process.
• It is quite likely that y has characteristics of both
AR and MA, and is therefore ARMA.
• An ARMA(1,1) process would be written as:

$Y_t = \theta + \alpha_1 Y_{t-1} + \beta_0 u_t + \beta_1 u_{t-1}$

• where θ represents the constant term.
• This is an ARMA(1,1) process since there is only
one autoregressive and one moving average
term, but this can be generalized to an
ARMA(p,q) process with p autoregressive and q
moving average terms.
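
As an illustrative sketch (not from the slides), an ARMA(1,1) can be estimated with statsmodels' ARIMA class by setting the differencing order d to zero; the simulated coefficients below are assumptions.

```python
# Sketch: fit an ARMA(1,1) as ARIMA with order (1, 0, 1) in statsmodels.
# The simulated AR and MA coefficients (0.6 and 0.4) are assumptions.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(4)
n = 500
u = rng.normal(size=n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.6 * y[t - 1] + u[t] + 0.4 * u[t - 1]   # ARMA(1,1), zero mean

res = ARIMA(y, order=(1, 0, 1)).fit()
print(res.params)        # constant, ar.L1, ma.L1 and sigma2 estimates
```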
An Autoregressive Integrated Moving
Average (ARIMA) process
• The models that we have discussed above have been
based on the assumption that the time series is
weakly stationary, in the sense of having a constant
mean and variance and a time-invariant covariance.
• Most economic time series are, however,
nonstationary or integrated.
• We will be dealing with this in more detail later,
but if a time series is integrated of order 1, or I(1),
its first differences will be I(0) or stationary.
• More generally, if a time series is I(d) we will obtain
an I(0) series after differencing d times.
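
A minimal sketch of this point (added for illustration): a simulated random walk is I(1), and a single difference yields a stationary, I(0) series, as an augmented Dickey-Fuller test suggests.

```python
# Sketch: a random walk is I(1); differencing once gives an I(0) series.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
y = np.cumsum(rng.normal(size=500))   # random walk: nonstationary, I(1)
dy = np.diff(y)                       # first difference: stationary, I(0)

print("level p-value:", adfuller(y)[1])    # large: cannot reject a unit root
print("diff  p-value:", adfuller(dy)[1])   # tiny: stationarity is plausible
```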
An Autoregressive Integrated Moving
Average (ARIMA) process Cont.
• If a series has to be differenced d times to
make it stationary before applying an
ARMA(p,q) model to it, the original series
would be described as an ARIMA(p,d,q).
• This is a general expression, and any zero
values of the bracketed terms will indicate a
more specific type of series; for example, an
ARIMA(p,0,0) means a purely AR(p) process.
An Autoregressive Integrated Moving
Average (ARIMA) process Cont.
• Use of the Box-Jenkins (BJ) methodology
(covered below) requires that either the
original series is stationary or that it can
be made stationary after one or more
differencings.
• The objective of BJ is to identify and estimate
a statistical model which can be interpreted as
having generated the sample data.
An Autoregressive Integrated Moving
Average (ARIMA) process Cont.
• If this estimated model is to be used for
forecasting, we must assume that the features
of this model are constant through time,
particularly over future time periods.
• The reason for requiring stationary data is
simply that any model which is inferred from
these data can itself be interpreted as being
stationary or stable, and thus provide a valid
base for forecasting.
The Box-Jenkins (BJ) Methodology
• A crucial question when examining a series of
data is whether it follows:
• i. a pure AR process,
• ii. a pure MA process,
• iii. an ARMA process, or
• iv. an ARIMA process;
• we thus need to know the values of p, d and q
for the specific process.
The Box-Jenkins (BJ) Methodology Cont.
• The BJ methodology provides a means for establishing
this, and consists of four steps:
• Identification. This involves establishing the appropriate
values of p, d and q, for which the correlogram and partial
correlogram can be used to assist;
• Estimation. This involves estimating the parameters of the
autoregressive and moving average terms in the model.
• This can sometimes be achieved through using OLS, but on
occasion non-linear estimation methods will have to be
used;
• Diagnostic checking. Having chosen a particular ARIMA
model, we now establish whether or not the model fits
the data reasonably, as it is possible that another
ARIMA model may fit the data as well or better.
• One simple test of the model chosen is to see if
the residuals from the estimated model are white
noise. If they are, the model can be accepted; if not,
another model must be formulated.
• The BJ methodology is thus more of an art than a
science, and is an iterative process;
• Forecasting. A feature of the ARIMA model is its
success in forecasting, particularly for short-term
forecasts.
• Each case must, of course, be checked; a worked
sketch of all four steps follows.
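
The sketch below walks through the four BJ steps on simulated data; the tentative order (1,1,0), the lag choices and the statsmodels calls are illustrative assumptions rather than a prescription from the lecture.

```python
# Sketch of the four Box-Jenkins steps on a simulated ARIMA(1,1,0) series.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA
from statsmodels.tsa.stattools import acf, pacf
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(6)
n = 400
dy = np.zeros(n)
for t in range(1, n):
    dy[t] = 0.5 * dy[t - 1] + rng.normal()   # stationary AR(1) changes
y = np.cumsum(dy)                            # integrate once: the level is I(1)

# 1. Identification: correlograms of the differenced series suggest p and q.
d1 = np.diff(y)
print("ACF :", acf(d1, nlags=3))
print("PACF:", pacf(d1, nlags=3))

# 2. Estimation: fit the tentative ARIMA(1,1,0).
res = ARIMA(y, order=(1, 1, 0)).fit()
print(res.params)

# 3. Diagnostic checking: residuals should look like white noise
#    (a large Ljung-Box p-value means we cannot reject white noise).
print(acorr_ljungbox(res.resid, lags=[10]))

# 4. Forecasting: short-term forecasts from the accepted model.
print(res.forecast(steps=5))
```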
