
Univariate Time Series Analysis

Modeling & Forecasting


Time series analysis
• Sometimes we are interested in examining the behaviour of a series/variable (say, y_t)
for which it is very difficult to come up with a structural model
(e.g., y_t = b0 + b1*x_t + error_t) because:

• Measuring the explanatory variables needed to explain this variable is difficult;

• The frequency of such explanatory variables is not in tune with the variable to be
explained (y_t). For example, we may be interested in studying some variable using
daily data, while the variables needed are available only at a much lower frequency
(say, monthly or quarterly).

• Further, it is observed that structural models are not very good for 'out-of-sample'
forecasting.
Types of Time Series Models

• Stationary Time Series Models

• For example: AR, MA, ARMA, ARIMA, EWMA, ARCH, GARCH, EGARCH,
TGARCH, etc.

• Non-stationary Time Series Models

• For example: COINTEGRATION & VECTOR ERROR CORRECTION MODELS (VECM)
Stationarity condition for Auto-Regressive (AR) models

• Stationarity means that the series is fairly stable over time and, therefore,
its history is useful for future predictions; for example, we may attempt to
predict returns using only the information contained in their past values.

• The stationarity condition is needed to ensure STABILITY of the analysis.

Stationarity Conditions:

(A) Strict Stationarity: the entire joint distribution of the observations
(and hence all of its moments) is time invariant. That is, the joint probability
of observing n values is the same irrespective of time:

P{y_{t_1} ≤ b_1, ..., y_{t_n} ≤ b_n} = P{y_{t_1+m} ≤ b_1, ..., y_{t_n+m} ≤ b_n}

(B) Covariance Stationarity: assumes that only the first two moments are time
invariant. (Practically applicable.)
Covariance Stationarity
Covariance or Weakly Stationary Process

If a series satisfies the following three conditions, it is said to be a
weakly or covariance stationary series.

Mean (unconditional/long-run):

E(y_t) = μ (a finite constant), t = 1, 2, ...

Variance (unconditional/long-run):

E[(y_t − μ)(y_t − μ)] = σ² < ∞ (a finite constant)

Covariance:

E[(y_t − μ)(y_{t−s} − μ)] = γ_s
(depends only on the lag s, irrespective of the time at which it is measured), s = 1, 2, 3, ...
Non-Stationary Vs. Stationary Series

[Figure: example plots of a non-stationary process and a stationary process]
Stationarity of a Series
• Why do we need stationarity?
• Stationarity in the data leads to stable results. Non-stationary data may lead
to unstable coefficients (especially in univariate time series analysis).

• How do we detect it?

1. Correlograms (visual inspection)
• ACF plots
• PACF plots

2. Formal statistical tests for the stationarity of a series (see the sketch below):
• Dickey-Fuller (DF) test
• Augmented Dickey-Fuller (ADF) test

3. Roots or zeros of the 'lag polynomial' of a series.
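As an illustration, here is a minimal R sketch of checks 1 and 2, assuming the
series of interest is stored in a numeric vector y (a hypothetical name) and
the tseries package is installed:

```r
library(tseries)   # provides adf.test()

# 1. Visual inspection: correlograms
acf(y)    # a very slowly decaying ACF suggests non-stationarity
pacf(y)

# 2. Formal test: Augmented Dickey-Fuller
adf.test(y)   # H0: the series has a unit root (non-stationary)
```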


AUTO CORRELATION FUNCTION (ACF)
&
PARTIAL AUTO CORRELATION FUNCTION (PACF)

• The autocorrelation function shows the relation between the autocorrelation
coefficients and the corresponding lags. The autocorrelation between y_t and its
lagged value y_{t−k} (that is, at lag k) is determined as follows:

ρ_k = Cov(y_t, y_{t−k}) / Var(y_t) = γ_k / γ_0
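For instance, the lag-k autocorrelation can be computed by hand and compared
with R's built-in acf(); a minimal sketch, assuming a numeric series y and
k = 3 (both hypothetical):

```r
k    <- 3
ybar <- mean(y)
n    <- length(y)

# Sample autocorrelation at lag k: gamma_k / gamma_0
rho_k <- sum((y[(k + 1):n] - ybar) * (y[1:(n - k)] - ybar)) /
         sum((y - ybar)^2)

rho_k
acf(y, plot = FALSE)$acf[k + 1]   # same value; index 1 corresponds to lag 0
```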
Partial Auto Correlation Function (PACF)

• In time series data, a large proportion of the correlation between y_t and
y_{t−k} typically arises through their correlation with the intermediate lags.

• PACF measures the correlation between y_t and y_{t−k} after removing the
correlation that y_t has with the intermediate lags (y_{t−1}, y_{t−2}, ..., y_{t−k+1}).
ACF & PACF for Different Processes

Process       ACF                                      PACF
White noise   No significant coefficients              No significant coefficients
AR(2)         Geometrically declining or damped        First 2 PACF coefficients significant,
              sinusoid ACF                             all others insignificant
MA(1)         First ACF coefficient significant,       Geometrically declining or damped
              all others insignificant                 sinusoid PACF
ARMA(1,1)     Geometrically declining or damped        Geometrically declining or damped
              sinusoid ACF                             sinusoid PACF
Correlograms (ACF & PACF) of a Non-Stationary Time Series

[Figure: ACF & PACF correlograms of a non-stationary (unit root) series]

Correlograms (ACF & PACF) of a Stationary Time Series

[Figure: ACF & PACF correlograms of a stationary series]
FORMAL TESTS FOR STATIONARITY
Dickey Fuller & Augmented Dickey Fuller Tests

• AR models & the condition for stationarity

• AR(1) model: y_t = μ + φ_1 y_{t−1} + u_t

• Stationarity condition: |φ_1| < 1

• Testing the unit-root hypothesis using the Dickey-Fuller (DF) test, based on
the regression Δy_t = ψ y_{t−1} + u_t:

Null hypothesis (H0: ψ = 0, a unit root)   Alternative hypothesis (H1: ψ < 0, stationary)
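To illustrate (not part of the original slide), a small simulation sketch: a
random walk should fail to reject the unit-root null, while a stationary AR(1)
should reject it. It assumes the tseries package is loaded as above:

```r
library(tseries)
set.seed(1)

rw  <- cumsum(rnorm(500))                  # random walk: phi = 1 (unit root)
ar1 <- arima.sim(list(ar = 0.5), n = 500)  # stationary AR(1), phi = 0.5

adf.test(rw)    # expect a large p-value: cannot reject H0 (unit root)
adf.test(ar1)   # expect a small p-value: reject H0 in favour of stationarity
```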
AR(p) model

y_t = μ + φ_1 y_{t−1} + φ_2 y_{t−2} + ... + φ_p y_{t−p} + u_t

MA(q) models, e.g. an MA(2) with a constant:

r_t = θ_0 + θ_1 e_{t−1} + θ_2 e_{t−2} + e_t
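A quick simulation sketch of an MA(2) like the one above (the coefficient
values θ_1 = 0.6 and θ_2 = 0.3 are illustrative, not from the slide):

```r
set.seed(2)
ma2 <- arima.sim(list(ma = c(0.6, 0.3)), n = 500)

acf(ma2)    # expect significant spikes at lags 1 and 2 only
pacf(ma2)   # expect a geometric decay
```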
Diagnostics of the fitted Model
Joint Hypothesis Tests

• If the model is fitted correctly, the remaining error term should be a white noise process;

• This can be verified using the ACF & PACF of the residuals.

• Or, we can test the joint hypothesis that the first m autocorrelation coefficients are
simultaneously equal to zero using the Q-statistic, called the Ljung-Box statistic:

Q = T(T + 2) Σ_{k=1}^{m} ρ̂_k² / (T − k)  ~  χ²_m

where T = sample size, m = maximum lag length

The null hypothesis tested is:

H0: ρ_1 = ρ_2 = ρ_3 = ... = ρ_m = 0

• This statistic is very useful as a portmanteau (general) test of linear dependence in time series.
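In R, the Ljung-Box statistic is available through Box.test(); a minimal sketch
on the residuals of a fitted model (the series y and the ARMA(1,1) order are
hypothetical):

```r
fit <- arima(y, order = c(1, 0, 1))   # fit an ARMA(1,1), say

# m = 10 lags; fitdf = p + q corrects the chi-square df for estimated coefficients
Box.test(residuals(fit), lag = 10, type = "Ljung-Box", fitdf = 2)
```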
Summary of the Behaviour of Correlograms for
AR and MA Processes

An autoregressive process has

• a geometrically decaying ACF
• number of significant spikes of the PACF = AR order

A moving average process has

• number of significant spikes of the ACF = MA order
• a geometrically decaying PACF
MA(1) model with Theta = 0.5

[Figure: ACF and PACF plots of a simulated MA(1) series ("Series MA"), lags 0-50]
MA(1) model with Theta = -0.5

[Figure: ACF and PACF plots of a simulated MA(1) series ("Series MA"), lags 0-40]
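Correlograms like the two figures above can be reproduced with a short
simulation; a sketch (the sample size and seed are arbitrary):

```r
set.seed(3)
ma_pos <- arima.sim(list(ma = 0.5),  n = 500)   # MA(1) with theta = +0.5
ma_neg <- arima.sim(list(ma = -0.5), n = 500)   # MA(1) with theta = -0.5

acf(ma_pos);  pacf(ma_pos)   # single positive ACF spike at lag 1
acf(ma_neg);  pacf(ma_neg)   # single negative ACF spike at lag 1
```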
ARMA Processes
• ARMA models are capable of capturing very complex patterns of temporal
correlation.
• Thus, they are a useful and interesting class of models.
• In fact, they can capture any valid autocorrelation!
• By combining the AR(p) and MA(q) models, we can obtain an ARMA(p, q) model:

φ(L) y_t = μ + θ(L) u_t

where
φ(L) = 1 − φ_1 L − φ_2 L² − ... − φ_p L^p
and
θ(L) = 1 + θ_1 L + θ_2 L² + ... + θ_q L^q

or

y_t = μ + φ_1 y_{t−1} + φ_2 y_{t−2} + ... + φ_p y_{t−p} + θ_1 u_{t−1} + θ_2 u_{t−2} + ... + θ_q u_{t−q} + u_t

with
E(u_t) = 0;  E(u_t²) = σ²;  E(u_t u_s) = 0 for t ≠ s
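For instance, setting p = q = 1 gives the ARMA(1,1) model:

(1 − φ_1 L) y_t = μ + (1 + θ_1 L) u_t,  i.e.  y_t = μ + φ_1 y_{t−1} + θ_1 u_{t−1} + u_t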
Estimating ARMA model

• Step 1: Prepare the data
• Step 2: Install the required packages: tseries & forecast
• Step 3: Import the data into RStudio
• Step 4: Check stationarity (inspect the ACF & PACF plots, check whether the mean is
constant, and run an ADF test)
• Step 5.1: If the data are stationary, apply auto.arima to find the best-fitting ARMA model
• Step 5.2: If the data are not stationary, take the log difference of the series and check
stationarity again; if the differenced series is stationary, apply auto.arima to find the
best-fitting ARMA model
• Step 6: Develop the ARMA model
• Step 7: Run diagnostics on the fitted model (verify using the ACF & PACF of the residuals,
or test with the Ljung-Box statistic; see the sketch below)
• Check the robustness of the model.
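A minimal end-to-end sketch of these steps in R (the file name series.csv and
the column name value are hypothetical):

```r
library(tseries)    # adf.test()
library(forecast)   # auto.arima(), checkresiduals()

# Steps 1-3: prepare and import the data
df <- read.csv("series.csv")
y  <- df$value

# Step 4: stationarity check
acf(y); pacf(y)
adf.test(y)

# Step 5.2: if non-stationary, log-difference and re-check
r <- diff(log(y))
adf.test(r)

# Steps 5-6: let auto.arima() pick the best-fitting model
fit <- auto.arima(r)
summary(fit)

# Step 7: diagnostics - the residuals should be white noise
checkresiduals(fit)   # residual ACF plot plus a Ljung-Box test
```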
