Univariate Time Series Analysis
• Further, it is observed that structural models are not very good for ‘out-of-sample’
forecasts.
Types of Time Series Models
• For example: AR, MA, ARMA, ARIMA, EWMA, ARCH, GARCH, EGARCH,
TGARCH, etc.
• The condition of stationarity means that the series is fairly stable over time
and, therefore, its history is useful for future predictions: for example, we
attempt to predict returns using only information contained in their past values.
Stationarity Conditions:
(A) Strictly Stationary: Assumes that the entire joint distribution of the series is
time invariant. (Rarely verifiable in practice)
(B) Covariance Stationary: Assumes that only the first two moments are time
invariant. (Practically applicable)
Covariance Stationarity
Covariance or Weakly Stationary Process
Covariance:
Cov(y_t, y_{t−s}) = E[(y_t − μ)(y_{t−s} − μ)] = γ_s
(depends only on the lag s = 1, 2, 3, …, irrespective of the time t at which it is measured)
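A minimal numpy sketch (my own illustration, not from the slides) of what covariance stationarity means in practice: for a simulated stationary AR(1) series, the sample autocovariance at lag s matches the theoretical γ_s = φ^s σ² / (1 − φ²), whichever stretch of the history it is computed from.

```python
import numpy as np

# Simulate a stationary AR(1): y_t = phi * y_{t-1} + u_t, |phi| < 1.
rng = np.random.default_rng(0)
T, phi = 50_000, 0.5
u = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi * y[t - 1] + u[t]

def autocov(x, s):
    """Sample autocovariance at lag s."""
    xc = x - x.mean()
    return np.mean(xc[s:] * xc[:len(xc) - s])

# Theoretical gamma_s for AR(1) with sigma^2 = 1: phi**s / (1 - phi**2).
def gamma_theory(s):
    return phi**s / (1 - phi**2)

print([round(autocov(y, s), 3) for s in (0, 1, 2)])
print([round(gamma_theory(s), 3) for s in (0, 1, 2)])
```

With a long sample the estimates sit close to γ₀ ≈ 1.333 and γ₁ ≈ 0.667, and depend only on the lag, not on t.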
Non-Stationary Vs. Stationary Series
Unit root
Correlograms (ACF & PACF) of a
Stationary Time Series
FORMAL TESTS FOR STATIONARITY
Dickey Fuller & Augmented Dickey Fuller Tests
• AR(1) model: y_t = φ₁ y_{t−1} + u_t
• Stationarity Condition: |φ₁| < 1
• AR(p) model: y_t = φ₁ y_{t−1} + φ₂ y_{t−2} + … + φ_p y_{t−p} + u_t
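A hedged sketch of the Dickey–Fuller idea (the helper below is my own illustration): regress Δy_t on y_{t−1} with an intercept and examine the t-ratio on y_{t−1}. Under the unit-root null this ratio does not follow a standard t distribution, so it must be compared with Dickey–Fuller critical values (roughly −2.86 at 5% with an intercept).

```python
import numpy as np

def df_tstat(y):
    """t-ratio on y_{t-1} in the regression of diff(y) on [1, y_{t-1}]."""
    dy = np.diff(y)
    X = np.column_stack([np.ones(len(dy)), y[:-1]])
    beta, *_ = np.linalg.lstsq(X, dy, rcond=None)
    resid = dy - X @ beta
    s2 = resid @ resid / (len(dy) - X.shape[1])
    cov = s2 * np.linalg.inv(X.T @ X)
    return beta[1] / np.sqrt(cov[1, 1])

rng = np.random.default_rng(1)
u = rng.standard_normal(3000)
random_walk = np.cumsum(u)          # unit root: coefficient on y_{t-1} is 0
stationary = np.zeros(3000)
for t in range(1, 3000):
    stationary[t] = 0.5 * stationary[t - 1] + u[t]

# The stationary series produces a large negative t-ratio; the random walk
# typically does not get past the Dickey-Fuller critical values.
print(df_tstat(random_walk), df_tstat(stationary))
```

In practice one would use a packaged routine (e.g. an ADF test with lagged differences added to soak up serial correlation); the point here is only the shape of the test regression.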
MA (q) models
r_t = θ₀ + θ₁·e_{t−1} + θ₂·e_{t−2} + e_t   (an MA(2) model)
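A quick simulation (my own illustration, taking θ₀ = 0) of the defining fingerprint of an MA(q) process: its autocorrelation function cuts off after lag q.

```python
import numpy as np

rng = np.random.default_rng(2)
T = 20_000
theta1, theta2 = 0.5, 0.25
e = rng.standard_normal(T + 2)
# MA(2): r_t = e_t + theta1 * e_{t-1} + theta2 * e_{t-2}
r = e[2:] + theta1 * e[1:-1] + theta2 * e[:-2]

def acf(x, k):
    """Sample autocorrelation at lag k."""
    xc = x - x.mean()
    return (xc[k:] @ xc[:len(xc) - k]) / (xc @ xc)

# Theory: rho_1 = (theta1 + theta1*theta2) / (1 + theta1**2 + theta2**2) ~ 0.476,
#         rho_2 = theta2 / (1 + theta1**2 + theta2**2) ~ 0.190,
#         rho_k = 0 for every k > 2 (the cutoff).
print([round(acf(r, k), 3) for k in (1, 2, 3, 4)])
```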
Diagnostics of the fitted Model
Joint Hypothesis Tests
• If the model is fitted correctly, the error term left over should be a white noise process;
• Or, we can test the joint hypothesis that the first m autocorrelation coefficients, τ_k, are
simultaneously equal to zero using the Q-statistic, called the Ljung–Box statistic:
Q = T(T + 2) Σ_{k=1}^{m} τ_k² / (T − k)  ~  χ²(m)
• This statistic is very useful as a portmanteau (general) test of linear dependence in time series.
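The Ljung–Box statistic can be computed directly from the sample autocorrelations, as in this numpy sketch (my own illustration; the chi-squared(10) 5% critical value is hard-coded rather than taken from a distribution library):

```python
import numpy as np

def ljung_box_Q(x, m):
    """Ljung-Box Q = T(T+2) * sum_{k=1}^{m} tau_k^2 / (T - k)."""
    T = len(x)
    xc = x - x.mean()
    denom = xc @ xc
    tau = np.array([(xc[k:] @ xc[:len(xc) - k]) / denom for k in range(1, m + 1)])
    return T * (T + 2) * np.sum(tau**2 / (T - np.arange(1, m + 1)))

rng = np.random.default_rng(3)
noise = rng.standard_normal(1000)
ar1 = np.zeros(1000)
for t in range(1, 1000):
    ar1[t] = 0.5 * ar1[t - 1] + noise[t]

CRIT_5PCT = 18.31  # chi-squared(10) critical value at the 5% level
print(ljung_box_Q(noise, 10))  # typically below 18.31 for white noise
print(ljung_box_Q(ar1, 10))    # far above: strong linear dependence
```

Applied to fitted-model residuals, a Q below the critical value is consistent with the residuals being white noise, i.e. with the model having captured the linear dependence.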
Summary of the Behaviour of Correlograms for
AR and MA Processes
[Figure: ACF (left) and PACF (right) correlograms of a simulated MA series, lags 0–50]
MA(1) model with Theta = -0.5
[Figure: ACF (left) and PACF (right) correlograms of the simulated MA(1) series, lags 0–40]
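The single negative spike in the MA(1) correlogram has a closed form, which this small check (my own illustration) reproduces: for y_t = u_t + θ·u_{t−1}, ρ₁ = θ / (1 + θ²) and ρ_k = 0 for k > 1, while the PACF decays geometrically rather than cutting off.

```python
# Theoretical lag-1 autocorrelation of an MA(1) process with theta = -0.5,
# matching the correlogram above: rho_1 = theta / (1 + theta^2).
theta = -0.5
rho_1 = theta / (1 + theta**2)
print(rho_1)  # -0.4
```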
ARMA Processes
• ARMA models are capable of capturing very complex patterns of temporal
correlation.
• Thus, they are a useful and interesting class of models.
• In fact, they can capture any valid autocorrelation!
• By combining the AR(p) and MA(q) models, we can obtain an ARMA(p,q)
model: φ(L) y_t = θ(L) u_t
where
φ(L) = 1 − φ₁L − φ₂L² − … − φ_p Lᵖ
and
θ(L) = 1 + θ₁L + θ₂L² + … + θ_q Lᵠ
or
y_t = φ₁y_{t−1} + φ₂y_{t−2} + … + φ_p y_{t−p} + θ₁u_{t−1} + θ₂u_{t−2} + … + θ_q u_{t−q} + u_t
with
E(u_t) = 0;  E(u_t²) = σ²;  E(u_t u_s) = 0 for t ≠ s
Estimating ARMA model
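A hedged sketch of one way to estimate an ARMA(1,1): conditional sum of squares (CSS). Given candidate values (φ, θ), back out the implied residuals recursively and pick the pair that minimises the residual sum of squares. The crude grid search below is purely illustrative; in practice one would use a numerical optimiser or a packaged ARMA routine (exact maximum likelihood is also common).

```python
import numpy as np

# Simulate an ARMA(1,1): y_t = phi*y_{t-1} + u_t + theta*u_{t-1}.
rng = np.random.default_rng(4)
T, phi_true, theta_true = 2000, 0.6, 0.3
u = rng.standard_normal(T)
y = np.zeros(T)
for t in range(1, T):
    y[t] = phi_true * y[t - 1] + u[t] + theta_true * u[t - 1]

def css(y, phi, theta):
    """Conditional sum of squared residuals, initialising u_0 = 0."""
    u_hat = np.zeros(len(y))
    for t in range(1, len(y)):
        u_hat[t] = y[t] - phi * y[t - 1] - theta * u_hat[t - 1]
    return np.sum(u_hat**2)

# Grid search over (phi, theta) in steps of 0.1 inside the invertible/
# stationary region.
grid = np.round(np.arange(-0.9, 0.91, 0.1), 1)
_, phi_hat, theta_hat = min((css(y, p, q), p, q) for p in grid for q in grid)
print(phi_hat, theta_hat)  # close to the true (0.6, 0.3)
```

The residual recursion u_t = y_t − φy_{t−1} − θu_{t−1} is exactly the ARMA(1,1) equation solved for u_t, which is why minimising Σu_t² identifies the parameters.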