
UNIVERSITY OF ZIMBABWE

BSc Honours in Statistics Level 2 HSTS203

TIME SERIES ANALYSIS

November/December 2018
Time: 2 hours

Candidates should attempt ALL questions in section A and TWO questions in section B.
Marks will be allocated as indicated

SECTION A (40 marks)


Candidates should attempt ALL questions, being careful to number them A1 to A5.

A1. Let {at } be zero mean white noise. Show that the autocorrelation functions for the
following processes are the same:

(a) Zt = at + (1/4)at−1 . [2]


(b) Zt = at + 4at−1 . [2]
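As a numerical sanity check (a Python sketch; the helper name is illustrative, not part of the paper): an MA(1) process Zt = at + θat−1 has ρ1 = θ/(1 + θ²) and ρk = 0 for |k| ≥ 2, and the reciprocal pair θ = 1/4 and θ = 4 give identical values.

```python
# ACF of an MA(1) process Z_t = a_t + theta * a_{t-1}:
# rho_1 = theta / (1 + theta**2) and rho_k = 0 for |k| >= 2.
def ma1_rho1(theta: float) -> float:
    return theta / (1 + theta ** 2)

# theta = 1/4 and theta = 4 are reciprocals, so their ACFs coincide:
print(ma1_rho1(0.25), ma1_rho1(4.0))  # both 4/17 ≈ 0.2353
```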

A2. Write down the equations for the following models:

(a) ARIMA(1,1,2). [4]


(b) ARIMA(1,1,0)(0,1,1)₄. [4]

A3. Suppose {Yt } is an integrated process given by:

Yt = 2Yt−1 − 1.25Yt−2 + 0.25Yt−3 + at

where {at } is a white noise process with E(at ) = 0 and Var(at ) = σa². Show that Yt is
non-stationary and hence find the order of integration. [4,1]
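The characteristic roots can be checked numerically (a sketch, assuming NumPy is available):

```python
import numpy as np

# phi(B) = 1 - 2B + 1.25B^2 - 0.25B^3 factors as -0.25(B - 1)(B - 2)^2,
# so there is a single unit root (non-stationary) and differencing once
# leaves a stationary AR(2): Y_t is integrated of order 1.
# np.roots takes coefficients from the highest power down.
roots = np.roots([-0.25, 1.25, -2.0, 1.0])
print(sorted(roots.real))  # one root at B = 1, a double root at B = 2
```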

A4. Consider the following AR(2) process


Zt = Zt−1 − (1/2)Zt−2 + at
(a) Show that the model is stationary. [3]


(b) Showing all your working, deduce that the autocorrelation function of Zt is given
by:

ρk = (1/2)^(|k|/2) [cos(|k|π/4) + (1/3) sin(|k|π/4)] for k = 0, ±1, ±2, . . . [8]
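The stated ACF can be checked against the Yule-Walker recursion ρk = ρk−1 − (1/2)ρk−2 (a Python sketch):

```python
import math

# Claimed ACF of Z_t = Z_{t-1} - (1/2) Z_{t-2} + a_t.
def rho(k: int) -> float:
    k = abs(k)
    return 0.5 ** (k / 2) * (math.cos(k * math.pi / 4)
                             + (1 / 3) * math.sin(k * math.pi / 4))

# rho(0) = 1, rho(1) = phi1 / (1 - phi2) = 1 / (3/2) = 2/3,
# and the Yule-Walker recursion holds for k >= 2:
assert abs(rho(0) - 1.0) < 1e-12
assert abs(rho(1) - 2 / 3) < 1e-9
for k in range(2, 12):
    assert abs(rho(k) - (rho(k - 1) - 0.5 * rho(k - 2))) < 1e-9
```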

A5. Consider an AR(1) model given by:

Zt = φZt−1 + at

(a) Use the method of Least Squares Estimation to estimate the parameter φ. [6]
(b) Use the method of Maximum Likelihood Estimation to estimate the
parameter φ. [6]
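For part (a), a standard closed form is the conditional least squares estimator; a minimal sketch (the noiseless series below is only for illustration):

```python
# Conditional least squares for Z_t = phi Z_{t-1} + a_t: minimising
# sum_t (Z_t - phi Z_{t-1})^2 over phi gives
# phi_hat = sum_t Z_t Z_{t-1} / sum_t Z_{t-1}^2.
def ar1_cls(z) -> float:
    num = sum(z[t] * z[t - 1] for t in range(1, len(z)))
    den = sum(z[t - 1] ** 2 for t in range(1, len(z)))
    return num / den

# On a noiseless path z_t = 0.6 z_{t-1} the estimator recovers phi exactly:
z = [1.0]
for _ in range(20):
    z.append(0.6 * z[-1])
print(ar1_cls(z))  # ≈ 0.6
```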

SECTION B (60 marks)


Candidates should attempt TWO questions, being careful to number them B6 to B8.

B6. (a) Given a general linear process

Zt = Σ_{j=0}^∞ ψj at−j = ψ(B)at

where Σ_{j=0}^∞ ψj² < ∞ and {at } is a white noise process with variance σa².
(i) Show that the process is stationary. [10]
(ii) Let γ(B) be the autocovariance generating function. Show that

γ(B) = σa² ψ(B)ψ(B⁻¹). [4]
(b) Let {Zt } be a general linear process expressed as:

Zt = Σ_{j=0}^∞ (1/3)^j at−j

where {at } is a white noise process. Find the autocovariance generating function. [8]
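A candidate answer here, γ(B) = σa² / ((1 − B/3)(1 − B⁻¹/3)), corresponds to autocovariances γk = (9/8)(1/3)^|k| σa²; a truncated-sum check (Python sketch, σa² = 1 assumed):

```python
# psi_j = (1/3)^j, so gamma_k = sigma_a^2 * sum_j psi_j * psi_{j+k};
# this should match the candidate closed form (9/8) * (1/3)^k.
def gamma(k: int, n_terms: int = 200) -> float:
    return sum((1 / 3) ** j * (1 / 3) ** (j + k) for j in range(n_terms))

for k in range(6):
    assert abs(gamma(k) - (9 / 8) * (1 / 3) ** k) < 1e-12
```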
(c) Let {γ(k)} be the autocovariance function of a stationary time series {Zt }.
(i) Define the autocovariance generating function of Zt . [2]
(ii) Suppose

γ(k) = 1 for k = 0,  γ(k) = 1/2 for k = ±1,  γ(k) = 0 otherwise.

Find the autocovariance generating function of Zt . [6]
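A candidate answer, γ(B) = 1 + (1/2)(B + B⁻¹), factors as (1/2)(1 + B)(1 + B⁻¹); a quick numerical check of the factorisation (Python sketch):

```python
# Candidate ACGF: gamma(B) = 1 + (1/2)(B + 1/B).
def acgf(b: float) -> float:
    return 1.0 + 0.5 * (b + 1.0 / b)

# It factors as (1/2)(1 + B)(1 + 1/B), i.e. the ACGF of an MA(1)
# Z_t = a_t + a_{t-1} with sigma_a^2 = 1/2:
def factored(b: float) -> float:
    return 0.5 * (1.0 + b) * (1.0 + 1.0 / b)

for b in (0.3, 0.7, 1.5, 2.0):
    assert abs(acgf(b) - factored(b)) < 1e-12
```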


B7. (a) Suppose {Zt } is an ARMA(1,1) model expressed as

Zt = φZt−1 + at − θat−1

where {at } is a white noise process. Show that the autocorrelation function of Zt
is given by
ρk = [(1 − θφ)(φ − θ) / (1 − 2θφ + θ²)] φ^(k−1) for k ≥ 1 [10]
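The formula can be verified through the ψ-weights ψ0 = 1, ψj = (φ − θ)φ^(j−1) for j ≥ 1 (a Python sketch at illustrative values φ = 0.8, θ = 0.3):

```python
# psi-weights of Z_t = phi Z_{t-1} + a_t - theta a_{t-1}:
# psi_0 = 1, psi_j = (phi - theta) phi^(j-1) for j >= 1.
phi, theta = 0.8, 0.3  # illustrative values only

def psi(j: int) -> float:
    return 1.0 if j == 0 else (phi - theta) * phi ** (j - 1)

def gamma(k: int, n: int = 500) -> float:  # in units of sigma_a^2
    return sum(psi(j) * psi(j + k) for j in range(n))

def rho_formula(k: int) -> float:
    return ((1 - theta * phi) * (phi - theta)
            / (1 - 2 * theta * phi + theta ** 2)) * phi ** (k - 1)

# The truncated-sum ACF matches the closed form for k = 1..5:
for k in range(1, 6):
    assert abs(gamma(k) / gamma(0) - rho_formula(k)) < 1e-8
```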
(b) Consider the second order moving average process {Zt } given by:

Zt = at + 0.7at−1 − 0.2at−2

where {at } is a white noise process. Find


(i) the mean of Zt . [1]
(ii) the variance of Zt . [2]
(iii) the autocorrelation function of Zt . [7]
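The three quantities follow from the ψ-weights (1, 0.7, −0.2); a Python sketch, with covariances in units of σa²:

```python
# Z_t = a_t + 0.7 a_{t-1} - 0.2 a_{t-2}; psi = (1, 0.7, -0.2).
psi = [1.0, 0.7, -0.2]

def gamma(k: int) -> float:  # in units of sigma_a^2
    return sum(psi[j] * psi[j + k] for j in range(len(psi) - k))

var = gamma(0)              # 1 + 0.49 + 0.04 = 1.53
rho1 = gamma(1) / var       # (0.7 - 0.14) / 1.53 = 0.56 / 1.53
rho2 = gamma(2) / var       # -0.20 / 1.53
# E[Z_t] = 0 since E[a_t] = 0, and rho_k = 0 for |k| > 2.
print(var, rho1, rho2)
```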
(c) Consider the following AR(2) process
Zt = (2/5)Zt−1 − (1/25)Zt−2 + at
(i) Show that the model is stationary. [2]
(ii) Showing all your working, deduce that the autocorrelation function of Zt is
given by:

ρk = (1 + (24/26)|k|)(1/5)^|k| for k = 0, ±1, ±2, . . . [8]
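The stated ACF can be checked against the recursion ρk = (2/5)ρk−1 − (1/25)ρk−2 (a Python sketch):

```python
# Claimed ACF of Z_t = (2/5) Z_{t-1} - (1/25) Z_{t-2} + a_t.
def rho(k: int) -> float:
    k = abs(k)
    return (1 + (24 / 26) * k) * (1 / 5) ** k

# rho(1) should equal phi1 / (1 - phi2) = (2/5) / (26/25) = 10/26,
# and the Yule-Walker recursion must hold for k >= 2:
assert abs(rho(0) - 1.0) < 1e-12
assert abs(rho(1) - 10 / 26) < 1e-12
for k in range(2, 12):
    assert abs(rho(k) - ((2 / 5) * rho(k - 1) - (1 / 25) * rho(k - 2))) < 1e-9
```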

B8. (a) For an AR(p) process given by:

Zt = φ1 Zt−1 + φ2 Zt−2 + . . . + φp Zt−p + at

(i) Determine the Yule-Walker equations for this process. [4]


(ii) Show how the method of moments is used to estimate the parameters φ1 to
φp . [8]
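The method-of-moments step can be sketched numerically: substitute (sample) autocorrelations into the Yule-Walker system Rφ = r and solve (assuming NumPy; here the exact ACF of the AR(2) in B7(c) is fed in, so the true coefficients are recovered):

```python
import numpy as np

# Yule-Walker: rho_k = phi_1 rho_{k-1} + ... + phi_p rho_{k-p} for k = 1..p,
# i.e. R phi = r with R[i][j] = rho_{|i-j|}.
def rho(k: int) -> float:  # exact ACF of the AR(2) in B7(c)
    k = abs(k)
    return (1 + (24 / 26) * k) * (1 / 5) ** k

p = 2
R = np.array([[rho(i - j) for j in range(p)] for i in range(p)])
r = np.array([rho(k) for k in range(1, p + 1)])
phi_hat = np.linalg.solve(R, r)
print(phi_hat)  # ≈ [0.4, -0.04], i.e. (2/5, -1/25)
```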
(b) Suppose a series Zt , Zt−1 , . . . comes from an AR(1) process given by

(1 − φ1 B)(Zt − µ) = at

(i) Find the l-step ahead forecast Ẑt (l) of Zt+l and the variance Var[et (l)],
l = 1, 2, 3, where et (l) = Zt+l − Ẑt (l) is the l-step ahead forecast error. [10]
(ii) Show that Ẑt (l) is the minimum mean square error forecast. [8]
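For part (i), a standard closed form is Ẑt(l) = µ + φ^l(Zt − µ) with Var[et(l)] = σa²(1 − φ^(2l))/(1 − φ²); a Python sketch at illustrative parameter values:

```python
# AR(1): (1 - phi B)(Z_t - mu) = a_t.  Standard forecasting results:
# Z_hat_t(l) = mu + phi^l (Z_t - mu),
# Var[e_t(l)] = sigma_a^2 (1 + phi^2 + ... + phi^(2(l-1)))
#             = sigma_a^2 (1 - phi^(2l)) / (1 - phi^2).
phi, mu, sigma2 = 0.7, 5.0, 1.0  # illustrative values only

def forecast(z_t: float, l: int) -> float:
    return mu + phi ** l * (z_t - mu)

def forecast_var(l: int) -> float:
    return sigma2 * (1 - phi ** (2 * l)) / (1 - phi ** 2)

# The closed form agrees with the finite psi-weight sum for l = 1, 2, 3:
for l in (1, 2, 3):
    direct = sigma2 * sum(phi ** (2 * j) for j in range(l))
    assert abs(forecast_var(l) - direct) < 1e-12
```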

END OF EXAMINATION

