Springer Series in Statistics
Andrews/Herzberg: Data: A Collection of Problems from Many Fields for the Student and
Research Worker.
Anscombe: Computing in Statistical Science through APL.
Berger: Statistical Decision Theory and Bayesian Analysis, 2nd edition.
Brémaud: Point Processes and Queues: Martingale Dynamics.
Brockwell/Davis: Time Series: Theory and Methods.
Dzhaparidze: Parameter Estimation and Hypothesis Testing in Spectral Analysis of
Stationary Time Series.
Farrell: Multivariate Calculation.
Goodman/Kruskal: Measures of Association for Cross Classifications.
Hartigan: Bayes Theory.
Heyer: Theory of Statistical Experiments.
Jolliffe: Principal Component Analysis.
Kres: Statistical Tables for Multivariate Analysis.
Leadbetter/Lindgren/Rootzén: Extremes and Related Properties of Random Sequences
and Processes.
LeCam: Asymptotic Methods in Statistical Decision Theory.
Manoukian: Modern Concepts and Theorems of Mathematical Statistics.
Miller, Jr.: Simultaneous Statistical Inference, 2nd edition.
Mosteller/Wallace: Applied Bayesian and Classical Inference: The Case of The Federalist
Papers.
Pollard: Convergence of Stochastic Processes.
Pratt/Gibbons: Concepts of Nonparametric Theory.
Sachs: Applied Statistics: A Handbook of Techniques, 2nd edition.
Seneta: Non-Negative Matrices and Markov Chains.
Siegmund: Sequential Analysis: Tests and Confidence Intervals.
Vapnik: Estimation of Dependences Based on Empirical Data.
Wolter: Introduction to Variance Estimation.
Yaglom: Correlation Theory of Stationary and Related Random Functions I: Basic
Results.
Yaglom: Correlation Theory of Stationary and Related Random Functions II:
Supplementary Notes and References.
Peter J. Brockwell
Richard A. Davis
Time Series:
Theory and Methods
With 124 Illustrations
Springer-Verlag
New York Berlin Heidelberg London Paris Tokyo

Peter J. Brockwell
Richard A. Davis
Department of Statistics
Colorado State University
Fort Collins, Colorado 80523
USA
AMS Classification: 62-01, 62M10
Library of Congress Cataloging in Publication Data
Brockwell, Peter J.
Time series.
(Springer series in statistics)
Bibliography: p.
Includes index.
1. Time-series analysis. I. Davis, Richard A.
II. Title. III. Series.
QA280.B76 1987 519.5'5 86-22047
© 1987 by Springer-Verlag New York Inc.
All rights reserved. This work may not be translated or copied in whole or in part without the
written permission of the publisher (Springer-Verlag, 175 Fifth Avenue, New York, New York
10010, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in
connection with any form of information storage and retrieval, electronic adaptation, computer
software, or by similar or dissimilar methodology now known or hereafter developed is forbidden.
Typeset by Asco Trade Typesetting Ltd., Hong Kong.
Printed and bound by R.R. Donnelley & Sons, Harrisonburg, Virginia.
Printed in the United States of America.
987654321
ISBN 0-387-96406-1 Springer-Verlag New York Berlin Heidelberg
ISBN 3-540-96406-1 Springer-Verlag Berlin Heidelberg New York

To our families

Preface
We have attempted in this book to give a systematic account of linear time
series models and their application to the modelling and prediction of data
collected sequentially in time. The aim is to provide specific techniques for
handling data and at the same time to provide a thorough understanding of
the mathematical basis for the techniques. Both time and frequency domain
methods are discussed but the book is written in such a way that either
approach could be emphasized. The book is intended to be a text for graduate
students in statistics, mathematics, engineering, and the natural or social
sciences. It has been used both at the M.S. level, emphasizing the more
practical aspects of modelling, and at the Ph.D. level, where the detailed
mathematical derivations of the deeper results can be included.
Distinctive features of the book are the extensive use of elementary Hilbert
space methods and recursive prediction techniques based on innovations, use
of the exact Gaussian likelihood and AIC for inference, a thorough treatment
of the asymptotic behavior of the maximum likelihood estimators of the
coefficients of univariate ARMA models, extensive illustrations of the tech-
niques by means of numerical examples, and a large number of problems for
the reader. The companion diskette contains programs written for the IBM
PC, which can be used to apply the methods described in the text. Data sets
can be found in the Appendix, and a more extensive collection (including most
of those used for the examples in Chapters 1, 9, 10, 11 and 12) is on the diskette.
Simulated ARMA series can easily be generated and filed using the program
PEST. Valuable sources of additional time-series data are the collections of
Makridakis et al. (1984) and Working Paper 109 (1984) of Scientific Computing
Associates, DeKalb, Illinois.
Most of the material in the book is by now well-established in the time
series literature and we have therefore not attempted to give credit for all the
results discussed. Our indebtedness to the authors of some of the well-known
existing books on time series, in particular Anderson, Box and Jenkins, Fuller,
Grenander and Rosenblatt, Hannan, Koopmans and Priestley will however
be apparent. We were also fortunate to have access to notes on time series by
W. Dunsmuir. To these and to the many other sources that have influenced
our presentation of the subject we express our thanks.
Recursive techniques based on the Kalman filter and state-space represen-
tations of ARMA processes have played an important role in many recent
developments in time series analysis. In particular the Gaussian likelihood of
a time series can be expressed very simply in terms of the one-step linear
predictors and their mean squared errors, both of which can be computed
recursively using a Kalman filter. Instead of using a state-space representation
for recursive prediction we utilize the innovations representation of an arbi-
trary Gaussian time series in order to compute best linear predictors and exact
Gaussian likelihoods. This approach, developed by Rissanen and Barbosa,
Kailath, Ansley and others, expresses the value of the series at time t in terms
of the one-step prediction errors up to that time. This representation provides
insight into the structure of the time series itself as well as leading to simple
algorithms for simulation, prediction and likelihood calculation.
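To sketch the idea (the calculation is carried out in detail in Section 8.6): if X̂_t denotes the one-step predictor of X_t based on X_1, ..., X_{t-1}, and v_{t-1} is its mean squared error, then for a zero-mean Gaussian series the likelihood of (X_1, ..., X_n) factorizes as

    L = (2\pi)^{-n/2} \Big( \prod_{t=1}^{n} v_{t-1} \Big)^{-1/2}
        \exp\Big\{ -\frac{1}{2} \sum_{t=1}^{n} (X_t - \hat{X}_t)^2 / v_{t-1} \Big\},

so that once the predictors and their mean squared errors have been computed recursively, the likelihood is evaluated without inverting the covariance matrix.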
These algorithms are used in the parameter estimation program (PEST)
found on the companion diskette. Given a data set of up to 2300 observations,
the program can be used to find preliminary, least squares and maximum
Gaussian likelihood estimators of the parameters of any prescribed ARIMA
model for the data, and to predict future values. It can also be used to simulate
values of an ARMA process and to compute and plot its theoretical auto-
covariance and spectral density functions. Data can be plotted, differenced,
deseasonalized and detrended. The program will also plot the sample auto-
correlation and partial autocorrelation functions of both the data itself and
the residuals after model-fitting. The other time-series programs are SPEC,
which computes spectral estimates for univariate or bivariate series based on
the periodogram, and TRANS, which can be used either to compute and plot
the sample cross-correlation function of two series, or to perform least squares
estimation of the coefficients in a transfer function model relating the second
series to the first (see Section 12.2). Also included on the diskette is a screen
editing program (WORD6), which can be used to create arbitrary data files,
and a collection of data files, some of which are analyzed in the book.
Instructions for the use of these programs are contained in the file HELP on
the diskette.
For a one-semester course on time-domain analysis and modelling at the
M.S. level, we have used the following sections of the book:

1.1-1.6; 2.1-2.7; 3.1-3.5; 5.1-5.5; 7.1, 7.2; 8.1-8.9; 9.1-9.6
(with brief reference to Sections 4.2 and 4.4). The prerequisite for this course
is a knowledge of probability and statistics at the level of the book Introduction
to the Theory of Statistics by Mood, Graybill and Boes.
For a second semester, emphasizing frequency-domain analysis and multi-
variate series, we have used
4.1-4.4, 4.6-4.10; 10.1-10.7; 11.1-11.7; selections from Chap. 12.
At the M.S. level it has not been possible (or desirable) to go into the mathe-
matical derivation of all the results used, particularly those in the starred
sections, which require a stronger background in mathematical analysis and
measure theory. Such a background is assumed in all of the starred sections
and problems.
For Ph.D. students the book has been used as the basis for a more
theoretical one-semester course covering the starred sections from Chapters
4 through 11 and parts of Chapter 12. The prerequisite for this course is a
knowledge of measure-theoretic probability.
We are greatly indebted to E.J. Hannan, R.H. Jones, S.I. Resnick, S. Tavaré
and D. Tjøstheim, whose comments on drafts of Chapters 1-8 led to sub-
stantial improvements. The book arose out of courses taught in the statistics
department at Colorado State University and benefitted from the comments
of many students. The development of the computer programs would not have
been possible without the outstanding work of Joe Mandarino, the architect
of the computer program PEST, and Anthony Brockwell, who contributed
WORD6, graphics subroutines and general computing expertise. We are
indebted also to the National Science Foundation for support for the research
related to the book, and one of us (P.J.B.) to Kuwait University for providing
an excellent environment in which to work on the early chapters. For permis-
sion to use the optimization program UNC22MIN we thank R. Schnabel of
the University of Colorado computer science department. Finally we thank
Pam Brockwell, whose contributions to the manuscript went far beyond those
of typist, and the editors of Springer-Verlag, who showed great patience and
cooperation in the final production of the book.
Fort Collins, Colorado                                      P.J. Brockwell
October 1986                                                R.A. Davis

Contents
Preface
CHAPTER 1
Stationary Time Series
§1.1 Examples of Time Series
§1.2 Stochastic Processes
§1.3 Stationarity and Strict Stationarity
§1.4 The Estimation and Elimination of Trend and Seasonal Components
§1.5 The Autocovariance Function of a Stationary Process
§1.6 The Multivariate Normal Distribution
§1.7* Applications of Kolmogorov's Theorem
Problems
CHAPTER 2
Hilbert Spaces
§2.1 Inner-Product Spaces and Their Properties
§2.2 Hilbert Spaces
§2.3 The Projection Theorem
§2.4 Orthonormal Sets
§2.5 Projection in R^n
§2.6 Linear Regression and the General Linear Model
§2.7 Mean Square Convergence, Conditional Expectation and Best Linear
Prediction in L²(Ω, ℱ, P)
§2.8 Fourier Series
§2.9 Hilbert Space Isomorphisms
§2.10* The Completeness of L²(Ω, ℱ, P)
§2.11* Complementary Results for Fourier Series
Problems
CHAPTER 3
Stationary ARMA Processes
§3.1 Causal and Invertible ARMA Processes
§3.2 Moving Average Processes of Infinite Order
§3.3 Computing the Autocovariance Function of an ARMA(p,q) Process
§3.4 The Partial Autocorrelation Function
§3.5 The Autocovariance Generating Function
§3.6* Homogeneous Linear Difference Equations with Constant
Coefficients
Problems
CHAPTER 4
The Spectral Representation of a Stationary Process
§4.1 Complex-Valued Stationary Time Series
§4.2 The Spectral Distribution of a Linear Combination of Sinusoids
§4.3 Herglotz's Theorem
§4.4 Spectral Densities and ARMA Processes
§4.5* Circulants and Their Eigenvalues
§4.6* Orthogonal Increment Processes on [−π, π]
§4.7* Integration with Respect to an Orthogonal Increment Process
§4.8* The Spectral Representation
§4.9* Inversion Formulae
§4.10* Time-Invariant Linear Filters
§4.11* Properties of the Fourier Approximation h_n to I_(ν,ω]
Problems
CHAPTER 5
Prediction of Stationary Processes
§5.1 The Prediction Equations in the Time Domain
§5.2 Recursive Methods for Computing Best Linear Predictors
§5.3 Recursive Prediction of an ARMA(p,q) Process
§5.4 Prediction of a Stationary Gaussian Process; Prediction Bounds
§5.5 Prediction of a Causal Invertible ARMA Process in Terms of
X_j, −∞ < j ≤ n
Problems

CHAPTER 6
Asymptotic Theory
§6.1 Convergence in Probability
§6.2 Convergence in rth Mean, r > 0
§6.3 Convergence in Distribution
§6.4 Central Limit Theorems and Related Results
Problems
CHAPTER 7
Estimation of the Mean and the Autocovariance Function
§7.1 Estimation of μ
§7.2 Estimation of γ(·) and ρ(·)
§7.3* Derivation of the Asymptotic Distributions
Problems
CHAPTER 8
Estimation for ARMA Models
§8.1 The Yule-Walker Equations and Parameter Estimation for
Autoregressive Processes
§8.2 Preliminary Estimation for Autoregressive Processes Using the
Durbin-Levinson Algorithm
§8.3 Preliminary Estimation for Moving Average Processes Using the
Innovations Algorithm
§8.4 Preliminary Estimation for ARMA(p,q) Processes
§8.5 Remarks on Asymptotic Efficiency
§8.6 Recursive Calculation of the Likelihood of an Arbitrary Zero-Mean
Gaussian Process
§8.7 Maximum Likelihood and Least Squares Estimation for ARMA
Processes
§8.8 Asymptotic Properties of the Maximum Likelihood Estimators
§8.9 Confidence Intervals for the Parameters of a Causal Invertible
ARMA Process
§8.10* Asymptotic Behavior of the Yule-Walker Estimates
§8.11* Asymptotic Normality of Parameter Estimators
Problems
CHAPTER 9
Model-Building and Forecasting with ARIMA Processes
§9.1 ARIMA Models for Non-Stationary Time Series
§9.2 Identification Techniques
§9.3 The AIC Criterion
§9.4 Diagnostic Checking
§9.5 Forecasting ARIMA Models
§9.6 Seasonal ARIMA Models
Problems
CHAPTER 10
Inference for the Spectrum of a Stationary Process
§10.1 The Periodogram
§10.2 Testing for the Presence of Hidden Periodicities
§10.3 Asymptotic Properties of the Periodogram
§10.4 Smoothing the Periodogram
§10.5 Confidence Intervals for the Spectrum
§10.6 Autoregressive, Maximum Entropy, Moving Average and Maximum
Likelihood ARMA Spectral Estimators
§10.7 The Fast Fourier Transform (FFT) Algorithm
§10.8* Derivation of the Asymptotic Behavior of the Maximum Likelihood
and Least Squares Estimators of the Coefficients of an ARMA
Process
Problems
CHAPTER 11
Multivariate Time Series
§11.1 Second-Order Properties of Multivariate Time Series
§11.2 Estimation of the Mean and Covariance Function
§11.3 Multivariate ARMA Processes
§11.4 Best Linear Predictors of Second-Order Random Vectors
§11.5 Estimation for Multivariate ARMA Processes
§11.6 The Cross Spectrum
§11.7 Estimating the Cross Spectrum
§11.8* The Spectral Representation of a Multivariate Stationary Time
Series
Problems
CHAPTER 12
Further Topics
§12.1 Kalman Filtering and Prediction
§12.2 Transfer Function Modelling
§12.3 Parameter Estimation for ARMA Processes with Missing Values
§12.4 Long Memory Processes
§12.5 Linear Processes with Infinite Variance
§12.6 Threshold Models
§12.7 Estimation of Missing Observations of an ARMA Process
Problems

Appendix: Data Sets
Bibliography
Index

CHAPTER 1
Stationary Time Series
In this chapter we introduce some basic ideas of time series analysis and
stochastic processes. Of particular importance are the concepts of stationarity
and the autocovariance and sample autocovariance functions. Some standard
techniques are described for the estimation and removal of trend and season-
ality (of known period) from an observed series. These are illustrated with
reference to the data sets in Section 1.1. Most of the topics covered in this
chapter will be developed more fully in later sections of the book. The reader
who is not already familiar with random vectors and multivariate analysis
should first read Section 1.6 where a concise account of the required
background is given. Notice our convention that an n-dimensional random
vector is assumed (unless specified otherwise) to be a column vector X =
(X_1, X_2, ..., X_n)' of random variables. If S is an arbitrary set then we shall use
the notation S^n to denote both the set of n-component column vectors with
components in S and the set of n-component row vectors with components
in S.
§1.1 Examples of Time Series
A time series is a set of observations x_t, each one being recorded at a specified
time t. A discrete-time series (the type to which this book is primarily devoted)
is one in which the set T_0 of times at which observations are made is a discrete
set, as is the case for example when observations are made at fixed time
intervals. Continuous-time series are obtained when observations are recorded
continuously over some time interval, e.g. when T_0 = [0, 1]. We shall use the
notation x(t) rather than x_t if we wish to indicate specifically that observations
are recorded continuously.
EXAMPLE 1.1.1 (Current Through a Resistor). If a sinusoidal voltage v(t) =
a cos(νt + θ) is applied to a resistor of resistance r and the current recorded
continuously, we obtain a continuous time series

x(t) = r⁻¹ a cos(νt + θ).

If observations are made only at times 1, 2, ..., the resulting time series will
be discrete. Time series of this particularly simple type will play a fundamental
role in our later study of stationary time series.
Figure 1.1. 100 observations of the series x(t) = cos(0.2t + π/3).
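The series of Figure 1.1 is easy to reproduce; the following is a minimal Python sketch (an illustration only, using NumPy and Matplotlib rather than the programs supplied on the companion diskette):

    import numpy as np
    import matplotlib.pyplot as plt

    # 100 observations of x(t) = cos(0.2 t + pi/3) at times t = 1, ..., 100
    t = np.arange(1, 101)
    x = np.cos(0.2 * t + np.pi / 3)

    plt.plot(t, x)          # reproduces the shape of Figure 1.1
    plt.xlabel("t")
    plt.show()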
EXAMPLE 1.1.2 (Population x_t of the U.S.A., 1790-1980).

t       x             t       x
1790    3,929,214     1890    62,979,766
1800    5,308,483     1900    76,212,168
1810    7,239,881     1910    92,228,496
1820    9,638,453     1920    106,021,537
1830    12,860,702    1930    123,202,624
1840    17,063,353    1940    132,164,569
1850    23,191,876    1950    151,325,798
1860    31,443,321    1960    179,323,175
1870    38,558,371    1970    203,302,031
1880    50,189,209    1980    226,545,805
Figure 1.2. Population of the U.S.A. at ten-year intervals, 1790-1980 (U.S. Bureau of
the Census).
EXAMPLE 1.1.3 (Strikes in the U.S.A., 1951-1980).

t       x       t       x
1951    4737    1966    4405
1952    5117    1967    4595
1953    5091    1968    5045
1954    3468    1969    5700
1955    4320    1970    5716
1956    3825    1971    5138
1957    3673    1972    5010
1958    3694    1973    5353
1959    3708    1974    6074
1960    3333    1975    5031
1961    3367    1976    5648
1962    3614    1977    5506
1963    3362    1978    4230
1964    3655    1979    4827
1965    3963    1980    3885
Figure 1.3. Strikes in the U.S.A., 1951-1980 (Bureau of Labor Statistics, U.S. Labor
Department).
EXAMPLE 1.1.4 (All-Star Baseball Games, 1933-1980). Let

x_t = 1 if the National League won in year t,
x_t = −1 if the American League won in year t.

[Table of values x_t for t − 1900 = 33, 34, ..., 80; the individual entries are
illegible in this scan.]
— = no game.
* = two games scheduled.
Figure 1.4. Results x_t, Example 1.1.4, of All-Star baseball games, 1933-1980.
EXAMPLE 1.1.5 (Wölfer Sunspot Numbers, 1770-1869).

1770    101    1790    90    1810     0    1830     71    1850    *
1771     82    1791    67    1811     1    1831     48    1851    *
1772     66    1792    60    1812     5    1832     28    1852    *
1773     35    1793    47    1813    12    1833      8    1853    *
1774     31    1794    41    1814    14    1834     13    1854    *
1775      7    1795    21    1815    35    1835     57    1855    *
1776     20    1796    16    1816    46    1836    122    1856    *
1777     92    1797     6    1817    41    1837    138    1857    *
1778    154    1798     4    1818    30    1838    103    1858    *
1779    125    1799     7    1819    24    1839     86    1859    *
1780     85    1800    14    1820    16    1840     63    1860    *
1781     68    1801    34    1821     7    1841     37    1861    *
1782     38    1802    45    1822     4    1842     24    1862    *
1783     23    1803    43    1823     2    1843     11    1863    *
1784     10    1804    48    1824     8    1844     15    1864    *
1785     24    1805    42    1825    17    1845     40    1865    *
1786     83    1806    28    1826    36    1846     62    1866    *
1787    132    1807    10    1827    50    1847     98    1867    *
1788    131    1808     8    1828    62    1848    124    1868    *
1789    118    1809     2    1829    67    1849     96    1869    *

(* The values for 1850-1869 are illegible in this scan.)
Figure 1.5. The Wölfer sunspot numbers, 1770-1869.
EXAMPLE 1.1.6 (Monthly Accidental Deaths in the U.S.A., 1973-1978).

        1973     1974     1975     1976     1977     1978
Jan.    9007     7750     8162     7717     7792     7836
Feb.    8106     6981     7306     7461     6957     6892
Mar.    8928     8038     8124     7776     7726     7791
Apr.    9137     8422     7870     7928     8106     8129
May    10017     8714     9387     8634     8890     9115
Jun.   10826     9512     9556     8945     9299     9434
Jul.   11317    10120    10093    10078    10625    10484
Aug.   10744     9823     9620     9179     9302     9827
Sep.    9713     8743     8285     8037     8314     9110
Oct.    9938     9129     8433     8488     8850     9070
Nov.    9161     8710     8160     7874     8265     8633
Dec.    8927     8680     8034     8647     8796     9240
Figure 1.6. Monthly accidental deaths in the U.S.A., 1973-1978 (National Safety
Council).
These examples are of course but a few of the multitude of time series to
be found in the fields of engineering, science, sociology and economics. Our
purpose in this book is to study the techniques which have been developed
for drawing inferences from such series. Before we can do this however, it is
necessary to set up a hypothetical mathematical model to represent the data.
Having chosen a model (or family of models) it then becomes possible to
estimate parameters, check for goodness of fit to the data and possibly to use
the fitted model to enhance our understanding of the mechanism generating
the series. Once a satisfactory model has been developed, it may be used
in a variety of ways depending on the particular field of application. The
applications include separation (filtering) of noise from signals, prediction of
future values of a series and the control of future values.
The six examples given show some rather striking differences which are
apparent if one examines the graphs in Figures 1.1-1.6. The first gives rise to
a smooth sinusoidal graph oscillating about a constant level, the second to a
roughly exponentially increasing graph, the third to a graph which fluctuates
erratically about a nearly constant or slowly rising level, and the fourth to an
erratic series of minus ones and ones. The fifth graph appears to have a strong
cyclic component with period about 11 years and the last has a pronounced
seasonal component with period 12.
In the next section we shall discuss the general problem of constructing
mathematical models for such data.
§1.2 Stochastic Processes
The first step in the analysis of a time series is the selection of a suitable
mathematical model (or class of models) for the data. To allow for the possibly
unpredictable nature of future observations it is natural to suppose that each
observation x, is a realized value of a certain random variable X,. The time
series {x,,f€ Tp} is then a realization of the family of random variables
{X,,1€ To}. These considerations suggest modelling the data as a realization
(or part ofa realization) of a stochastic process {X,,t¢ T} where T > Ty. To
clarify these ideas we need to define precisely what is meant by a stochastic
process and its realizations. In later sections we shall restrict attention to
special classes of processes which are particularly useful for modelling many
of the time series which are encountered in practice.
Definition 1.2.1 (Stochastic Process). A stochastic process is a family of random
variables {X_t, t ∈ T} defined on a probability space (Ω, ℱ, P).
Remark 1. In time series analysis the index (or parameter) set T is a set of time
points, very often {0, ±1, ±2, ...}, {1, 2, 3, ...}, [0, ∞) or (−∞, ∞). Stochastic
processes in which T is not a subset of ℝ are also of importance. For example
in geophysics, stochastic processes with T the surface of a sphere are used to
represent variables indexed by their location on the earth's surface. In this
book however the index set T will always be a subset of ℝ.
Recalling the definition of a random variable, we note that for each fixed
t ∈ T, X_t is in fact a function X_t(·) on the set Ω. On the other hand, for each
fixed ω ∈ Ω, X_.(ω) is a function on T.

Definition 1.2.2 (Realizations of a Stochastic Process). The functions
{X_.(ω), ω ∈ Ω} on T are known as the realizations or sample-paths of the
process {X_t, t ∈ T}.
Remark 2. We shall frequently use the term time series to mean both the data
and the process of which it is a realization.
The following examples illustrate the realizations of some specific stochastic
processes. The first two could be considered as possible models for the time
series of Examples 1.1.1 and 1.1.4 respectively.
EXAMPLE 1.2.1 (Sinusoid with Random Phase and Amplitude). Let A and Θ
be independent random variables with A > 0 and Θ distributed uniformly on
[0, 2π). A stochastic process {X_t, t ∈ ℝ} can then be defined in terms of A and
Θ for any given ν > 0 and r > 0 by

X_t = r⁻¹ A cos(νt + Θ),    (1.2.1)

or more explicitly,

X_t(ω) = r⁻¹ A(ω) cos(νt + Θ(ω)),    (1.2.2)

where ω is an element of the probability space Ω on which A and Θ are defined.
The realizations of the process defined by (1.2.2) are the functions of t
obtained by fixing ω, i.e. functions of the form

x(t) = r⁻¹ a cos(νt + θ).

The time series plotted in Figure 1.1 is one such realization.
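A realization can be simulated by drawing A and Θ once and then evaluating a deterministic function of t. In the Python sketch below the distribution of A is taken to be exponential purely for illustration; the example requires only that A > 0 and that Θ be uniform on [0, 2π):

    import numpy as np

    rng = np.random.default_rng(0)

    def realization(t, r=1.0, nu=0.2):
        # Fixing omega amounts to drawing A and Theta once; the sample
        # path is then a deterministic function of t, as in (1.2.2).
        A = rng.exponential(1.0)             # any positive A will do
        Theta = rng.uniform(0.0, 2 * np.pi)  # uniform phase on [0, 2*pi)
        return (A / r) * np.cos(nu * t + Theta)

    t = np.arange(1, 101)
    x = realization(t)  # one sample path of the process (1.2.1)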
EXAMPLE 1.2.2 (A Binary Process). Let {X_t, t = 1, 2, ...} be a sequence of
independent random variables for each of which

P(X_t = 1) = P(X_t = −1) = 1/2.    (1.2.3)

In this case it is not so obvious as in Example 1.2.1 that there exists a
probability space (Ω, ℱ, P) with random variables X_1, X_2, ... defined on Ω
having the required joint distributions, i.e. such that

P(X_1 = i_1, X_2 = i_2, ..., X_n = i_n) = 2⁻ⁿ,    (1.2.4)

for every n-tuple (i_1, ..., i_n) of 1's and −1's. The existence of such a process is
however guaranteed by Kolmogorov's theorem, which is stated below and
discussed further in Section 1.7.
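Simulating such a process is straightforward; a minimal Python sketch:

    import numpy as np

    rng = np.random.default_rng(1)

    # n independent variables with P(X_t = 1) = P(X_t = -1) = 1/2, as in (1.2.3).
    # By (1.2.4), each particular n-tuple of 1's and -1's has probability 2**(-n).
    n = 48
    x = rng.choice([-1, 1], size=n)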