TSA
● A trend with a seasonal pattern means that the data exhibits both a
long-term increase or decrease (trend) and short-term, repeating
fluctuations at fixed intervals (seasonality). This means:
1. The trend shows a general upward or downward movement over time.
2. The seasonality introduces periodic variations that repeat at regular
intervals (e.g., daily, monthly, yearly).
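As a quick illustration, a minimal R sketch (the series length, frequencies, and magnitudes below are made up for demonstration):
set.seed(42)
n <- 120                                   # 10 years of monthly data
trend <- 0.5 * (1:n)                       # long-term upward movement
seasonal <- 10 * sin(2 * pi * (1:n) / 12)  # pattern repeating every 12 months
noise <- rnorm(n, sd = 2)
y <- ts(trend + seasonal + noise, frequency = 12)
plot(y)                                    # upward trend with regular seasonal swings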
● Expert Judgment: forecasts based on the knowledge and intuition of domain experts rather than on a statistical model.
● Delphi Method: a structured, iterative survey of a panel of experts; answers are collected anonymously and fed back over several rounds until the group converges on a forecast.
CHAPTER 2:
● Autocorrelation (ACF) and partial autocorrelation (PACF) are statistical measures that help analyze the relationship between a time series and its lagged values.
Covariance
Definition:
Covariance measures how two random variables vary together.
Scope:
It applies to any two variables (they can be different).
Units:
Its value is not standardized and depends on the units of the variables.
Interpretation:
A positive covariance indicates that the variables tend to move in the
same direction, while a negative covariance indicates they move in
opposite directions.
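A minimal R sketch (the simulated variables x and y are illustrative only):
set.seed(1)
x <- rnorm(100)
y <- 2 * x + rnorm(100)   # y tends to move together with x
cov(x, y)                 # positive: the variables move in the same direction
cov(x, -y)                # negative: they move in opposite directions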
Autocorrelation
Definition:
Autocorrelation is a specific application of covariance to time series data.
It measures the correlation of a time series with a lagged version of itself.
Scope:
It focuses on the relationship within the same variable over time,
capturing patterns like seasonality or persistence.
Interpretation:
It tells you how much past values of a series are linearly related to its
current values.
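For reference, the standard formula (with k denoting the lag): the autocorrelation of Yt at lag k is
ρk = Cov(Yt, Yt−k) / Var(Yt),
i.e. the autocovariance at lag k (the numerator) divided by the variance of the series (the denominator).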
● ACF tells you the overall correlation between past and current values,
including both direct and indirect effects.
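A minimal R sketch (the AR(1) series below is simulated just to have something to plot):
set.seed(1)
y <- arima.sim(model = list(ar = 0.7), n = 200)  # series with persistence
acf(y)   # overall correlation with lagged values (direct + indirect effects)
pacf(y)  # correlation at each lag after removing the effect of shorter lags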
A stochastic process:
● is composed of random variables, typically denoted as X(t) where t
represents time. Each X(t) captures the state of the process at time t.
● Note: a constant mean and variance over time is a property of a stationary process (see Stationarity below), not of every stochastic process.
● Example realization: Xt = {−0.5, 1.2, −0.3, 0.8, −1.1, 0.4, −0.7, 1.5, −0.2, 0.6, …}
When k = 0 (i.e., comparing a value with itself), the numerator is simply the variance of the series, so the autocorrelation at lag 0 is always 1.
Random walk:
Y1 = e1
Y2 = e1 + e2
...
Yn = e1 + e2 + ... + en
Or, more generally:
Yt = Y(t−1) + et, for t > 1, so Yt is the sum of all past shocks.
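Such a process can be simulated in a few lines of R (a minimal sketch with made-up parameters):
set.seed(1)
e <- rnorm(200)    # random shocks e1, e2, ...
y <- cumsum(e)     # Yt = e1 + e2 + ... + et
plot.ts(y)         # wanders without settling around a fixed mean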
● When k is small (e.g., 1, 2, or 3), Yt and Yt−k are close together in time; when k is large, they are far apart in time.
This means that observations that are close together in time tend to
be more similar (high correlation), while observations that are farther
apart in time tend to be less related (low correlation).
Autoregressive process:
● the present value depends on the previous value, but with a reduced impact, plus an error term: Yt = ρ·Y(t−1) + et, with |ρ| < 1
● ρ (rho) determines how much of the previous value influences Yt
● if ρ is closer to 1: Yt is strongly dependent on Yt−1, leading to a persistent trend.
● if ρ is closer to 0: the process behaves more like white noise.
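A minimal R sketch of the two extremes (the coefficient values are chosen only for illustration):
set.seed(1)
persistent <- arima.sim(model = list(ar = 0.9), n = 300)  # rho close to 1
noisy <- arima.sim(model = list(ar = 0.1), n = 300)       # rho close to 0
par(mfrow = c(2, 1))
plot.ts(persistent)  # strongly depends on the previous value: long, persistent swings
plot.ts(noisy)       # behaves almost like white noise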
Stationarity:
● Statistical properties of a time series do not change over time
● A stationary series does not have a trend or seasonality; its statistical behavior remains the same throughout (mean and variance are constant over time) and it fluctuates around a constant mean.
● The autocovariance depends only on the time lag (time difference), not on the specific time t.
A time series with no trend and no seasonality is a strong candidate for being stationary. However, to be (weakly) stationary, it must satisfy the following conditions:
1. Constant Mean – The average value of the series does not change
over time.
2. Constant Variance – The fluctuations (spread of values) remain
consistent over time.
3. Constant Autocovariance – The relationship between
observations at different time points (lag structure) remains the
same over time.
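A minimal R sketch contrasting a stationary and a non-stationary series (simulated data):
set.seed(1)
wn <- rnorm(300)          # white noise: constant mean and variance (stationary)
rw <- cumsum(rnorm(300))  # random walk: variance grows with time (not stationary)
acf(wn)  # drops to roughly zero after lag 0
acf(rw)  # decays very slowly, a typical sign of non-stationarity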
CHAPTER 3:
● Exponential smoothing reduces the weight of older observations by
applying an exponentially decreasing weight factor. This is done using a
smoothing parameter, α (0 < α ≤ 1), which determines how
much weight is given to the most recent observation versus past
observations.
● As k increases, (1−α)^k gets smaller, meaning older
observations contribute less to the forecast.
● α controls how much weight is given to the most recent observation in simple exponential smoothing (SES).
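A minimal R sketch of the exponentially decaying weights, plus a fit using base R's HoltWinters() with the trend and seasonal components switched off (which gives simple exponential smoothing); α = 0.3 and the simulated series are illustrative only:
alpha <- 0.3
k <- 0:5
round(alpha * (1 - alpha)^k, 3)  # weight on the observation k periods back: decays exponentially
set.seed(1)
y <- ts(rnorm(50, mean = 10))
fit <- HoltWinters(y, alpha = 0.3, beta = FALSE, gamma = FALSE)  # SES fit
predict(fit, n.ahead = 5)        # flat forecasts at the last smoothed level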
Mean: 0.
○ Causal models: forecast the variable of interest using explanatory (predictor) variables, e.g. regression-based models.
● A random walk is a time series where each value is derived from the previous value plus random noise.
● If a time series has a trend but no seasonality, the best ETS model is:
ETS(A, A, N)
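A minimal sketch using the forecast package's ets() function (assumes the package is installed; the simulated trend series is illustrative only):
library(forecast)
set.seed(1)
y <- ts(10 + 0.5 * (1:60) + rnorm(60))  # trend, no seasonality
fit <- ets(y, model = "AAN")            # additive error, additive trend, no seasonal component
forecast(fit, h = 12)                   # point forecasts follow the estimated trend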
● The function acf() in R is used for: Calculating autocorrelation
● The main advantage of using deep learning models for time series forecasting is their ability to capture complex, non-linear patterns in the data.
● ARIMA
● Exponential Smoothing