SP5
23 May 2022
Today: Ch. 13 Stochastic Processes
Stochastic: random
Process: a sequence of random variables in which the ordering matters.
5. stochastic processes 4 / 46
Examples:
• speech
• arrival times of data packets
• binary bit pattern
• handwritten digits
• linear filtering of a stochastic process
Description of a random process
As for random variables, we can use the joint PDF/PMF to describe stochastic processes:

    [X(t1), ..., X(tk), ...]^T ~ f_{X(t1), ..., X(tk), ...}(x_{t1}, ..., x_{tk}, ...)
Example: rectified sinusoid with random amplitude
Approach: treat t as a constant, find the CDF F_X(t)(x), then differentiate to obtain the PDF f_X(t)(x).
Example: rectified sinusoid with random amplitude
If cos(ωt) ≠ 0:

    F_X(t)(x) = 0                             x < 0
    F_X(t)(x) = 1 − e^(−x/(10|cos(ωt)|))      x ≥ 0
Example: rectified sinusoid with random amplitude (cont’d)
If cos(ωt) ≠ 0:

    f_X(t)(x) = dF_X(t)(x)/dx = 0                                          x < 0
    f_X(t)(x) = dF_X(t)(x)/dx = (1/(10|cos(ωt)|)) e^(−x/(10|cos(ωt)|))     x ≥ 0
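The CDF above is what one obtains for X(t) = A|cos(ωt)| with an exponential amplitude A of mean 10; this concrete model is an assumption read off from the derived CDF, not stated explicitly on the slide. A quick Monte Carlo sketch comparing the empirical and analytic CDF at one (t, x) pair:

```python
import numpy as np

# Assumed model: X(t) = A|cos(w t)| with A ~ Exponential(mean 10),
# which reproduces F_X(t)(x) = 1 - exp(-x / (10|cos(w t)|)) for x >= 0.
rng = np.random.default_rng(0)
omega, t = 2.0, 0.3                      # arbitrary frequency and time instant
A = rng.exponential(scale=10.0, size=200_000)
X = A * np.abs(np.cos(omega * t))        # samples of X(t)

x = 5.0
empirical = np.mean(X <= x)              # empirical CDF at x
analytic = 1.0 - np.exp(-x / (10.0 * np.abs(np.cos(omega * t))))
```

With 200 000 samples the two values agree to roughly two decimal places.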
Problem 13.2.1 (similar: 13.2.4)
Let W be an exponential random variable with PDF
    f_W(w) = e^(−w)    w ≥ 0
    f_W(w) = 0         otherwise
Find the CDF FX (t) (x) of the time-delayed ramp process X (t) = t − W .
Problem 13.2.1 (cont’d)
Approach:
• Treat t as a constant
• Calculate the CDF F_X(t)(x)
• Then calculate the PDF f_X(t)(x)

For x < t we have F_X(t)(x) = P[t − W ≤ x] = P[W ≥ t − x]. Combining the facts, we have

    F_X(t)(x) = e^(−(t−x))    x < t
    F_X(t)(x) = 1             x ≥ t
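The CDF of the ramp process can be sanity-checked by simulation, drawing W from the given Exp(1) density:

```python
import numpy as np

# X(t) = t - W with W ~ Exp(1); for x < t the CDF is F_X(t)(x) = e^{-(t-x)}.
rng = np.random.default_rng(1)
W = rng.exponential(scale=1.0, size=200_000)
t, x = 3.0, 1.5                          # any pair with x < t
X = t - W
empirical = np.mean(X <= x)              # P[t - W <= x] = P[W >= t - x]
analytic = np.exp(-(t - x))
```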
Description of a random process
Notice that

    [X(t1), ..., X(tk), ...]^T ~ f_{X(t1), ..., X(tk), ...}(x_{t1}, ..., x_{tk}, ...)
Independent identically distributed (iid) random sequences
Let Xn denote a (discrete-time) iid random sequence with sample vector X = [X_{n1}, ..., X_{nk}]^T. For an iid random sequence, the joint PMF/PDF factors into a product of identical marginals.
Example: Bernoulli process (cont’d)
    P_Xk(xk) = p^xk (1 − p)^(1−xk)    xk = 0, 1
    P_Xk(xk) = 0                      otherwise
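A sample path of the Bernoulli process is easy to generate, and by the iid property the sample mean of one long path estimates p:

```python
import numpy as np

# One sample path of an iid Bernoulli(p) process: each X_k is 1 w.p. p, else 0.
rng = np.random.default_rng(2)
p, n = 0.3, 100_000
X = (rng.random(n) < p).astype(int)

# The joint PMF factors, so the sample mean of the path estimates p.
estimate = X.mean()
```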
Gaussian Process
Gaussian processes occur quite often in nature (remember the central
limit theorem!)
The process X (t) is a Gaussian stochastic process (sequence) if and
only if X = [X (t1 ) · · · X (tk )]T is a Gaussian random vector for any
integer k > 0 and any set of time instances t1 , t2 , · · · , tk .
Gaussian Process
Remember that for Gaussian random vectors X = [X1, X2, ..., XN]^T we have:

    f_X(x) = exp(−(1/2)(x − E[X])^T C_X^(−1) (x − E[X])) / ((2π)^(N/2) det(C_X)^(1/2))
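The density formula translates directly into code. A minimal sketch (the N = 2 mean and covariance below are illustrative assumptions):

```python
import numpy as np

def gaussian_pdf(x, mu, C):
    """Gaussian random-vector PDF evaluated at x."""
    N = len(mu)
    d = x - mu
    quad = d @ np.linalg.solve(C, d)             # (x - E[X])^T C^{-1} (x - E[X])
    norm = (2 * np.pi) ** (N / 2) * np.sqrt(np.linalg.det(C))
    return np.exp(-0.5 * quad) / norm

mu = np.zeros(2)
C = np.array([[2.0, 1.0], [1.0, 1.0]])           # example covariance, det = 1
val = gaussian_pdf(np.array([0.0, 0.0]), mu, C)  # peak value 1/(2*pi*sqrt(det C))
```

Using `np.linalg.solve` instead of forming C_X^(−1) explicitly is the standard numerically stable choice.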
Expected value of a stochastic process
Autocovariance of a stochastic process
cov[X(t1), X(t2)] indicates how much the process is likely to change from t1 to t2:
• Large covariance: the sample function is unlikely to change between t1 and t2.
• Covariance near zero: the sample function can change rapidly between t1 and t2.
Autocovariance and autocorrelation
The covariance of a stochastic process at two different time instances is called the "autocovariance":

    C_X(t, τ) = cov[X(t), X(t + τ)]
The Autocorrelation function
    R_X[n, k] = E[Xn Xn+k] = ∫∫ x y f_{Xn, Xn+k}(x, y) dx dy         (continuous-valued)

    R_X[n, k] = E[Xn Xn+k] = Σ_x Σ_y x y P[Xn = x, Xn+k = y]         (discrete-valued)
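The expectations above can be estimated from one long sample path by averaging products X_n X_{n+k} over n. A sketch for an iid standard-normal sequence (an illustrative choice: then R_X[0] = var(X) = 1 and R_X[k] ≈ 0 for k ≠ 0):

```python
import numpy as np

# Empirical autocorrelation R_X[k] = E[X_n X_{n+k}] of an iid zero-mean sequence.
rng = np.random.default_rng(3)
X = rng.normal(0.0, 1.0, size=200_000)

def autocorr(X, k):
    """Average the lag-k products over n."""
    if k == 0:
        return np.mean(X * X)
    return np.mean(X[:-k] * X[k:])

R0, R5 = autocorr(X, 0), autocorr(X, 5)   # expect roughly 1 and roughly 0
```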
Example: sinusoidal process
    R_X(t, τ) = (1/6) cos(ωτ)
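One concrete process with this autocorrelation is X(t) = A cos(ωt + Θ) with Θ ~ Uniform[0, 2π) independent of A and E[A²] = 1/3 (e.g. A ~ Uniform[0, 1]); this model is an assumption made for the sketch, since the slide shows only the result. A Monte Carlo check:

```python
import numpy as np

# Assumed model: X(t) = A cos(w t + Theta), A ~ U[0,1], Theta ~ U[0, 2pi),
# independent.  Then R_X(t, tau) = (E[A^2]/2) cos(w tau) = (1/6) cos(w tau).
rng = np.random.default_rng(4)
n = 400_000
omega, t, tau = 2.0, 0.7, 0.4
A = rng.random(n)
Theta = rng.uniform(0.0, 2 * np.pi, size=n)
Xt = A * np.cos(omega * t + Theta)
Xtau = A * np.cos(omega * (t + tau) + Theta)
R_emp = np.mean(Xt * Xtau)               # estimate of E[X(t) X(t+tau)]
R_th = np.cos(omega * tau) / 6.0
```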
Uncorrelated and orthogonal processes
Two processes X(t) and Y(t) are called uncorrelated if C_XY(t, τ) = 0 for all t and τ, and orthogonal if R_XY(t, τ) = 0 for all t and τ.
Problem 13.7.2
For the time-delayed ramp process X (t) = t − W from Problem 13.2.1,
find for any t ≥ 0:
(a) The expected value µX (t),
(b) The autocovariance function CX (t, τ ).
Hint: E[W ] = 1 and E[W 2 ] = 2.
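From the hints, µ_X(t) = t − E[W] = t − 1 and C_X(t, τ) = var(W) = E[W²] − E[W]² = 1 (the random part W is the same at every t, so the covariance does not depend on t or τ). A simulation sketch confirming both values:

```python
import numpy as np

# X(t) = t - W with W ~ Exp(1): mu_X(t) = t - 1 and C_X(t, tau) = var(W) = 1.
rng = np.random.default_rng(5)
W = rng.exponential(1.0, size=400_000)
t, tau = 2.0, 0.5
Xt, Xtau = t - W, (t + tau) - W
mu_emp = Xt.mean()                                    # should be near t - 1
C_emp = np.mean((Xt - Xt.mean()) * (Xtau - Xtau.mean()))  # should be near 1
```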
Problem 13.8.2
X = [X1 X2 ]T has expected value E[X ] = 0 and covariance matrix
    C_X = [ 2  1 ]
          [ 1  1 ]
Does there exist a stationary process X (t) and time instances t1 and t2
such that X is actually a pair of observations [X (t1 ) X (t2 )]T of the
process X (t)?
Stationarity of X(t) would require var[X(t1)] = var[X(t2)], but C_X gives var[X1] = 2 and var[X2] = 1, a contradiction. Hence no such stationary process exists.
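The argument boils down to two checks on C_X: it is a legitimate covariance matrix (positive semidefinite), yet its diagonal entries differ, so the pair cannot come from a stationary process. As a small code check:

```python
import numpy as np

# C_X is a valid covariance matrix, but a stationary process needs
# var[X(t1)] = var[X(t2)], i.e. equal diagonal entries.
C = np.array([[2.0, 1.0], [1.0, 1.0]])
eigvals = np.linalg.eigvalsh(C)
is_valid_cov = bool(np.all(eigvals >= 0))            # positive semidefinite
can_be_stationary = bool(np.isclose(C[0, 0], C[1, 1]))
```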
Stationary process
For a stationary process, the joint distributions are invariant to time shifts; in particular

    R_X(t, τ) = R_X(τ)
    C_X(t, τ) = C_X(τ) = R_X(τ) − E[X]^2
Stationary process
Examples of stationary processes
Problem 13.8.4
Let X (t) be a stationary continuous-time random process. By sampling
X (t) every ∆ seconds, we obtain the discrete-time random sequence
Yn = X (n∆). Is Yn a stationary sequence?
Yes: stationarity of X(t) means that its joint distributions are invariant to time shifts, so the samples Yn = X(n∆) inherit this property and Yn is a stationary sequence.

If only the expected value and the autocorrelation function satisfy the properties of stationarity, we call the process wide-sense stationary (WSS).
Wide-Sense Stationary (WSS) Processes
A process is wide-sense stationary if and only if
• the expected value E[X(t)] does not depend on time: E[X(t)] = c, and
• the autocorrelation function depends only on the time difference τ (or k), not on the absolute time t (or n):

    R_X(t, τ) = R_X(τ)    or    R_X[n, k] = R_X(k)
Problem 13.9.3 (important for later!)
Problem 13.9.6
X (t) and Y (t) are independent wide sense stationary processes.
Determine if W (t) = X (t)Y (t) is wide-sense stationary.
True: independence of X(t) and Y(t) implies

    E[W(t)] = E[X(t)] E[Y(t)] = µ_X µ_Y

and

    R_W(t, τ) = E[X(t)X(t + τ)] E[Y(t)Y(t + τ)] = R_X(τ) R_Y(τ),

so W(t) is wide-sense stationary.

Properties of the autocorrelation function of a WSS process:

    R_X(0) ≥ 0
    R_X(τ) = R_X(−τ)
    R_X(0) ≥ |R_X(τ)|

Example: the sinusoidal process with R_X(t, τ) = (1/6) cos(ωτ) satisfies all three.
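The factorization R_W(τ) = R_X(τ) R_Y(τ) can be checked by simulation. The sketch below uses two independent random-phase sinusoids as the WSS inputs (an illustrative modelling assumption; each has R(τ) = (1/2)cos(ωτ)), and evaluates R_W at two different absolute times to confirm it does not depend on t:

```python
import numpy as np

# Assumed inputs: X(t) = cos(w1 t + Th1), Y(t) = cos(w2 t + Th2) with
# independent uniform phases.  Then R_W(tau) = (1/2)cos(w1 tau) * (1/2)cos(w2 tau).
rng = np.random.default_rng(6)
n = 400_000
w1, w2, tau = 2.0, 3.0, 0.4
Th1 = rng.uniform(0, 2 * np.pi, n)
Th2 = rng.uniform(0, 2 * np.pi, n)

def R_W(t):
    Wt = np.cos(w1 * t + Th1) * np.cos(w2 * t + Th2)
    Wtt = np.cos(w1 * (t + tau) + Th1) * np.cos(w2 * (t + tau) + Th2)
    return np.mean(Wt * Wtt)

R_th = 0.5 * np.cos(w1 * tau) * 0.5 * np.cos(w2 * tau)   # R_X(tau) R_Y(tau)
r1, r2 = R_W(0.0), R_W(5.0)                               # same for any t
```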
Cross correlation for Stochastic Processes
In addition to the autocorrelation function, we can also define the cross-correlation between two stochastic processes:

    R_XY(t, τ) = E[X(t)Y(t + τ)]
    R_XY[n, k] = E[Xn Yn+k]

Two random processes X(t) and Y(t) are jointly wide-sense stationary if X(t) and Y(t) are wide-sense stationary and R_XY(t, τ) = R_XY(τ).
Example
Xn is a zero mean WSS stochastic process. Let Yn = (−1)n Xn .
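For zero-mean WSS Xn, the sign flip gives R_Y[n, k] = (−1)^(2n+k) R_X[k] = (−1)^k R_X[k], which does not depend on n, so Yn is again WSS with the odd-lag correlations negated. An empirical sketch using a moving-average Xn with R_X[1] = 1/2 (the specific model is an assumption chosen so the sign flip is visible):

```python
import numpy as np

# X_n = (Z_n + Z_{n-1}) / sqrt(2), Z iid N(0,1): zero-mean WSS, R_X[1] = 1/2.
# Y_n = (-1)^n X_n should then have R_Y[1] = -1/2.
rng = np.random.default_rng(7)
Z = rng.normal(0.0, 1.0, 200_001)
X = (Z[1:] + Z[:-1]) / np.sqrt(2)
Y = ((-1.0) ** np.arange(X.size)) * X
RX1 = np.mean(X[:-1] * X[1:])            # near +1/2
RY1 = np.mean(Y[:-1] * Y[1:])            # near -1/2
```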
Problem 13.10.2
X (t) is a wide sense stationary random process. Let
(a) Y (t) = X (t + a)
Are Y (t) and X (t) jointly wide sense stationary?
Since E[Y(t)] = E[X(t + a)] = µ_X and

    R_Y(t, τ) = E[Y(t)Y(t + τ)] = E[X(t + a)X(t + τ + a)] = R_X(τ),

we have verified that Y(t) is wide-sense stationary. Next we calculate the cross-correlation:

    R_XY(t, τ) = E[X(t)Y(t + τ)] = E[X(t)X(t + τ + a)] = R_X(τ + a).

Since R_XY(t, τ) depends only on the time difference τ and not on the absolute time t, we conclude that X(t) and Y(t) are jointly wide-sense stationary.
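The conclusion for part (a) can be checked numerically. The sketch uses a random-phase sinusoid as an illustrative WSS process (an assumption; for it R_X(τ) = (1/2)cos(ωτ)) and verifies that E[X(t)Y(t + τ)] equals R_X(τ + a) at two different absolute times:

```python
import numpy as np

# Assumed WSS process X(t) = cos(w t + Th), Th ~ U[0, 2pi); Y(t) = X(t + a).
# Expect R_XY(t, tau) = R_X(tau + a) = (1/2) cos(w (tau + a)) for every t.
rng = np.random.default_rng(8)
n = 400_000
w, a, tau = 2.0, 0.3, 0.5
Th = rng.uniform(0, 2 * np.pi, n)

def RXY(t):
    # X(t) Y(t + tau) = X(t) X(t + tau + a)
    return np.mean(np.cos(w * t + Th) * np.cos(w * (t + tau + a) + Th))

R_th = 0.5 * np.cos(w * (tau + a))
r1, r2 = RXY(0.0), RXY(7.0)              # should agree, independent of t
```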
Problem 13.10.2 (cont’d)
Now repeat for
(b) Y (t) = X (at)
Y(t) is again wide-sense stationary, since E[Y(t)] = µ_X and R_Y(t, τ) = E[X(at)X(at + aτ)] = R_X(aτ). However, the cross-correlation

    R_XY(t, τ) = E[X(t)X(at + aτ)] = R_X((a − 1)t + aτ)

depends on the absolute time t unless a = 1, so X(t) and Y(t) are not jointly wide-sense stationary in general.