
EE2S31 Signal Processing – Stochastic Processes

Lecture 5: Stochastic processes – Ch. 13

Alle-Jan van der Veen

23 May 2022
Today: Ch. 13 Stochastic Processes
Stochastic: random
Process: sequence of variables where the ordering is of importance.

Random variable: mapping from an outcome s in the sample space to a real number x(s).

Stochastic process: mapping from an outcome s in the sample space to a function x(t, s), which depends on an ordering variable such as time or space: a random signal.
5. stochastic processes 2 / 46
Stochastic Processes
The stochastic process is denoted by X(t).

Sample function x(t, s1) is one particular realization (outcome s1) of this process.

The ensemble of a stochastic process is the set of all possible time functions that can result from an experiment.
5. stochastic processes 3 / 46
Problem 13.1.3
Consider the transmission of 3 bits in a BPSK system (binary phase
shift keying):
S ∈ {000, 001, 010, 011, 100, 101, 110, 111}

The ensemble consists of 8 possible sample functions:

5. stochastic processes 4 / 46
Example: speech

Experiment ‘s’: pronounce ‘ssss’

Sample function: one realization of the waveform x(t, s)

5. stochastic processes 5 / 46
Example: arrival times of data packets

5. stochastic processes 6 / 46
Example: binary bit pattern

A stochastic description can be used for image compression.

5. stochastic processes 7 / 46
Example: handwritten digits

A stochastic description (features!) is used for pattern recognition.

5. stochastic processes 8 / 46
Example: linear filtering of a stochastic process

How can we describe Y (t) when X (t) is a stochastic process?

Statistical descriptions of X(t): mean µX; autocorrelation function RX(τ).
Statistical descriptions of Y(t): mean µY = µX ∫ h(t) dt; autocorrelation RY(τ) = h(τ) ∗ h(−τ) ∗ RX(τ).

This will be the topic of the Supplement (next weeks)
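As a preview, a minimal numerical sketch (not from the slides; the filter taps and parameter values below are arbitrary choices) of the mean relation for a discrete-time filter driven by an iid input:

```python
import numpy as np

# Sketch: check mu_Y = mu_X * sum(h) for a discrete-time LTI filter
# driven by an iid input. Filter taps and parameters are arbitrary examples.
rng = np.random.default_rng(0)

h = np.array([0.5, 0.3, 0.2])                       # example FIR impulse response
mu_X = 2.0
x = rng.normal(loc=mu_X, scale=1.0, size=100_000)   # iid input with mean mu_X

y = np.convolve(x, h, mode="valid")                 # Y[n] = sum_k h[k] X[n-k]

print(y.mean())                                     # empirical mean of Y, ~2.0
print(mu_X * h.sum())                               # predicted mu_Y = mu_X * sum(h)
```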

5. stochastic processes 9 / 46
Description of a random process

How to describe a stochastic process at one time instance, e.g., X(t1)?

How to describe a stochastic process at multiple time instances, e.g., [X(t1), X(t2)]ᵀ?

5. stochastic processes 10 / 46
Description of a random process
As for random variables, we can use the PDF/PMF to describe stochastic processes:

At any (fixed) time tk the stochastic process can be regarded as a random variable:

X(tk) ∼ fX(tk)(x)

This PDF may be different for each tk!

The joint behavior at multiple time instances t1, · · · , tk is given by the joint PDF:

[X(t1), · · · , X(tk), · · · ]ᵀ ∼ fX(t1),X(t2),··· ,X(tk),··· (x1, x2, · · · , xk, · · · )

5. stochastic processes 11 / 46
Example: rectified sinusoid with random amplitude

Let X(t) = R |cos(ωt)| with

fR(r) = (1/10) e^(−r/10) for r ≥ 0; 0 otherwise.

Calculate fX(t)(x)!

Approach

First calculate the CDF FX(t)(x).
Then calculate the PDF fX(t)(x) = d/dx FX(t)(x).

5. stochastic processes 12 / 46
Example: rectified sinusoid with random amplitude

FX(t)(x) = P[X(t) ≤ x]
         = P[R |cos(ωt)| ≤ x]
         = P[R ≤ x/|cos(ωt)|]   if cos(ωt) ≠ 0
         = ∫₀^(x/|cos(ωt)|) fR(r) dr
         = 1 − e^(−x/(10|cos(ωt)|))   if x ≥ 0

If cos(ωt) ≠ 0:

FX(t)(x) = 0 for x < 0; 1 − e^(−x/(10|cos(ωt)|)) for x ≥ 0.

5. stochastic processes 13 / 46
Example: rectified sinusoid with random amplitude

If cos(ωt) ≠ 0:

fX(t)(x) = d FX(t)(x)/dx = 0 for x < 0; (1/(10|cos(ωt)|)) e^(−x/(10|cos(ωt)|)) for x ≥ 0.

If cos(ωt) = 0, then X(t) = 0 (constant) and fX(t)(x) = δ(x).
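A quick Monte Carlo sanity check of this result (a sketch; the values of ω, t and x are arbitrary choices):

```python
import numpy as np

# Sketch: Monte Carlo check of the derived CDF of X(t) = R|cos(wt)|,
# with R exponential with mean 10, at one fixed time instant.
rng = np.random.default_rng(0)

w, t = 2 * np.pi, 0.1                           # arbitrary frequency and time
c = abs(np.cos(w * t))                          # |cos(wt)|, nonzero here
r = rng.exponential(scale=10.0, size=200_000)   # f_R(r) = (1/10) e^(-r/10)
x_samples = r * c                               # realizations of X(t)

x = 5.0
print(np.mean(x_samples <= x))                  # empirical F_X(t)(x)
print(1 - np.exp(-x / (10 * c)))                # derived 1 - e^(-x/(10|cos(wt)|))
```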

5. stochastic processes 14 / 46
Problem 13.2.1 (similar: 13.2.4)
Let W be an exponential random variable with PDF
fW(w) = e^(−w) for w ≥ 0; 0 otherwise.

Find the CDF FX (t) (x) of the time-delayed ramp process X (t) = t − W .

5. stochastic processes 15 / 46
Problem 13.2.1 (similar: 13.2.4)
Let W be an exponential random variable with PDF
fW(w) = e^(−w) for w ≥ 0; 0 otherwise.

Find the CDF FX (t) (x) of the time-delayed ramp process X (t) = t − W .

P[X(t) ≤ x] = P[t − W ≤ x] = P[W ≥ t − x].

Since W ≥ 0, if x ≥ t then P[W ≥ t − x] = 1. When x < t,

P[W ≥ t − x] = ∫_(t−x)^∞ fW(w) dw = e^(−(t−x)).

5. stochastic processes 15 / 46
Problem 13.2.1 (cont’d)
Combining the facts, we have

FX(t)(x) = P[W ≥ t − x] = e^(−(t−x)) for x < t; 1 for x ≥ t.

We note that the CDF contains no discontinuities. Taking the derivative of the CDF with respect to x, we obtain the PDF

fX(t)(x) = e^(−(t−x)) for x < t; 0 for x ≥ t.

• Treat t as a constant
• Calculate CDF FX (t) (x)
• Then calculate PDF fX (t) (x)
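The same recipe can be checked numerically; a sketch (the values of t and x are arbitrary choices):

```python
import numpy as np

# Sketch: empirical check of F_X(t)(x) = e^(-(t-x)) for x < t (1 for x >= t)
# for the ramp process X(t) = t - W with W ~ Exp(1).
rng = np.random.default_rng(0)

t = 3.0
w = rng.exponential(scale=1.0, size=200_000)   # f_W(w) = e^(-w), w >= 0
x_samples = t - w                              # realizations of X(t)

for x in (1.0, 2.5, 3.5):
    empirical = np.mean(x_samples <= x)
    analytic = np.exp(-(t - x)) if x < t else 1.0
    print(x, empirical, analytic)
```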

5. stochastic processes 16 / 46
Description of a random process
Notice that

[X(t1), · · · , X(tk), · · · ]ᵀ ∼ fX(t1),X(t2),··· ,X(tk),··· (x1, x2, · · · , xk, · · · )

resembles a vector random variable, but can be of infinite dimensionality, and the ordering (in time) of the X(tk) is essential.

Generally, the joint PDF is very difficult to acquire. Exceptions:

Independent identically distributed (iid) random sequence/process


Gaussian stochastic process
Poisson process (skipped), Brownian motion process (skipped)

5. stochastic processes 17 / 46
Independent identically distributed (iid) random sequences
For an iid random sequence

all X(tk) are mutually independent random variables for all tk,
all X(tk) have the same PDF for all tk.

Let Xn denote a (time-discrete) iid random sequence with sample vector X = [Xn1, · · · , Xnk]ᵀ.

For a discrete-valued Xn, the joint PMF is

PX(x) = PX1(x1) · · · PXk(xk) = ∏_(i=1)^k PX(xi)

For a continuous-valued Xn, the joint PDF is

fX(x) = fX1(x1) · · · fXk(xk) = ∏_(i=1)^k fX(xi)
5. stochastic processes 18 / 46
Example: Bernoulli process (time discrete)
One realization of a Bernoulli process (e.g., a bit sequence)

PMF for one time instance k:

PXk(xk) = p for xk = 1; 1 − p for xk = 0; 0 otherwise
⇔ PXk(xk) = p^xk (1 − p)^(1−xk) for xk ∈ {0, 1}; 0 otherwise

5. stochastic processes 19 / 46
Example: Bernoulli process (cont’d)
PXk(xk) = p^xk (1 − p)^(1−xk) for xk ∈ {0, 1}; 0 otherwise

For two time instances t1 and t2 we obtain (iid process!)

PX1,X2(x1, x2) = PX1(x1) PX2(x2) = p^x1 (1 − p)^(1−x1) p^x2 (1 − p)^(1−x2)
               = p^(x1+x2) (1 − p)^(2−x1−x2)

For k time instances we obtain

PX(x) = ∏_(i=1)^k p^xi (1 − p)^(1−xi) = p^(x1+···+xk) (1 − p)^(k−(x1+···+xk))
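A small sketch of this joint PMF (the helper name and the value of p are illustrative): for an iid Bernoulli process the probability depends on x only through the number of ones.

```python
import numpy as np

# Sketch: joint PMF of k samples of an iid Bernoulli(p) process,
# P_X(x) = p^(x1+...+xk) (1-p)^(k-(x1+...+xk)).
def bernoulli_joint_pmf(x, p):
    x = np.asarray(x)
    s = x.sum()                            # number of ones in the sample vector
    return p**s * (1 - p)**(len(x) - s)

p = 0.3
print(bernoulli_joint_pmf([1, 0, 1], p))   # p^2 (1-p) = 0.063
print(bernoulli_joint_pmf([0, 1, 1], p))   # same: for iid, only the sum matters
```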

5. stochastic processes 20 / 46
Gaussian Process
Gaussian processes occur quite often in nature (remember the central
limit theorem!)
The process X (t) is a Gaussian stochastic process (sequence) if and
only if X = [X (t1 ) · · · X (tk )]T is a Gaussian random vector for any
integer k > 0 and any set of time instances t1 , t2 , · · · , tk .

5. stochastic processes 21 / 46
Gaussian Process
Remember that for Gaussian random vectors X = [X1, X2, · · · , XN]ᵀ we have:

fX(x) = exp[−½ (x − E[X])ᵀ CX⁻¹ (x − E[X])] / ((2π)^(N/2) det(CX)^(1/2))

For a Gaussian stochastic process X(t), the distribution of X = [X(t1), X(t2), ..., X(tk)]ᵀ is thus also given by fX(x).

For each collection of sample times t1, · · · , tk, we need to specify

The mean: µX = E[X]; specify E[X(ti)] for all i;
The (auto)covariance: cov[X, X] = CX; specify cov[X(ti), X(tj)] for all i, j.
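This specification translates directly into how one would simulate a Gaussian process on a grid of sample times; a sketch, where the zero mean and the squared-exponential covariance are example choices, not from the slides:

```python
import numpy as np

# Sketch: sample a Gaussian process at times t_1, ..., t_k by specifying
# the mean E[X(t_i)] and covariance cov[X(t_i), X(t_j)], then drawing one
# Gaussian random vector. Mean and covariance functions are example choices.
rng = np.random.default_rng(0)

t = np.linspace(0.0, 1.0, 50)                             # sample times
mu = np.zeros_like(t)                                     # E[X(t_i)] = 0 (assumed)
C = np.exp(-0.5 * (t[:, None] - t[None, :])**2 / 0.01)    # cov[X(t_i), X(t_j)]
C += 1e-9 * np.eye(len(t))                                # jitter for numerical stability

x = rng.multivariate_normal(mu, C)   # one realization [X(t_1), ..., X(t_k)]
print(x[:5])
```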

5. stochastic processes 22 / 46
Expected value of a stochastic process

Expected value of X(t) at time t:

E[X(t)] = ∫_(−∞)^(∞) x fX(t)(x) dx

5. stochastic processes 23 / 46
Autocovariance of a stochastic process

cov[X (t1 ), X (t2 )] indicates how much the process is likely to change
from t1 to t2 .
Large covariance: sample function unlikely to change.
Zero covariance: sample function expected to change rapidly.

5. stochastic processes 24 / 46
Autocovariance and autocorrelation
The covariance of a stochastic process at two different time instances is
called “autocovariance”:

CX(t, τ) = cov[X(t), X(t + τ)]
         = E[(X(t) − E[X(t)]) (X(t + τ) − E[X(t + τ)])]
         = E[X(t)X(t + τ)] − E[X(t)] E[X(t + τ)]

(the first term is similar to the cross-correlation E[XY])

The autocorrelation is similarly defined as

RX(t, τ) = E[X(t)X(t + τ)]

so that

CX(t, τ) = RX(t, τ) − E[X(t)] E[X(t + τ)]
CX(t, 0) = E[X(t)²] − E[X(t)]² = var[X(t)]

5. stochastic processes 25 / 46
The Autocorrelation function

Notice that the exact formulation of the autocorrelation depends on whether time and amplitudes are discrete or continuous:

RX(t, τ) = E[X(t)X(t + τ)] = ∫∫ x y fX(t),X(t+τ)(x, y) dx dy
RX(t, τ) = E[X(t)X(t + τ)] = Σx Σy x y P[X(t) = x, X(t + τ) = y]

RX[n, k] = E[Xn Xn+k] = ∫∫ x y fXn,Xn+k(x, y) dx dy
RX[n, k] = E[Xn Xn+k] = Σx Σy x y P[Xn = x, Xn+k = y]
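In practice these expectations are estimated by averaging over an ensemble of realizations; a sketch, using an iid Gaussian sequence as the example process:

```python
import numpy as np

# Sketch: estimate R_X[n, k] = E[X_n X_{n+k}] by averaging over an ensemble
# of realizations (rows). The iid Gaussian ensemble is an example choice.
rng = np.random.default_rng(0)

ensemble = rng.normal(size=(10_000, 64))       # 10000 realizations of length 64

def autocorr_estimate(ens, n, k):
    """Ensemble-average estimate of R_X[n, k]."""
    return np.mean(ens[:, n] * ens[:, n + k])

print(autocorr_estimate(ensemble, n=10, k=0))  # ~1: E[X_n^2] for unit-variance X
print(autocorr_estimate(ensemble, n=10, k=3))  # ~0: iid samples are uncorrelated
```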

5. stochastic processes 26 / 46
Example: sinusoidal process

Random process: X (t) = A sin(ωt + Φ)


Amplitude A and phase Φ are independent random variables, where

A is uniformly distributed on [−1, +1]


Φ is uniformly distributed on [0, 2π]

5. stochastic processes 27 / 46
Example: sinusoidal process

RX(t, τ) = E[X(t)X(t + τ)]
         = E[A² sin(ωt + Φ) sin(ω(t + τ) + Φ)]
         = E[A²] E[sin(ωt + Φ) sin(ω(t + τ) + Φ)]
         = (∫_(−1)^(1) (a²/2) da) E[sin(ωt + Φ) sin(ω(t + τ) + Φ)]
         = (1/3) E[sin(ωt + Φ) sin(ω(t + τ) + Φ)]
         = (1/6) E[cos(ωτ) − cos(2ωt + ωτ + 2Φ)]
         = (1/6) cos(ωτ) − (1/6) E[cos(2ωt + ωτ + 2Φ)]   (last expectation = 0)
         = (1/6) cos(ωτ)
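A Monte Carlo sketch confirming this closed form (the values of ω, t and τ are arbitrary choices):

```python
import numpy as np

# Sketch: Monte Carlo check of R_X(t, tau) = (1/6) cos(w tau) for
# X(t) = A sin(wt + Phi), A ~ U[-1, 1], Phi ~ U[0, 2pi], A and Phi independent.
rng = np.random.default_rng(0)

w, t, tau = 2 * np.pi, 0.3, 0.1
N = 1_000_000
A = rng.uniform(-1.0, 1.0, size=N)
Phi = rng.uniform(0.0, 2 * np.pi, size=N)

Rx = np.mean(A**2 * np.sin(w * t + Phi) * np.sin(w * (t + tau) + Phi))
print(Rx)                      # empirical R_X(t, tau)
print(np.cos(w * tau) / 6)     # (1/6) cos(w tau): should agree closely
```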

5. stochastic processes 28 / 46
Example: sinusoidal process
RX(t, τ) = (1/6) cos(ωτ)

High correlation for ωτ = 0, ±2π, · · ·
Zero correlation for ωτ = ±π/2, · · ·
Strongly negative correlation for ωτ = ±π, · · ·

5. stochastic processes 29 / 46
Uncorrelated and orthogonal processes

If all pairs X(t), X(t + τ) are uncorrelated, i.e.,

CX(t, τ) = var[X(t)] for τ = 0 (all t); 0 for τ ≠ 0 (all t),

then X(t) is called an uncorrelated process.

If all pairs X(t), X(t + τ) are orthogonal, i.e.,

RX(t, τ) = E[X²(t)] for τ = 0 (all t); 0 for τ ≠ 0 (all t),

then X(t) is called an orthogonal process.

5. stochastic processes 30 / 46
Problem 13.7.2
For the time-delayed ramp process X (t) = t − W from Problem 13.2.1,
find for any t ≥ 0:
(a) The expected value µX (t),
(b) The autocovariance function CX (t, τ ).
Hint: E[W ] = 1 and E[W 2 ] = 2.

5. stochastic processes 31 / 46
Problem 13.7.2
For the time-delayed ramp process X (t) = t − W from Problem 13.2.1,
find for any t ≥ 0:
(a) The expected value µX (t),
(b) The autocovariance function CX (t, τ ).
Hint: E[W ] = 1 and E[W 2 ] = 2.

(a) The mean is µX(t) = E[t − W] = t − E[W] = t − 1.
(We know, but don’t need, the first-order PDF: fX(t)(x) = e^(x−t) for x < t; 0 otherwise.)

(b) The autocovariance is

CX(t, τ) = E[X(t)X(t + τ)] − µX(t) µX(t + τ)
         = E[(t − W)(t + τ − W)] − (t − 1)(t + τ − 1)
         = t(t + τ) − (2t + τ)E[W] + E[W²] − (t − 1)(t + τ − 1)
         = −(2t + τ)E[W] + 2 + (2t + τ) − 1 = 1
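A quick simulation sketch of this result (the (t, τ) pairs are arbitrary): the autocovariance equals var[W] = 1 regardless of t and τ, because both samples share the same W.

```python
import numpy as np

# Sketch: Monte Carlo check that C_X(t, tau) = 1 for X(t) = t - W, W ~ Exp(1).
rng = np.random.default_rng(0)

w = rng.exponential(scale=1.0, size=1_000_000)   # one W per realization
for t, tau in [(1.0, 0.5), (4.0, 2.0)]:
    x1, x2 = t - w, t + tau - w                  # X(t), X(t+tau) on the same path
    cov = np.mean(x1 * x2) - x1.mean() * x2.mean()
    print(t, tau, cov)                           # ~1 in both cases
```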
5. stochastic processes 31 / 46
Stationary Process

A stochastic process is stationary if and only if every joint PDF is shift invariant:

fX(t1),X(t2),··· ,X(tk)(x1, x2, · · · , xk) = fX(t1+∆t),X(t2+∆t),··· ,X(tk+∆t)(x1, x2, · · · , xk)

Consequence I: The marginal PDFs are independent of t:

fX(t)(x) = fX(t+∆t)(x) = fX(x)

The marginal PDFs are identical for all tk!

⇒ Expected value and variance are time independent.

5. stochastic processes 32 / 46
Problem 13.8.2
X = [X1 X2]ᵀ has expected value E[X] = 0 and covariance matrix

CX = [2 1; 1 1]

Does there exist a stationary process X (t) and time instances t1 and t2
such that X is actually a pair of observations [X (t1 ) X (t2 )]T of the
process X (t)?

5. stochastic processes 33 / 46
Problem 13.8.2
X = [X1 X2]ᵀ has expected value E[X] = 0 and covariance matrix

CX = [2 1; 1 1]

Does there exist a stationary process X (t) and time instances t1 and t2
such that X is actually a pair of observations [X (t1 ) X (t2 )]T of the
process X (t)?

The short answer is No. For the given process X (t),

var[X (t1 )] = C11 = 2, var[X (t2 )] = C22 = 1 .

However, stationarity of X (t) requires var[X (t1 )] = var[X (t2 )], which is
a contradiction.

5. stochastic processes 33 / 46
Stationary process

Consequence II: The 2D joint PDF is shift invariant:

fX(t1),X(t2)(x1, x2) = fX(t1+∆t),X(t2+∆t)(x1, x2) = fX(0),X(t2−t1)(x1, x2)

⇒ only the “distance” τ between t2 and t1 matters:

RX(t, τ) = RX(τ)
CX(t, τ) = CX(τ) = RX(τ) − E[X]²

5. stochastic processes 34 / 46
Stationary process
Examples of stationary processes

iid process; e.g. Bernoulli process


Poisson process (random arrival process, ch. 13.4; we skip this)

Remember the PMF for the Bernoulli stochastic process:

PX(x) = ∏_(i=1)^k p^xi (1 − p)^(1−xi) = p^(x1+···+xk) (1 − p)^(k−(x1+···+xk)),

which does not depend on the actual time.

Non-stationary processes are difficult to model and to handle in practice.

5. stochastic processes 35 / 46
Problem 13.8.4
Let X (t) be a stationary continuous-time random process. By sampling
X (t) every ∆ seconds, we obtain the discrete-time random sequence
Yn = X (n∆). Is Yn a stationary sequence?

5. stochastic processes 36 / 46
Problem 13.8.4
Let X (t) be a stationary continuous-time random process. By sampling
X (t) every ∆ seconds, we obtain the discrete-time random sequence
Yn = X (n∆). Is Yn a stationary sequence?

Since Yni+k = X((ni + k)∆) for a set of time samples n1, · · · , nm,

fYn1+k,··· ,Ynm+k(y1, · · · , ym) = fX((n1+k)∆),··· ,X((nm+k)∆)(y1, · · · , ym).

Since X(t) is a stationary process,

fX((n1+k)∆),··· ,X((nm+k)∆)(y1, · · · , ym) = fX(n1∆),··· ,X(nm∆)(y1, · · · , ym).

Since X(ni∆) = Yni, we see that

fYn1+k,··· ,Ynm+k(y1, · · · , ym) = fYn1,··· ,Ynm(y1, · · · , ym).

Hence, Yn is a stationary sequence.
5. stochastic processes 36 / 46
Wide-Sense Stationary (WSS) Processes

To show that a process is stationary, we need the overall joint-PDF.

– Quite impossible to get, except for special cases.

However, we can often estimate the process’

– Expected value
– Autocorrelation function

If only the expected value and the autocorrelation function satisfy the property of stationarity, we call this process wide-sense stationary (WSS).

– Hence, we don’t know anything about other properties of the process!

5. stochastic processes 37 / 46
Wide-Sense Stationary (WSS) Processes
A process is wide-sense stationary, if and only if
– The expected value E[X (t)] does not depend on time:
E[X (t)] = c.
– The autocorrelation function only depends on the time difference τ and not on the absolute time t:
RX(t, τ) = RX(τ)   or   RX[n, k] = RX[k]

Example: sinusoidal random process, X(t) = sin(ωt + Φ) where Φ is uniformly distributed on [0, 2π]: derive that

CX(τ) = RX(τ) = (1/2) cos(ωτ)
5. stochastic processes 38 / 46
Problem 13.9.3 (important for later!)

True or False: If Xn is a wide sense stationary random sequence with


E[Xn ] = c, then Yn = Xn − Xn−1 is a wide sense stationary random
sequence.

5. stochastic processes 39 / 46
Problem 13.9.3 (important for later!)

True or False: If Xn is a wide sense stationary random sequence with


E[Xn ] = c, then Yn = Xn − Xn−1 is a wide sense stationary random
sequence.

True: First we observe that E[Yn] = E[Xn] − E[Xn−1] = c − c = 0, which does not depend on n. Second, we verify that

RY[n, k] = E[Yn Yn+k]
         = E[(Xn − Xn−1)(Xn+k − Xn+k−1)]
         = E[Xn Xn+k] − E[Xn Xn+k−1] − E[Xn−1 Xn+k] + E[Xn−1 Xn+k−1]
         = RX[k] − RX[k − 1] − RX[k + 1] + RX[k] = 2RX[k] − RX[k − 1] − RX[k + 1],

which does not depend on n. Hence, Yn is WSS.
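A numeric sketch of this identity. The WSS example sequence (filtered iid noise plus a constant) is an assumed choice, and time averages stand in for expectations (ergodicity is previewed next lecture):

```python
import numpy as np

# Sketch: check R_Y[k] = 2 R_X[k] - R_X[k-1] - R_X[k+1] for Y_n = X_n - X_{n-1}.
rng = np.random.default_rng(0)

c = 1.0                                   # E[X_n] = c
x = c + np.convolve(rng.normal(size=2_000_001), [0.7, 0.3], mode="valid")
y = x[1:] - x[:-1]                        # Y_n = X_n - X_{n-1}

def R(z, k):
    """Time-average estimate of R_Z[k] = E[Z_n Z_{n+k}]."""
    k = abs(k)
    return np.mean(z[: len(z) - k] * z[k:])

k = 1
print(R(y, k))                                    # estimated R_Y[1]
print(2 * R(x, k) - R(x, k - 1) - R(x, k + 1))    # predicted from R_X, ~ -0.16
```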

5. stochastic processes 39 / 46
Problem 13.9.6
X (t) and Y (t) are independent wide sense stationary processes.
Determine if W (t) = X (t)Y (t) is wide-sense stationary.

5. stochastic processes 40 / 46
Problem 13.9.6
X (t) and Y (t) are independent wide sense stationary processes.
Determine if W (t) = X (t)Y (t) is wide-sense stationary.
True: Independence of X(t) and Y(t) implies

E[W(t)] = E[X(t)Y(t)] = E[X(t)] E[Y(t)] = µX µY

and

RW(t, τ) = E[W(t)W(t + τ)]
         = E[X(t)Y(t) X(t + τ)Y(t + τ)]
         = E[X(t)X(t + τ) Y(t)Y(t + τ)]
         = E[X(t)X(t + τ)] E[Y(t)Y(t + τ)]   (by independence)
         = RX(τ) RY(τ)

Since W(t) has constant expected value and the autocorrelation depends only on the time difference τ, W(t) is wide-sense stationary.
5. stochastic processes 40 / 46
WSS Processes and the autocorrelation function
Important properties of RX (τ ) for WSS processes:

RX (0) ≥ 0
RX (τ ) = RX (−τ )
RX (0) ≥ |RX (τ )|

Example:

RX(t, τ) = (1/6) cos(ωτ)

5. stochastic processes 41 / 46
Cross correlation for Stochastic Processes
In addition to the autocorrelation function, we can also define the
cross-correlation between two stochastic processes:
RXY (t, τ ) = E[X (t)Y (t + τ )]
RXY [n, k] = E[Xn Yn+k ]

Two random processes X (t) and Y (t) are jointly wide sense
stationary, if X (t) and Y (t) are wide sense stationary and
RXY (t, τ ) = RXY (τ ).

If X (t) and Y (t) are jointly WSS, then


RXY (τ ) = RYX (−τ ).
(and of course similar for time-discrete processes)

5. stochastic processes 42 / 46
Example
Xn is a zero-mean WSS stochastic process. Let Yn = (−1)^n Xn.

E[Yn] = (−1)^n E[Xn] = 0, since Xn is zero mean.

RY[n, k] = E[Yn Yn+k]
         = E[(−1)^n Xn (−1)^(n+k) Xn+k]
         = (−1)^(2n+k) E[Xn Xn+k] = (−1)^k RX[k]

Process Yn is WSS as RY[n, k] only depends on k.

RXY[n, k] = E[Xn Yn+k]
          = E[Xn (−1)^(n+k) Xn+k]
          = (−1)^(n+k) E[Xn Xn+k] = (−1)^(n+k) RX[k]

Xn and Yn are not jointly WSS as their cross-correlation function depends on both n and k.
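A numeric sketch making the n-dependence visible (the white Gaussian choice for the WSS sequence Xn is an assumption for illustration):

```python
import numpy as np

# Sketch: illustrate that R_XY[n, k] = (-1)^(n+k) R_X[k] flips sign with n
# for Y_n = (-1)^n X_n. Here X_n is white Gaussian noise, so R_X[0] = 1.
rng = np.random.default_rng(0)

num_real, length = 200_000, 8
X = rng.normal(size=(num_real, length))       # R_X[k] = 1 if k = 0, else 0
Y = (-1.0) ** np.arange(length) * X           # Y_n = (-1)^n X_n

def Rxy(n, k):
    return np.mean(X[:, n] * Y[:, n + k])     # ensemble estimate of E[X_n Y_{n+k}]

print(Rxy(2, 0), Rxy(3, 0))   # ~ +1 and ~ -1: depends on n, so not jointly WSS
```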

5. stochastic processes 43 / 46
Problem 13.10.2
X (t) is a wide sense stationary random process. Let
(a) Y (t) = X (t + a)
Are Y (t) and X (t) jointly wide sense stationary?

5. stochastic processes 44 / 46
Problem 13.10.2
X (t) is a wide sense stationary random process. Let
(a) Y (t) = X (t + a)
Are Y (t) and X (t) jointly wide sense stationary?
Since E[Y(t)] = E[X(t + a)] = µX and

RY(t, τ) = E[Y(t)Y(t + τ)] = E[X(t + a)X(t + τ + a)] = RX(τ),

we have verified that Y(t) is wide-sense stationary. Next we calculate the cross-correlation:

RXY(t, τ) = E[X(t)Y(t + τ)] = E[X(t)X(t + τ + a)] = RX(τ + a).

Since RXY(t, τ) depends on the time difference τ but not on the absolute time t, we conclude that X(t) and Y(t) are jointly wide-sense stationary.
5. stochastic processes 44 / 46
Problem 13.10.2 (cont’d)
Now repeat for
(b) Y (t) = X (at)

5. stochastic processes 45 / 46
Problem 13.10.2 (cont’d)
Now repeat for
(b) Y (t) = X (at)

Since E[Y(t)] = E[X(at)] = µX and

RY(t, τ) = E[Y(t)Y(t + τ)]
         = E[X(at)X(a(t + τ))]
         = E[X(at)X(at + aτ)] = RX(aτ),

we have verified that Y(t) is wide-sense stationary. Now we calculate the cross-correlation:

RXY(t, τ) = E[X(t)Y(t + τ)]
          = E[X(t)X(a(t + τ))] = RX((a − 1)t + aτ).

Except for the trivial case where a = 1 and Y(t) = X(t), RXY(t, τ) depends on both the absolute time t and the time difference τ: we conclude that X(t) and Y(t) are not jointly wide-sense stationary.
5. stochastic processes 45 / 46
To do for this lecture:

Do the selected exercises:
13.1.1, 13.3.1, 13.7.1, 13.7.3, 13.7.5, 13.9.3, 13.9.5, 13.9.7, 13.10.1, 13.10.3

Next lecture, we’ll wrap up Ch. 13 (ergodicity) and start with Supplement Sections 1 and 2.

5. stochastic processes 46 / 46
