Module 7: Random Variate Generation
11/16/17
Outline
1 Introduction
2 Inverse Transform Method
3 Cutpoint Method
4 Convolution Method
5 Acceptance-Rejection Method
6 Composition Method
7 Special-Case Techniques
8 Multivariate Normal Distribution
9 Generating Stochastic Processes
Introduction
Inverse Transform Method

Inverse Transform Theorem: If X is a continuous random variable with c.d.f. F(x), then F(X) ∼ U(0,1).

Proof: Let Y = F(X) and suppose that Y has c.d.f. G(y). Then
G(y) = P(Y ≤ y) = P(F(X) ≤ y) = P(X ≤ F^{−1}(y)) = F(F^{−1}(y)) = y,
which is the U(0,1) c.d.f. □
Let U ∼ U(0,1). Then F(X) = U means that the random variable F^{−1}(U) has the same distribution as X. So to generate X, we can simply generate U and set X ← F^{−1}(U).
Example: The Weibull distribution, F(x) = 1 − e^{−(λx)^β}, x > 0. Setting F(X) = U and solving for X yields
X = F^{−1}(U) = (1/λ)[−ln(1 − U)]^{1/β}.
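Here's a minimal Python sketch of this inverse transform (the function name and test values are illustrative, not from the slides):

```python
import math
import random

def weibull(lam, beta):
    """Weibull(lam, beta) via inverse transform: solve F(X) = U."""
    u = random.random()                              # U ~ U(0,1)
    return (-math.log(1.0 - u)) ** (1.0 / beta) / lam

x = weibull(lam=2.0, beta=0.5)   # beta = 1 reduces to the Exp(lam) case
```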
Example: Consider the c.d.f.

F(x) = x²/2, if 0 ≤ x < 1;
F(x) = 1 − (x − 2)²/2, if 1 ≤ x ≤ 2.

If U < 1/2, we solve X²/2 = U to get X = √(2U).
If U ≥ 1/2, the only root of 1 − (X − 2)²/2 = U in [1, 2] is X = 2 − √(2(1 − U)).
Thus, for example, if U = 0.4, we take X = √0.8. □
Example: The Nor(0,1) c.d.f. Φ(·) has no closed-form inverse, but a crude approximation is
Z = Φ^{−1}(U) ≈ [U^{0.135} − (1 − U)^{0.135}] / 0.1975. □
A more accurate approximation is
Z = Φ^{−1}(U) ≈ sign(U − 1/2) [ t − (c₀ + c₁t + c₂t²) / (1 + d₁t + d₂t² + d₃t³) ],
where t = √(−2 ln(min(U, 1 − U))) and the cᵢ's and dᵢ's are tabulated constants (Abramowitz and Stegun).
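A sketch of this approximation in Python; the constants are the standard Abramowitz and Stegun 26.2.23 values, supplied here as an assumption since the slides' table isn't shown:

```python
import math
import random

# Rational-approximation constants (Abramowitz and Stegun 26.2.23);
# assumed here, since the slides' table of c's and d's isn't shown.
C = (2.515517, 0.802853, 0.010328)
D = (1.432788, 0.189269, 0.001308)

def norm_inv(u):
    """Approximate Z = Phi^{-1}(u) via the rational approximation above."""
    t = math.sqrt(-2.0 * math.log(min(u, 1.0 - u)))
    num = C[0] + C[1] * t + C[2] * t * t
    den = 1.0 + D[0] * t + D[1] * t * t + D[2] * t ** 3
    return math.copysign(t - num / den, u - 0.5)

z = norm_inv(random.random())    # approximately Nor(0,1)
```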
Example: To generate X ∼ Bern(p): If U ≤ p, take X = 1; otherwise, take X = 0. □
But life isn't always dice tosses. A general way to generate a Geom(p) RV is to count the number of trials until Uᵢ ≤ p. For example, if p = 0.3, then U₁ = 0.71, U₂ = 0.96, and U₃ = 0.12 imply that X = 3. □
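A quick sketch of this trial-counting scheme (function name illustrative):

```python
import random

def geometric(p):
    """Geom(p): count Bernoulli(p) trials until the first success."""
    x = 0
    while True:
        x += 1
        if random.random() <= p:     # trial i is a success when U_i <= p
            return x

x = geometric(0.3)
```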
Example: Suppose X ∼ Pois(2), so that f(x) = e^{−2} 2^x / x!, x = 0, 1, 2, .... For a discrete distribution like this, inverse transform amounts to taking X = min{x : F(x) ≥ U}.
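A sketch that walks up the c.d.f. until it exceeds U, using the recursion P(X = x) = P(X = x−1)·λ/x:

```python
import math
import random

def poisson_inverse(lam):
    """Pois(lam) by discrete inverse transform: step through the c.d.f."""
    u = random.random()
    x, p = 0, math.exp(-lam)         # p = P(X = 0)
    cdf = p
    while u > cdf:
        x += 1
        p *= lam / x                 # P(X = x) = P(X = x-1) * lam / x
        cdf += p
    return x

x = poisson_inverse(2.0)
```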
If the exact distribution isn't known, we can work with the empirical c.d.f. F̂ₙ(x) of the data X₁, ..., Xₙ. Note that F̂ₙ(x) is a step function with jumps of height 1/n (every time an observation occurs).
The ARENA functions DISC and CONT can be used to generate RV's from the empirical c.d.f.'s of discrete and continuous distributions, respectively.
Given that you only have a finite number n of data points, we can turn the empirical c.d.f. into a continuous RV by using linear interpolation between the order statistics X₍ᵢ₎:

F(x) = 0, if x < X₍₁₎;
F(x) = (i − 1)/(n − 1) + (x − X₍ᵢ₎) / [(n − 1)(X₍ᵢ₊₁₎ − X₍ᵢ₎)], if X₍ᵢ₎ ≤ x < X₍ᵢ₊₁₎, for all i;
F(x) = 1, if x ≥ X₍ₙ₎.
For instance, if U ∈ (0.66, 1.00] (the last two F̂(x) entries), then

X = F̂^{−1}(0.66) + (U − 0.66) · [F̂^{−1}(1.00) − F̂^{−1}(0.66)] / (1.00 − 0.66)
  = 1.5 + (U − 0.66)(2.0 − 1.5)/(1.00 − 0.66)
  = 1.5 + 1.471(U − 0.66).

Thus, e.g., if U = 0.83, then X = 1.75. □
To do this in general:

1. Generate U.
2. Find the F̂(x) interval in which U lies, i.e., the i such that rᵢ < U ≤ rᵢ₊₁. In the above example:
   r₁ = 0, r₂ = 0.31, r₃ = 0.41, r₄ = 0.66, r₅ = 1.0.
3. Let xᵢ be the left endpoint of the ith X-interval, and let aᵢ be the reciprocal of the slope of the ith interval:
   x₁ = 0.25, x₂ = 0.50, x₃ = 1.0, x₄ = 1.5, x₅ = 2.0;
   a₁ = 0.81, a₂ = 5.0, a₃ = 2.0, a₄ = 1.47 (no a₅ is needed).
4. Set X = xᵢ + aᵢ(U − rᵢ). (A code sketch follows.)
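A sketch of steps 1 through 4, hard-coding the breakpoint tables from the example above:

```python
import bisect
import random

r = [0.0, 0.31, 0.41, 0.66, 1.0]     # c.d.f. breakpoints r_1, ..., r_5
x = [0.25, 0.50, 1.0, 1.5, 2.0]      # left endpoints of the X-intervals
a = [0.81, 5.0, 2.0, 1.471]          # reciprocal slopes, one per interval

def empirical_inverse():
    """X = x_i + a_i (U - r_i), where r_i < U <= r_{i+1}."""
    u = random.random()
    i = max(bisect.bisect_left(r, u) - 1, 0)
    return x[i] + a[i] * (u - r[i])

# e.g., U = 0.83 lands in the last interval and gives X = 1.75
```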
Cutpoint Method
Suppose we want to generate from the discrete distribution
P(X = k) = pₖ, k = a, a + 1, ..., b,
with c.d.f. values
qₖ = P(X ≤ k), k = a, a + 1, ..., b.
The cutpoint method first computes m "cutpoints" I₁, ..., Iₘ, where Iⱼ is the smallest k with qₖ > (j − 1)/m; these tell us where in the c.d.f. to start searching for a given U.

Algorithm CMSET (computes the cutpoints):
j ← 0, k ← a − 1, and A ← 0
While j < m:
    While A ≤ j:
        k ← k + 1
        A ← m·qₖ
    j ← j + 1
    Iⱼ ← k
Once the cutpoints are computed, we can use the cutpoint method.

Algorithm CM:
Generate U from U(0,1)
L ← ⌊mU⌋ + 1
X ← I_L
While U > q_X: X ← X + 1
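A sketch of CMSET and CM together, using 0-based Python lists for the c.d.f. values (the indexing scheme and example distribution are implementation choices, not from the slides):

```python
import random

def cmset(q, a, m):
    """Cutpoints I_j = smallest k with m*q_k > j - 1; q[k - a] = P(X <= k)."""
    I, j, k, A = [], 0, a - 1, 0.0
    while j < m:
        while A <= j:
            k += 1
            A = m * q[k - a]
        j += 1
        I.append(k)                   # I_j (stored as I[j-1])
    return I

def cm(q, a, I):
    """Jump to cutpoint I_L, then search the c.d.f. linearly."""
    m = len(I)
    u = random.random()
    L = int(m * u) + 1
    x = I[L - 1]
    while u > q[x - a]:
        x += 1
    return x

q = [0.3, 0.5, 0.9, 1.0]              # example c.d.f. on support {1, 2, 3, 4}
I = cmset(q, a=1, m=4)
x = cm(q, a=1, I=I)
```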
The expected number of c.d.f. comparisons Cₘ that Algorithm CM performs satisfies
E(Cₘ) ≤ [(I₂ − I₁ + 1) + ··· + (Iₘ₊₁ − Iₘ + 1)]/m = (b − I₁ + m)/m,
where we define Iₘ₊₁ ≡ b.
Convolution Method

Idea: Some RV's can be expressed as the sum (convolution) of simpler RV's, Y = X₁ + X₂ + ··· + Xₙ, which we generate individually and add up.
Example: Triangular(0,1,2). If U₁ and U₂ are i.i.d. U(0,1), then U₁ + U₂ ∼ Tria(0,1,2). □
Example: Erlangₙ(λ). If X₁, ..., Xₙ are i.i.d. Exp(λ), then Y = Σᵢ₌₁ⁿ Xᵢ ∼ Erlangₙ(λ). By inverse transform, Xᵢ = −(1/λ) ln(Uᵢ), so
Y = −(1/λ) ln(∏ᵢ₌₁ⁿ Uᵢ).
This only takes one natural log evaluation, so it's pretty efficient. □

Example: A "desperation" Nor(0,1) generator via the CLT. In particular, let's choose n = 12 i.i.d. U(0,1)'s and assume that n = 12 is "large." Since E(Σᵢ Uᵢ) = 6 and Var(Σᵢ Uᵢ) = 1, we have
Y − 6 = Σᵢ₌₁¹² Uᵢ − 6 ≈ Nor(0,1). □
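Both convolution examples in a short sketch:

```python
import math
import random

def erlang(n, lam):
    """Erlang_n(lam): one log of a product of n uniforms."""
    prod = 1.0
    for _ in range(n):
        prod *= random.random()
    return -math.log(prod) / lam

def desperation_normal():
    """Crude Nor(0,1): sum of 12 uniforms, minus 6."""
    return sum(random.random() for _ in range(12)) - 6.0
```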
Acceptance-Rejection Method
Suppose we want to generate X with p.d.f. f(x), and we have a majorizing function t(x) satisfying
t(x) ≥ f(x) for all x ∈ ℝ
and
c ≡ ∫_{−∞}^{∞} t(x) dx ≥ ∫_{−∞}^{∞} f(x) dx = 1,
where we assume that c < ∞. Then h(x) ≡ t(x)/c is a p.d.f. that is (we hope) easy to sample from.
Algorithm A-R:
Repeat
    Generate U from U(0,1)
    Generate Y from h(y) (independent of U)
until U ≤ g(Y) ≡ f(Y)/t(Y) = f(Y)/[c·h(Y)]
Return X ← Y
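A generic sketch of Algorithm A-R, with f, t, and the h-sampler passed in as functions:

```python
import random

def acceptance_rejection(f, t, sample_h):
    """Repeat until U <= g(Y) = f(Y)/t(Y); then return Y.

    f: target p.d.f.; t: majorizer with t(x) >= f(x) for all x;
    sample_h: generates Y from the p.d.f. h = t/c.
    """
    while True:
        u = random.random()
        y = sample_h()
        if u <= f(y) / t(y):
            return y
```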
Proof that X has p.d.f. f(x): Let A denote the event that a (U, Y) pair is accepted. Then
P(X ≤ x) = P(Y ≤ x | A) = P(A, Y ≤ x)/P(A).  (1)
By conditioning on Y, one can show that
P(A, Y ≤ x) = (1/c) ∫_{−∞}^{x} f(y) dy.
Letting x → ∞, we have
P(A) = (1/c) ∫_{−∞}^{∞} f(y) dy = 1/c.  (4)
Substituting back into (1) gives P(X ≤ x) = ∫_{−∞}^{x} f(y) dy, so X indeed has p.d.f. f(x). □
Example: Generate a half-normal RV with p.d.f. f(x) = (2/√(2π)) e^{−x²/2}, x > 0, using the majorizer t(x) = √(2e/π) e^{−x} ≥ f(x). Then c = ∫₀^∞ t(x) dx = √(2e/π),
h(x) = t(x)/c = e^{−x} (easy Exp(1) p.d.f.),
and
g(x) = f(x)/t(x) = e^{−(x−1)²/2}. □
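A concrete sketch of this example; attaching a random sign to the half-normal output to get a full Nor(0,1) is an extra step, not shown on the slide above:

```python
import math
import random

def half_normal():
    """A-R with an Exp(1) proposal: accept Y when U <= exp(-(Y-1)^2/2)."""
    while True:
        u = random.random()
        y = -math.log(random.random())       # Y ~ Exp(1), i.e., from h(y)
        if u <= math.exp(-0.5 * (y - 1.0) ** 2):
            return y

def standard_normal():
    """A random sign turns the half-normal into a Nor(0,1)."""
    x = half_normal()
    return x if random.random() < 0.5 else -x
```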
Now consider the gamma distribution, with p.d.f.
f(x) = λ^β x^{β−1} e^{−λx} / Γ(β), x > 0.
We'll split the task of generating gamma RV's via the A-R algorithm into two cases depending on the magnitude of the shape parameter: β < 1 and β ≥ 1...
If β < 1, we can use:

Algorithm GAM1:
b ← (e + β)/e   (e is the base of ln)
While (True):
    Generate U from U(0,1); W ← bU
    If W < 1:
        Y ← W^{1/β}; Generate V from U(0,1)
        If V ≤ e^{−Y}: Return X = Y/λ
    Else:
        Y ← −ln[(b − W)/β]
        Generate V from U(0,1)
        If V ≤ Y^{β−1}: Return X = Y/λ
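A direct transcription of GAM1 as a sketch (assuming shape β < 1):

```python
import math
import random

def gamma_small_shape(beta, lam):
    """GAM1: returns X ~ Gamma(beta, lam) for shape beta < 1."""
    b = (math.e + beta) / math.e
    while True:
        w = b * random.random()
        v = random.random()
        if w < 1.0:
            y = w ** (1.0 / beta)
            if v <= math.exp(-y):
                return y / lam
        else:
            y = -math.log((b - w) / beta)
            if v <= y ** (beta - 1.0):
                return y / lam
```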
If β ≥ 1, the value of c for the following A-R algorithm decreases from 4/e ≈ 1.47 to √(4/π) ≈ 1.13 as β increases from 1 to ∞.

Algorithm GAM2:
a ← (2β − 1)^{−1/2}; b ← β − ln(4); c ← β + a^{−1}; d ← 1 + ln(4.5)
While (True):
    Generate U₁, U₂ from U(0,1)
    V ← a·ln[U₁/(1 − U₁)]
    Y ← β·e^V; Z ← U₁²U₂
    W ← b + cV − Y
    If W + d − 4.5Z ≥ 0: Return X = Y/λ
    Else if W ≥ ln(Z): Return X = Y/λ
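And a transcription of GAM2 (shape β ≥ 1); the first test is the quick-accept "squeeze," the second the full A-R test:

```python
import math
import random

def gamma_large_shape(beta, lam):
    """GAM2: returns X ~ Gamma(beta, lam) for shape beta >= 1."""
    a = (2.0 * beta - 1.0) ** -0.5
    b = beta - math.log(4.0)
    c = beta + 1.0 / a
    d = 1.0 + math.log(4.5)
    while True:
        u1, u2 = random.random(), random.random()
        v = a * math.log(u1 / (1.0 - u1))
        y = beta * math.exp(v)
        z = u1 * u1 * u2
        w = b + c * v - y
        if w + d - 4.5 * z >= 0.0:        # quick acceptance (squeeze)
            return y / lam
        if w >= math.log(z):              # full acceptance test
            return y / lam
```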
The Pois(λ) distribution,
P(X = n) = e^{−λ} λⁿ/n!, n = 0, 1, ...,
can be generated by exploiting its Poisson process connection: X is the number of Exp(λ) interarrivals that fit into one time unit, i.e., the largest n with Σᵢ₌₁ⁿ [−(1/λ) ln(Uᵢ)] ≤ 1, which is equivalent to ∏ᵢ₌₁ⁿ Uᵢ ≥ e^{−λ}.
Algorithm POIS1:
a ← e^{−λ}; p ← 1; X ← −1
Until p < a:
    Generate U from U(0,1)
    p ← pU; X ← X + 1
Return X
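POIS1 as a sketch:

```python
import math
import random

def pois1(lam):
    """Multiply uniforms until the product drops below e^{-lam}."""
    a = math.exp(-lam)
    p, x = 1.0, -1
    while p >= a:                 # i.e., until p < a
        p *= random.random()
        x += 1
    return x

x = pois1(2.0)
```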
In a worked example, the running product first drops below e^{−λ} on the fourth uniform. Thus, we take X = 3. □

If λ is large, an alternative is the normal approximation
(X − λ)/√λ ≈ Nor(0,1).
Composition Method
Idea: Suppose a RV actually comes from two RV's (sort of on top of each other). E.g., your plane can leave the airport gate late for two reasons, air traffic delays and maintenance delays, which compose the overall delay time.
In general, suppose the goal c.d.f. has the decomposition F(x) = Σⱼ pⱼFⱼ(x), where the pⱼ's are positive and sum to one: generate a discrete index J = j with probability pⱼ, and then generate X from F_J. For instance, if
F(x) = (1/2)F₁(x) + (1/2)F₂(x),
we generate from F₁(x) half the time, and from F₂(x) half the time.
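A sketch of composition for a finite mixture; the 50/50 Exp mixture at the end is a hypothetical example:

```python
import math
import random

def composition(probs, inverses):
    """Pick component j with probability p_j, then invert F_j."""
    u, cum = random.random(), 0.0
    for p, f_inv in zip(probs, inverses):
        cum += p
        if u <= cum:
            return f_inv(random.random())
    return inverses[-1](random.random())     # guard against rounding

x = composition([0.5, 0.5],
                [lambda u: -math.log(1.0 - u),          # Exp(1)
                 lambda u: -math.log(1.0 - u) / 2.0])   # Exp(2)
```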
Special-Case Techniques
We can generate a χ²(2) RV directly: since χ²(2) ∼ Exp(1/2), we may simply take −2 ln(U). □
Recall the Box-Muller method: if U₁, U₂ are i.i.d. U(0,1), then
Z₁ = √(−2 ln U₁) cos(2πU₂) and Z₂ = √(−2 ln U₁) sin(2πU₂)
are i.i.d. Nor(0,1). But
Z₂/Z₁ = [√(−2 ln U₁) sin(2πU₂)] / [√(−2 ln U₁) cos(2πU₂)] = tan(2πU₂).
Thus, we've just proven that Nor(0,1)/Nor(0,1) ∼ Cauchy ∼ t(1). Similarly, (Z₂/Z₁)² = tan²(2πU₂) ∼ F(1, 1).
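A sketch of Box-Muller itself, plus the Cauchy by-product:

```python
import math
import random

def box_muller():
    """Two i.i.d. Nor(0,1)'s from two i.i.d. uniforms."""
    u1, u2 = random.random(), random.random()
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

z1, z2 = box_muller()
cauchy = z2 / z1       # = tan(2*pi*U2) ~ Cauchy, per the identity above
```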
Order Statistics

Suppose X₁, ..., Xₙ are i.i.d. with c.d.f. F(x), and let Y = min{X₁, ..., Xₙ}. Then P(Y > y) = [1 − F(y)]ⁿ, so Y's c.d.f. is available in closed form and Y can often be generated directly via inverse transform; e.g., the minimum of n i.i.d. Exp(λ)'s is itself Exp(nλ). A similar trick works for the maximum.
Other Quickies

χ²(n) distribution: If Z₁, Z₂, ..., Zₙ are i.i.d. Nor(0,1), then Σᵢ₌₁ⁿ Zᵢ² ∼ χ²(n).
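A sketch using the box_muller function from above:

```python
def chi_squared(n):
    """chi^2(n) as the sum of n squared Nor(0,1)'s."""
    zs = []
    while len(zs) < n:
        zs.extend(box_muller())      # two Nor(0,1)'s per call
    return sum(z * z for z in zs[:n])
```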
Multivariate Normal Distribution

The random vector (X, Y) has the bivariate normal distribution with means μ_X = E[X] and μ_Y = E[Y], variances σ²_X = Var(X) and σ²_Y = Var(Y), and covariance σ_XY = Cov(X, Y).
More generally, X = (X₁, ..., Xₖ) has the multivariate normal distribution with mean vector μ and covariance matrix Σ if it has p.d.f.
f(x) = (2π)^{−k/2}|Σ|^{−1/2} exp{−(x − μ)ᵀΣ⁻¹(x − μ)/2}, x ∈ ℝᵏ.
To generate X ∼ Norₖ(μ, Σ), first compute the lower triangular Cholesky factor C, with Σ = CCᵀ.

Algorithm LTM:
For i = 1, ..., k:
    For j = 1, ..., i − 1:
        c_ij ← (σ_ij − Σ_{ℓ=1}^{j−1} c_iℓ c_jℓ) / c_jj
        c_ji ← 0
    c_ii ← (σ_ii − Σ_{ℓ=1}^{i−1} c²_iℓ)^{1/2}
Then:
1. Generate i.i.d. Nor(0,1) RV's Z₁, ..., Zₖ.
2. Set Xᵢ ← μᵢ + Σ_{j=1}^{i} c_ij Zⱼ, for i = 1, ..., k.
3. Return X = (X₁, X₂, ..., Xₖ).
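A sketch of LTM plus the generation step; random.gauss supplies the Nor(0,1)'s:

```python
import math
import random

def ltm(sigma):
    """Lower triangular C with sigma = C C^T (Algorithm LTM)."""
    k = len(sigma)
    c = [[0.0] * k for _ in range(k)]
    for i in range(k):
        for j in range(i):
            s = sum(c[i][l] * c[j][l] for l in range(j))
            c[i][j] = (sigma[i][j] - s) / c[j][j]
        c[i][i] = math.sqrt(sigma[i][i] - sum(c[i][l] ** 2 for l in range(i)))
    return c

def multivariate_normal(mu, sigma):
    """X = mu + C Z, with Z a vector of i.i.d. Nor(0,1)'s."""
    k = len(mu)
    c = ltm(sigma)
    z = [random.gauss(0.0, 1.0) for _ in range(k)]
    return [mu[i] + sum(c[i][j] * z[j] for j in range(i + 1)) for i in range(k)]

x = multivariate_normal([0.0, 0.0], [[1.0, 0.5], [0.5, 1.0]])
```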
Generating Stochastic Processes
Markov Chains

Example: Think of daily weather as a Markov chain, e.g., on Monday it's sunny, on Tuesday and Wednesday it's rainy, etc. Given the matrix of state-to-state transition probabilities, we can generate a sample path one day at a time, using the current state's row of the matrix as the discrete distribution of the next state (see the sketch below).
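A sketch with a hypothetical two-state (sunny/rainy) transition matrix, since the slide's actual matrix isn't shown:

```python
import random

P = [[0.7, 0.3],      # hypothetical rows: P(next | sunny), P(next | rainy)
     [0.4, 0.6]]

def markov_path(p, state, n):
    """n steps of a Markov chain; each step inverts the current row's c.d.f."""
    path = [state]
    for _ in range(n):
        u, cum = random.random(), 0.0
        for j, pij in enumerate(p[state]):
            cum += pij
            if u <= cum:
                state = j
                break
        path.append(state)
    return path

week = markov_path(P, state=0, n=6)     # e.g., Monday through Sunday
```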
Nonhomogeneous Poisson Process (NHPP)

Let
λ(t) = rate (intensity) function at time t,
N(t) = number of arrivals during [0, t].
Then
N(b) − N(a) ∼ Poisson(∫_a^b λ(t) dt).
Example: Suppose that the arrival pattern to the Waffle House over a certain time period is a NHPP with λ(t) = t². Find the probability that there will be exactly 4 arrivals between times t = 1 and 2.

Since ∫₁² t² dt = 7/3, we have N(2) − N(1) ∼ Pois(7/3). Thus,
P(N(2) − N(1) = 4) = e^{−7/3}(7/3)⁴/4! = 0.120. □
Incorrect NHPP Algorithm [it can "skip" intervals with large λ(t)]:
T₀ ← 0; i ← 0
Repeat
    i ← i + 1
    Generate U from U(0,1)
    Tᵢ ← Tᵢ₋₁ − [1/λ(Tᵢ₋₁)] ln(U)

Don't use this algorithm! The arrival rate λ(Tᵢ) doesn't keep pace with changes in λ(t) that might occur between the current arrival at time Tᵢ and the next arrival at time Tᵢ₊₁.
Instead, we can use thinning: generate potential arrivals at a constant rate λ* ≥ max_t λ(t), and keep a potential arrival at time t with probability λ(t)/λ*. Let Tᵢ denote the ith arrival that we actually keep. E.g., if we reject the first potential arrival but keep the second, then T₁ ← t₂.
Thinning Algorithm:
T₀ ← 0; i ← 0
Repeat
    t ← Tᵢ
    Repeat
        Generate U, V from U(0,1)
        t ← t − (1/λ*) ln(U)
    until V ≤ λ(t)/λ*
    i ← i + 1
    Tᵢ ← t
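A sketch of thinning over a finite horizon, applied to the Waffle House rate λ(t) = t² with majorizing rate λ* = 4 on [0, 2]:

```python
import math
import random

def thinning(rate, rate_max, horizon):
    """NHPP arrivals on [0, horizon]: thin a rate-rate_max Poisson process."""
    arrivals, t = [], 0.0
    while True:
        t -= math.log(random.random()) / rate_max   # next potential arrival
        if t > horizon:
            return arrivals
        if random.random() <= rate(t) / rate_max:   # keep w.p. lambda(t)/lambda*
            arrivals.append(t)

times = thinning(lambda t: t * t, rate_max=4.0, horizon=2.0)
```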
(Figure: the t-values of the blue dots represent actual arrival times; the y-values are placed to illustrate the values of λ(t) at those points.)
Remark: If the rate function is instead specified as a piecewise constant arrival-rate schedule, there's an easier approach. That's because such a schedule gives λ(t) as a step function that only changes occasionally, say, every hour. In this case, it's possible to take clever advantage of the exponential's memoryless property to avoid the use of thinning.
MA(1) Process

Consider the first-order moving average process
Yᵢ = εᵢ + θεᵢ₋₁, for i = 1, 2, ...,
where θ is a constant and the εᵢ's are i.i.d. Nor(0,1) RV's that are independent of Y₀. Generation is easy: produce the εᵢ's one at a time, saving each one for the next step.
AR(1) Process

Consider the first-order autoregressive process
Yᵢ = φYᵢ₋₁ + εᵢ, for i = 1, 2, ...,
where −1 < φ < 1, Y₀ ∼ Nor(0,1), and the εᵢ's are i.i.d. Nor(0, 1 − φ²) RV's that are independent of Y₀.
As defined, the Yᵢ's are all Nor(0,1), but (similar to the MA(1)) they aren't independent. Generation sketches for both processes follow.
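```python
import math
import random

def ar1_path(phi, n):
    """AR(1): Y_i = phi*Y_{i-1} + eps_i, eps_i ~ Nor(0, 1 - phi^2)."""
    y, sd = random.gauss(0.0, 1.0), math.sqrt(1.0 - phi * phi)  # Y_0 ~ Nor(0,1)
    path = [y]
    for _ in range(n):
        y = phi * y + random.gauss(0.0, sd)
        path.append(y)
    return path

def ma1_path(theta, n):
    """MA(1): Y_i = eps_i + theta*eps_{i-1}, eps_i ~ Nor(0,1)."""
    eps_prev, path = random.gauss(0.0, 1.0), []
    for _ in range(n):
        eps = random.gauss(0.0, 1.0)
        path.append(eps + theta * eps_prev)
        eps_prev = eps
    return path
```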
ARMA(p, q) Process

Both models generalize to the ARMA(p, q) process,
Yᵢ = Σ_{j=1}^{p} φⱼYᵢ₋ⱼ + εᵢ + Σ_{j=1}^{q} θⱼεᵢ₋ⱼ, for i = 1, 2, ...,
where the φⱼ's, θⱼ's, and Var(εᵢ), as well as the initial RVs Y₀, Y₋₁, ..., Y₁₋ₚ, are chosen so as to assure that the process doesn't explode.
EAR(1) Process

The EAR(1) (exponential autoregressive) process has the same covariance structure as the AR(1), except that 0 ≤ φ < 1; that is, Cov(Yᵢ, Yᵢ₊ₖ) = φ^{|k|} for all k = 0, ±1, ±2, ....
Autoregressive Pareto (ARP)

Now let's see how to generate a series of correlated Pareto RV's. First of all, a RV X has the Pareto distribution with parameters λ > 0 and β > 0 if it has c.d.f.
F_X(x) = 1 − (λ/x)^β, x ≥ λ.
Its moments are
E[X] = βλ/(β − 1) for β > 1, and
Var(X) = βλ²/[(β − 1)²(β − 2)] for β > 2.
In order to obtain the ARP process, let's start off with a regular AR(1) with normal noise,
Yᵢ = ρYᵢ₋₁ + εᵢ, for i = 1, 2, ...,
with Y₀ ∼ Nor(0,1) and εᵢ's i.i.d. Nor(0, 1 − ρ²), so that the Yᵢ's are all Nor(0,1).

Feed this process into the Nor(0,1) c.d.f. Φ(·) to obtain correlated Unif(0,1) RV's, Uᵢ = Φ(Yᵢ), i = 1, 2, ....

Now feed the correlated Uᵢ's into the inverse of the Pareto c.d.f. to obtain correlated Pareto RV's:
Xᵢ = F_X^{−1}(Uᵢ) = F_X^{−1}(Φ(Yᵢ)) = λ/[1 − Φ(Yᵢ)]^{1/β}, i = 1, 2, ....
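The whole ARP pipeline as a sketch; statistics.NormalDist supplies Φ(·):

```python
import math
import random
from statistics import NormalDist

def arp_path(rho, lam, beta, n):
    """Correlated Paretos: AR(1) -> U_i = Phi(Y_i) -> Pareto inverse c.d.f."""
    phi = NormalDist().cdf
    y, sd = random.gauss(0.0, 1.0), math.sqrt(1.0 - rho * rho)
    xs = []
    for _ in range(n):
        y = rho * y + random.gauss(0.0, sd)
        xs.append(lam / (1.0 - phi(y)) ** (1.0 / beta))
    return xs

xs = arp_path(rho=0.8, lam=1.0, beta=3.0, n=100)
```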
M/M/1 Queue

Let Iᵢ₊₁ denote the interarrival time between the ith and (i + 1)st customers; let Sᵢ be the ith customer's service time; and let W_i^Q denote the ith customer's wait before service.

Lindley gives a very nice way to generate a series of waiting times for this simple example:
W_{i+1}^Q = max{W_i^Q + Sᵢ − Iᵢ₊₁, 0}.
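A sketch of Lindley's recursion for an M/M/1 with arrival rate λ and service rate μ (parameter values illustrative):

```python
import math
import random

def mm1_waits(lam, mu, n):
    """W_{i+1} = max(W_i + S_i - I_{i+1}, 0); the first customer doesn't wait."""
    w, waits = 0.0, [0.0]
    for _ in range(n - 1):
        s = -math.log(random.random()) / mu      # S_i ~ Exp(mu)
        i = -math.log(random.random()) / lam     # I_{i+1} ~ Exp(lam)
        w = max(w + s - i, 0.0)
        waits.append(w)
    return waits

waits = mm1_waits(lam=0.8, mu=1.0, n=1000)
```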
Brownian Motion

The stochastic process {W(t), t ≥ 0} is standard Brownian motion if W(0) = 0, W(t) ∼ Nor(0, t), and the process has stationary and independent increments.

To construct BM, let Y₁, Y₂, ... be i.i.d. RV's with mean 0 and variance 1, and set W_n(t) ≡ (1/√n) Σ_{i=1}^{⌊nt⌋} Yᵢ. Donsker's central limit theorem says that W_n(·) converges to standard Brownian motion as n → ∞.
Remark: The regular CLT is just the case t = 1, so that W_n(1) converges to W(1) ∼ Nor(0,1).

One choice that works well is to take Yᵢ = ±1, each with probability 1/2. Take n at least 100, t = 1/n, 2/n, ..., n/n, and calculate W(1/n), W(2/n), ..., W(n/n).

Exercise: Let's construct some BM! First, pick some "large" value of n and start with W(0) = 0. Then
W(i/n) = W((i − 1)/n) + Yᵢ/√n, i = 1, 2, ....
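A sketch of the random-walk construction:

```python
import random

def bm_path(n):
    """W(i/n) = W((i-1)/n) + Y_i/sqrt(n), with Y_i = +/-1 equally likely."""
    w, path, scale = 0.0, [0.0], n ** -0.5
    for _ in range(n):
        w += scale if random.random() < 0.5 else -scale
        path.append(w)
    return path                    # path[i] approximates W(i/n)

path = bm_path(1000)
```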
Exercise: Let's estimate the value E[C] of a stock option. Pick your favorite values of r, σ, T, and k, and off you go! (A Monte Carlo sketch follows.)

But there are other ways: you can just simulate the distribution of S(T) directly (it's lognormal), or you can actually look up the exact "Black–Scholes" answer.
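A Monte Carlo sketch under assumptions this excerpt doesn't spell out: a European call with payoff C = e^{−rT} max(S(T) − k, 0) and S(T) = S(0) e^{(r − σ²/2)T + σW(T)}, with W(T) ∼ Nor(0, T):

```python
import math
import random

def option_value_mc(s0, r, sigma, T, k, n=100_000):
    """Average discounted call payoffs over simulated terminal prices."""
    total = 0.0
    disc = math.exp(-r * T)
    for _ in range(n):
        w_t = random.gauss(0.0, math.sqrt(T))                     # W(T)
        s_t = s0 * math.exp((r - 0.5 * sigma ** 2) * T + sigma * w_t)
        total += disc * max(s_t - k, 0.0)
    return total / n

est = option_value_mc(s0=100.0, r=0.05, sigma=0.2, T=1.0, k=100.0)
```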