Lecture Notes 4
The variance of a discrete random variable X with range R and pmf f(x) is, provided the series converges absolutely,

σ² = Var(X) = Σ_{x∈R} (x − µ)² f(x) = E[(X − µ)²]
            = Σ_{x∈R} x² f(x) − µ² = E(X²) − [E(X)]² = E(X²) − µ².
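The two formulas always agree; here is a quick Python sanity check on a small made-up three-point pmf (chosen purely for illustration, not from the notes):

```python
from fractions import Fraction as F

# hypothetical pmf on R = {0, 1, 2}, used only to illustrate the identity
f = {0: F(1, 4), 1: F(1, 2), 2: F(1, 4)}

mu = sum(x * p for x, p in f.items())                      # E(X) = 1
var_def = sum((x - mu) ** 2 * p for x, p in f.items())     # E[(X - mu)^2]
var_short = sum(x * x * p for x, p in f.items()) - mu**2   # E(X^2) - mu^2
print(var_def, var_short)  # both 1/2
```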
x      0     2     4     6     8     10
f(x)   0.17  0.21  0.18  0.11  0.16  0.17
(a) Calculating the expected value. The expected value (mean) number of seizures is given by

µ = E(X) = Σ_x x f(x) = 0(0.17) + 2(0.21) + 4(0.18) + 6(0.11) + 8(0.16) + 10(0.17) =

[1] 4.78
σ² = Var[X] = E[(X − µ)²] = Σ_x (x − µ)² f(x)
            = (0 − 4.78)²(0.17) + (2 − 4.78)²(0.21) + ⋯ + (10 − 4.78)²(0.17) ≈

[1] 12.0716
The standard deviation is then σ = √12.0716 ≈ (circle one) (i) 3.47 (ii) 4.11 (iii) 5.07 (iv) 6.25.
SDX <- sqrt(VarX); SDX # standard deviation
[1] 3.474421
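For readers who want to double-check the R results above by hand, a minimal Python sketch of the same mean, variance, and standard deviation computation:

```python
import math

# seizure pmf from the table above
x = [0, 2, 4, 6, 8, 10]
f = [0.17, 0.21, 0.18, 0.11, 0.16, 0.17]

mu = sum(xi * fi for xi, fi in zip(x, f))                # 4.78
var = sum((xi - mu) ** 2 * fi for xi, fi in zip(x, f))   # 12.0716
sd = math.sqrt(var)                                      # about 3.4744
```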
Figure: (a) seizure distribution, (b) another distribution, (c) another distribution.
2. Variance and standard deviation: rolling a pair of dice. If the dice are fair, the distribution of X, the sum of the two dice, is
x      2     3     4     5     6     7     8     9     10    11    12
f(x)   1/36  2/36  3/36  4/36  5/36  6/36  5/36  4/36  3/36  2/36  1/36
(b) If g(x) = x²,

E(X²) = Σ_x x² f(x) = 2²(1/36) + 3²(2/36) + ⋯ + 12²(1/36) ≈
(c) Variance.

σ² = Var[X] = E[(X − µ)²] = E(X²) − µ² = 54.83 − 7² ≈

[1] 5.833333

and the standard deviation is σ = √5.8333 ≈

[1] 2.415229
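Because the sample space is small, these dice results can also be verified by brute-force enumeration; a Python sketch:

```python
from itertools import product

# all 36 equally likely outcomes of rolling a pair of fair dice
sums = [a + b for a, b in product(range(1, 7), repeat=2)]

mu = sum(sums) / 36                   # E(X) = 7.0
ex2 = sum(s * s for s in sums) / 36   # E(X^2), about 54.83
var = ex2 - mu ** 2                   # about 5.8333
```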
x      1    2    3    4    5    6
f(x)   1/6  1/6  1/6  1/6  1/6  1/6

k = E(X) = 1(1/6) + 2(1/6) + ⋯ + 6(1/6) =

[1] 3.5
4. Another die question. A fair six-sided die is labelled in one of three ways: there are two sides labelled 1, three sides labelled 2 and one side labelled 3. If it costs $1 to play and you win $1 × (result from die), what is the expected value of this game?

die result    1          2          3
x, payoff     1 − 1 = 0  2 − 1 = 1  3 − 1 = 2
f(x)          2/6        3/6        1/6

E(X) = 0(2/6) + 1(3/6) + 2(1/6) = 5/6
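Exact fractions make this game calculation easy to verify; a Python sketch using the payoff table above:

```python
from fractions import Fraction as F

# payoff -> probability: two sides pay 0, three pay 1, one pays 2 (after the $1 fee)
payoff = {0: F(2, 6), 1: F(3, 6), 2: F(1, 6)}

ev = sum(x * p for x, p in payoff.items())
print(ev)  # 5/6, i.e. about $0.83 per play
```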
x      0      1      2      3      4
f(x)   0.627  0.310  0.058  0.005  0.000

[1] 0.6274224

σ² = Var[X] = E[(X − µ)²] = Σ_x (x − µ)² f(x)
            = (0 − 0.44)²(0.627) + (1 − 0.44)²(0.310) + ⋯ + (4 − 0.44)²(0.000) ≈
x      0      1
f(x)   1 − p  p
7. Poisson: accidents. An average of λ = 3 accidents per year occurs along the I–95
stretch of highway between Michigan City, Indiana, and St. Joseph, Michigan.
1. Functions of a random variable: seizures. The pmf for the number of seizures, X, of a typical epileptic person in any given year is given in the following table.

x      0     2     4     6     8     10
f(x)   0.17  0.21  0.18  0.11  0.16  0.17

[1] 4.78
64 Chapter 2. Discrete Random Variables (LECTURE NOTES 4)
(b) If the medical cost for each seizure is $200, in other words, u(x) = 200x, the probability distribution for u(X) is:

x             0           2             4     6     8     10
u(x) = 200x   200(0) = 0  200(2) = 400  800   1200  1600  2000
p(u(x))       0.17        0.21          0.18  0.11  0.16  0.17

The expected value (mean) cost of seizures is then given by

E[u(X)] = E[200X] = Σ_x (200x) f(x) = [0](0.17) + [400](0.21) + ⋯ + [2000](0.17) =

[1] 956
[1] 956
(c) If the medical cost for each seizure is given by function u(x) = x²,

x           0       2     4     6     8     10
u(x) = x²   0² = 0  4     16    36    64    100
p(u(x))     0.17    0.21  0.18  0.11  0.16  0.17

The expected value (mean) cost of seizures in this case is given by

E[u(X)] = E[X²] = Σ_x x² f(x) = [0](0.17) + [4](0.21) + ⋯ + [100](0.17) =

[1] 34.92
E[u(X)] = E[200X² + X − 5]
        = E(200X²) + E(X) − E(5)
        = 200E(X²) + E(X) − 5
        = 200(34.92) + 4.78 − 5 =

(i) 4320.67 (ii) 5780.11 (iii) 6983.78 (iv) 8480.99.
Section 6. Functions of a Random Variable (LECTURE NOTES 4) 65
[1] 6983.78
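The linearity steps above can be cross-checked by applying u(x) = 200x² + x − 5 outcome by outcome; a Python sketch:

```python
# seizure pmf from the table above
x = [0, 2, 4, 6, 8, 10]
f = [0.17, 0.21, 0.18, 0.11, 0.16, 0.17]

# left side: E[u(X)] computed directly from the pmf
lhs = sum((200 * xi**2 + xi - 5) * fi for xi, fi in zip(x, f))

# right side: linearity of expectation, 200 E(X^2) + E(X) - 5
ex = sum(xi * fi for xi, fi in zip(x, f))        # 4.78
ex2 = sum(xi**2 * fi for xi, fi in zip(x, f))    # 34.92
rhs = 200 * ex2 + ex - 5                         # 6983.78
```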
2. More functions of a random variable: flipping until a head comes up. A (weighted) coin has a probability of p = 0.7 of coming up heads (and so a probability of 1 − p = 0.3 of coming up tails). This coin is flipped until a head comes up or until a total of 4 flips are made. Let X be the number of flips. Recall,

x      1      2      3      4
f(x)   0.700  0.210  0.063  0.027
(a) If u(x) = x,

µ = E(X) = Σ_x x f(x) = 1(0.700) + 2(0.210) + 3(0.063) + 4(0.027) =

[1] 1.417
(b) If u(x) = 1/x,

E[1/X] = Σ_x (1/x) f(x) = (1/1)(0.7) + (1/2)(0.21) + (1/3)(0.063) + (1/4)(0.027) =

[1] 0.83275
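A worthwhile caution here: E(1/X) is not 1/E(X), because expectation applies u to each outcome before averaging. A Python sketch makes the difference visible:

```python
# flipping-until-heads pmf from the table above
x = [1, 2, 3, 4]
f = [0.700, 0.210, 0.063, 0.027]

e_recip = sum(fi / xi for xi, fi in zip(x, f))   # E(1/X) = 0.83275
e_x = sum(xi * fi for xi, fi in zip(x, f))       # E(X) = 1.417
# note 1/E(X) is about 0.706, well away from E(1/X)
```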
(c) If u(x) = 200/x + 1/(200x) + 5,

E[u(X)] = E[200/X + 1/(200X) + 5]
        = (200 + 1/200) E[1/X] + E[5]
        = (200 + 1/200)(0.83275) + 5 =

(i) 43.20 (ii) 57.80 (iii) 109.35 (iv) 171.55.
[1] 171.5542
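Again the direct computation and the linearity shortcut agree; a Python sketch:

```python
# flipping-until-heads pmf from the table above
x = [1, 2, 3, 4]
f = [0.700, 0.210, 0.063, 0.027]

# direct computation of E[200/X + 1/(200 X) + 5]
direct = sum((200 / xi + 1 / (200 * xi) + 5) * fi for xi, fi in zip(x, f))

# via linearity: (200 + 1/200) E[1/X] + 5
e_recip = sum(fi / xi for xi, fi in zip(x, f))   # 0.83275
via_linearity = (200 + 1 / 200) * e_recip + 5    # about 171.5542
```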
[1] 2.5
(b) If u(x) = x²,

E[X²] = 1²·(1/4) + 2²·(1/4) + 3²·(1/4) + 4²·(1/4) = 30/4 =

(i) 6.75 (ii) 7.00 (iii) 7.25 (iv) 7.50
EX2 <- sum(x^2*px); EX2 # E(X^2)
[1] 7.5
and also
Var(X) = E(X²) − µ² = 7.5 − 2.5² =
(i) 1.25 (ii) 1.50 (iii) 1.75 (iv) 2.50
VarX <- EX2 - EX^2; VarX # variance
[1] 1.25
[1] 5
and also
Var[2X] = 2² Var(X) = 4(1.25) =
(i) 5 (ii) 6 (iii) 7 (iv) 8
VarU <- 2^2*VarX; VarU # Var(U)
[1] 5
4. Random variable X has mean µ_X = µ and variance σ²_X = σ². If u(x) = 3x + 4, then

µ_U = E[3X + 4] = 3E(X) + 4 =
(i) µ + 4 (ii) 2µ + 4 (iii) 3µ + 4 (iv) 4µ + 4
and also
σ²_U = Var[3X + 4] = 3² Var(X) + 0 =
(i) 8σ 2 + 4 (ii) 8σ 2 (iii) 9σ 2 (iv) 10σ 2
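These linear-transformation rules (the shift drops out of the variance, the scale enters squared) can be confirmed on any pmf; a Python sketch on a made-up three-point distribution:

```python
from fractions import Fraction as F

# hypothetical pmf, chosen only to illustrate the rules
f = {0: F(1, 2), 3: F(1, 3), 6: F(1, 6)}

mu = sum(x * p for x, p in f.items())                    # 2
var = sum((x - mu) ** 2 * p for x, p in f.items())       # 5

# transform each outcome by u(x) = 3x + 4, then recompute from scratch
g = {3 * x + 4: p for x, p in f.items()}
mu_u = sum(x * p for x, p in g.items())
var_u = sum((x - mu_u) ** 2 * p for x, p in g.items())

assert mu_u == 3 * mu + 4    # mean shifts and scales
assert var_u == 9 * var      # variance picks up 3^2, ignores the +4
```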
5. Consider random variable X where E[X + 2] = 4 and E[X² + 4X] = 3; then

E[X + 2] = E(X) + 2 = 4,

so µ = E(X) = (i) 1 (ii) 2 (iii) 3 (iv) 4
Since, for the Poisson accidents question with λ = 3,

µ = E[X] = λ =

(i) 1 (ii) 2 (iii) 3 (iv) 4

and, because a Poisson distribution has Var(X) = λ, E(X²) = Var(X) + µ² = 3 + 3² = 12, so

E[X² − 3X] = E(X²) − 3E(X) = 12 − 3(3) =

(i) 3 (ii) 4 (iii) 5 (iv) 6
Furthermore, if random variable X has an mgf M(t) that exists for all t in an open interval containing 0, then
• M(t) uniquely determines the distribution of X,
• M′(0) = E(X), M″(0) = E(X²).
Also, if Y = aX + b, then M_Y(t) = e^(bt) M_X(at).
DISCRETE            f(x)                       M(t)                    µ      σ²
Binomial            C(n,x) p^x q^(n−x)         (pe^t + q)^n            np     npq
Poisson             e^(−λ) λ^x / x!            e^(λ(e^t − 1))          λ      λ
Geometric           q^(x−1) p                  pe^t / (1 − qe^t)       1/p    q/p²
Negative Binomial   C(x−1,r−1) p^r q^(x−r)     [pe^t / (1 − qe^t)]^r   r/p    rq/p²
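A numeric spot-check of one table row is reassuring; here the Poisson row with λ = 3 (the value from the accidents example): the closed-form mgf should match its defining series E[e^(tX)], and a finite-difference derivative of M at 0 should recover µ = λ. A Python sketch:

```python
import math

lam = 3.0  # accidents per year, as in the Poisson example

def M(t):
    # closed-form Poisson mgf from the table: e^{lam (e^t - 1)}
    return math.exp(lam * (math.exp(t) - 1.0))

def M_series(t, terms=100):
    # defining series E[e^{tX}], truncated (the tail is negligible for lam = 3)
    return sum(math.exp(t * x) * math.exp(-lam) * lam ** x / math.factorial(x)
               for x in range(terms))

# the closed form and the series agree at an arbitrary test point
assert math.isclose(M(0.4), M_series(0.4), rel_tol=1e-9)

# central-difference derivative at 0 approximates M'(0) = E(X) = lam
h = 1e-6
mean_approx = (M(h) - M(-h)) / (2 * h)
```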
(i) (1/3)e^t + (1/2)e^(2t) + (1/6)e^(3t)  (ii) (1/2)e^t + (1/3)e^(2t) + (1/6)e^(3t)  (iii) e^(t/2) + 2e^(t/3) + 3e^(t/6)
Section 6. The Moment-Generating Function (LECTURE NOTES 4) 69
(b) pmf B

x      1    2    3
f(x)   1/2  1/3  1/6

(c) pmf C

x      1/2  1/3  1/6
f(x)   1    2    3
3. Expected value using mgf. What is the expected value of

M(t) = (1/2)e^t + (1/3)e^(2t) + (1/6)e^(3t)?

On the one hand, since M(t) is equivalent to

x      1    2    3
f(x)   1/2  1/3  1/6

E(X) = Σ_x x f(x) = 1·(1/2) + 2·(1/3) + 3·(1/6) =

(i) 3/3 (ii) 4/3 (iii) 5/3 (iv) 6/3
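The mgf route gives the same answer, since M′(0) = E(X); a quick numeric finite-difference sketch:

```python
import math

def M(t):
    # mgf from the problem: (1/2)e^t + (1/3)e^(2t) + (1/6)e^(3t)
    return 0.5 * math.exp(t) + math.exp(2 * t) / 3 + math.exp(3 * t) / 6

# central difference at 0 approximates M'(0) = E(X) = 5/3
h = 1e-6
d1 = (M(h) - M(-h)) / (2 * h)
```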
x      1    2    3
f(x)   1/2  1/3  1/6

Var(X) = Σ_x (x − µ)² f(x) = (1 − 5/3)²·(1/2) + (2 − 5/3)²·(1/3) + (3 − 5/3)²·(1/6) =

(i) 3/9 (ii) 4/9 (iii) 5/9 (iv) 6/9
so

Var(X) = E(X²) − µ² = 10/3 − (5/3)² =

(i) 3/9 (ii) 4/9 (iii) 5/9 (iv) 6/9
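Both variance routes can be checked exactly with fractions; a Python sketch:

```python
from fractions import Fraction as F

# pmf from the problem
f = {1: F(1, 2), 2: F(1, 3), 3: F(1, 6)}

mu = sum(x * p for x, p in f.items())         # 5/3
ex2 = sum(x * x * p for x, p in f.items())    # 10/3 (equivalently M''(0))
var = ex2 - mu**2
print(var)  # 5/9
```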
5. Binomial mgf. With some effort, it can be shown that the mgf for the binomial is

M(t) = E[e^(tX)] = Σ_{x=0}^{n} e^(tx) C(n,x) p^x q^(n−x) = (pe^t + q)^n.
6. Identify binomial pmf with mgf. What is the pmf of random variable X with

M(t) = (0.3e^t + 0.7)^11?

Since

(pe^t + q)^n = (0.3e^t + 0.7)^11,

where p = 0.3, q = 0.7 and n = 11, this is a binomial distribution b(n, p) =
(i) b(11, 0.3) (ii) b(0.3, 11) (iii) b(11, 0.7) (iv) b(0.7, 11).
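This identification can be verified numerically: summing e^(tx) against the b(11, 0.3) pmf reproduces the closed-form mgf at any t (that is exactly the binomial theorem). A Python sketch:

```python
import math

p, q, n = 0.3, 0.7, 11
t = 0.25  # arbitrary test point

# closed-form mgf from the table
closed = (p * math.exp(t) + q) ** n

# defining sum E[e^{tX}] over the b(11, 0.3) pmf
series = sum(math.comb(n, x) * p**x * q**(n - x) * math.exp(t * x)
             for x in range(n + 1))
```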
M(t) = 0.3e^t / (1 − 0.7e^t)?

Since, from the table above,

pe^t / (1 − qe^t) = 0.3e^t / (1 − 0.7e^t),

this is a geometric distribution where p = 0.3 and q = 0.7.