Lec9 Post Draft
Spring 2022
Erik Hintz
Institute of Statistics and Actuarial Science
[email protected]
Lecture 9
Today’s Agenda
Last time:
Chebyshev Inequality
Moment generating functions
After today's lecture, we will have finished Chapter 2. Please review
Section 2.11 (Calculus Review).
Expectation
Recall that expectation is linear: for constants a, b and functions g, h,

E(a g(X) + b h(Y)) = a E(g(X)) + b E(h(Y)),

provided E(g(X)) and E(h(Y)) exist.
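For instance, taking g(x) = x and h ≡ 1 gives the familiar special case E(aX + b) = a E(X) + b.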
Easy bounds for probabilities: Markov/Chebyshev Style
Inequalities
Theorem: Let X be a random variable and k, c > 0 constants. Then

P(|X| ≥ c) ≤ E(|X|^k) / c^k.
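The case k = 1 is Markov's inequality, P(|X| ≥ c) ≤ E(|X|)/c, which is all that is needed for the next example.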
Example
A post office handles, on average, 10,000 letters per day.
What can be said about the probability that it will handle at least 15,000
letters tomorrow?
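A worked sketch using the theorem above with k = 1: let X ≥ 0 be the number of letters handled tomorrow, so E(X) = 10,000 and

P(X ≥ 15,000) = P(|X| ≥ 15,000) ≤ E(|X|)/15,000 = 10,000/15,000 = 2/3.

So the probability is at most 2/3; the inequality only provides an upper bound.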
Chebyshev’s inequality
Theorem: Suppose X has mean µ = E(X) and finite variance σ^2 = Var(X). Then for any k > 0,

P(|X − µ| ≥ kσ) ≤ 1/k^2.
Proof: Exercise.
Remark:
If Var(X) exists, we call √Var(X) the standard deviation of X.
By Chebyshev's inequality, the probability that X is more than k
standard deviations away from its mean is at most 1/k^2.
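A hint for the exercise (one possible route): apply the previous theorem to |X − µ| with exponent 2 and c = kσ, so that

P(|X − µ| ≥ kσ) ≤ E((X − µ)^2)/(kσ)^2 = σ^2/(k^2 σ^2) = 1/k^2.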
Variance and other moments
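A brief recall of the standard definitions this slide presumably covers (an assumption based on its title): for a random variable X with µ = E(X), the k-th moment of X is E(X^k) and the k-th central moment is E((X − µ)^k); in particular, the variance is Var(X) = E((X − µ)^2) = E(X^2) − µ^2.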
Moment Generating Function (mgf)
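The standard definition (presumably what this slide states): the moment generating function of a random variable X is

M_X(t) = E(e^{tX}),

and we say the mgf exists if M_X(t) is finite for all t in some interval (−h, h) with h > 0.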
Example
Example 2.26 (2.10.2 (a) of the course notes): Find the moment
generating function of X ∼ Γ(α, β). Make sure you specify the domain
on which the moment generating function is defined.
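A worked sketch, assuming the scale parametrization f(x) = x^{α−1} e^{−x/β} / (Γ(α) β^α) for x > 0 (the course notes may use the rate parametrization instead, in which case β and 1/β swap roles):

M_X(t) = E(e^{tX}) = ∫_0^∞ e^{tx} x^{α−1} e^{−x/β} / (Γ(α) β^α) dx = (Γ(α) β^α)^{−1} ∫_0^∞ x^{α−1} e^{−x(1/β − t)} dx.

For t < 1/β the remaining integral equals Γ(α)(1/β − t)^{−α}, so

M_X(t) = (1 − βt)^{−α}, t < 1/β.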
Example
Example 2.27: Find the moment generating function of X ∼ Poi (λ).
Make sure you specify the domain on which the moment generating
function is defined.
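A worked sketch: with pmf f(x) = e^{−λ} λ^x / x! for x = 0, 1, 2, ...,

M_X(t) = E(e^{tX}) = ∑_{x=0}^∞ e^{tx} e^{−λ} λ^x / x! = e^{−λ} ∑_{x=0}^∞ (λe^t)^x / x! = e^{−λ} e^{λe^t} = exp(λ(e^t − 1)),

which is finite for every t ∈ R.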
Properties of the mgf
a) M_X(0) = 1.
b) Suppose the derivatives M_X^{(k)}(t), k = 1, 2, ..., exist for t ∈ (−h, h)
for some h > 0. Then the Maclaurin series of M_X(t) is

M_X(t) = ∑_{k=0}^∞ (M_X^{(k)}(0) / k!) t^k,

and in particular

E(X^k) = M_X^{(k)}(0) = d^k M_X(t)/dt^k |_{t=0}.
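For instance, using the Poisson mgf from Example 2.27: M_X'(t) = λe^t exp(λ(e^t − 1)), so E(X) = M_X'(0) = λ.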
Example
Example 2.28: A discrete random variable X has the pmf
f(x) = (1/2)^{x+1} · 1_{{0,1,2,...}}(x).
Derive the mgf of X and use it to calculate its mean and variance.
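A worked sketch: for e^t/2 < 1, i.e. t < ln 2,

M_X(t) = ∑_{x=0}^∞ e^{tx} (1/2)^{x+1} = (1/2) ∑_{x=0}^∞ (e^t/2)^x = (1/2) · 1/(1 − e^t/2) = 1/(2 − e^t).

Differentiating, M_X'(t) = e^t/(2 − e^t)^2 and M_X''(t) = e^t/(2 − e^t)^2 + 2e^{2t}/(2 − e^t)^3, so

E(X) = M_X'(0) = 1, E(X^2) = M_X''(0) = 1 + 2 = 3, Var(X) = 3 − 1^2 = 2.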
mgf of a linear transformation
Theorem: Suppose the random variable X has mgf M_X(t) defined for
t ∈ (−h, h) for some h > 0. Let Y = aX + b where a, b ∈ R and a ≠ 0.
Then the mgf of Y is

M_Y(t) = e^{bt} M_X(at), |t| < h/|a|.
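A one-line sketch of why this holds: M_Y(t) = E(e^{t(aX+b)}) = e^{bt} E(e^{(at)X}) = e^{bt} M_X(at), which is finite whenever |at| < h, i.e. |t| < h/|a|.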
Uniqueness of the mgf
Theorem: Suppose X and Y have mgfs M_X(t) and M_Y(t) that exist and agree for all t ∈ (−h, h) for some h > 0. Then X and Y have the same distribution, i.e.

P(X ≤ s) = F_X(s) = F_Y(s) = P(Y ≤ s) ∀ s ∈ R.
Takeaway:
If X and Y both have an mgf, then to show that X and Y have the same
distribution, all you need to do is check that M_X(t) = M_Y(t) on a
neighbourhood of 0.
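For example, if a random variable X has mgf M_X(t) = exp(3(e^t − 1)) for all t ∈ R, then by Example 2.27 and the uniqueness theorem, X ∼ Poi(3).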
Example