
STAT 330: Mathematical Statistics

Spring 2022

Erik Hintz
Institute of Statistics and Actuarial Science
[email protected]

Lecture 9

Today’s Agenda

Last time:

Expected value and properties


Markov Inequality

Today (Lec 9, 05/20):

Chebyshev Inequality
Moment generating functions
After today’s lecture, we have finished Chapter 2. Please review 2.11
(Calculus Review)

Expectation

Definition: Let X be a random variable with pdf/pmf f, support A, and let h : ℝ → ℝ be a function. Then the expected value of h(X) is defined by

E(h(X)) = ∑_{x ∈ A} h(x) f(x)      if X is discrete,
E(h(X)) = ∫_A h(x) f(x) dx         if X is continuous,

provided the expression converges absolutely, i.e., if E(|h(X)|) < ∞. Otherwise we say that E(h(X)) does not exist.

Suppose X, Y are random variables, a, b ∈ ℝ are constants, and h, g are real-valued functions. Then

E(ag(X) + bh(Y)) = aE(g(X)) + bE(h(Y)),

provided the expectations exist.

Easy bounds for probabilities: Markov/Chebyshev Style
Inequalities
Theorem: Let X be a random variable and k, c > 0 constants. Then

P(|X| ≥ c) ≤ E(|X|^k) / c^k.
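
One way to see this: for k > 0, |X| ≥ c holds if and only if |X|^k ≥ c^k, so applying last lecture's Markov inequality to the nonnegative random variable |X|^k gives

P(|X| ≥ c) = P(|X|^k ≥ c^k) ≤ E(|X|^k) / c^k.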

Example

A post office handles, on average, 10,000 letters per day. What can be said about the probability that it handles at least 15,000 letters tomorrow?
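
A sketch of the standard solution: let X ≥ 0 denote tomorrow's letter count, so E(X) = 10,000, and apply the bound above with k = 1:

P(X ≥ 15,000) ≤ E(X) / 15,000 = 10,000 / 15,000 = 2/3.

So, assuming nothing about the distribution beyond its mean, the probability is at most 2/3.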

Chebyshev’s inequality

Theorem: Suppose X is a random variable with finite µ = E(X) and σ² = Var(X). Then, for any k > 0,

P(|X − µ| ≥ kσ) ≤ 1/k².
Proof: Exercise.
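
A sketch: applying the Markov-style bound to X − µ with exponent 2 and constant c = kσ,

P(|X − µ| ≥ kσ) ≤ E(|X − µ|²) / (kσ)² = σ² / (k²σ²) = 1/k².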

Remark:

If Var(X) exists, we call √Var(X) the standard deviation of X.
By Chebyshev's inequality, the probability that X is more than k standard deviations away from its mean is at most 1/k².
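
For example, with k = 2 the bound says P(|X − µ| ≥ 2σ) ≤ 1/4, whatever the distribution of X. The bound is universal but often crude: for a normal random variable, the true value of this probability is about 0.046.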

Variance and other moments

We call Var(X) = E((X − µ)²) the variance of X, where µ = E(X), provided the expression exists.
One can show Var(X) = E(X²) − (E(X))² (expanded below), which implies that Var(X) exists when E(X²) exists.
Please read 2.7.6 about special expectations.
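
The expansion is one line: with µ = E(X), linearity gives

Var(X) = E(X² − 2µX + µ²) = E(X²) − 2µE(X) + µ² = E(X²) − (E(X))².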

Moment Generating Function (mgf)

The moment generating function is a particular expectation which, if it exists, uniquely defines the distribution of a random variable.

Definition: If X is a random variable, then M_X(t) = E(e^{tX}) is called the moment generating function (m.g.f.) of X, provided this expectation exists for all t ∈ (−h, h) for some h > 0.

The mgf is a function of t.


When determining the mgf of a random variable, the values of t for
which the mgf exists must always be stated.

Example
Example 2.26 (2.10.2 (a) of the course notes): Find the moment
generating function of X ∼ Γ(α, β). Make sure you specify the domain
on which the moment generating function is defined.
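
A sketch of the computation, assuming the parametrization in which X ∼ Γ(α, β) has pdf f(x) = x^{α−1} e^{−x/β} / (Γ(α) β^α) for x > 0 (β a scale parameter):

M_X(t) = ∫_0^∞ e^{tx} x^{α−1} e^{−x/β} / (Γ(α) β^α) dx = (1 / (Γ(α) β^α)) ∫_0^∞ x^{α−1} e^{−(1/β − t)x} dx.

The integral converges exactly when 1/β − t > 0 and is a Gamma integral equal to Γ(α) (1/β − t)^{−α}, so

M_X(t) = (1 − βt)^{−α},   t < 1/β.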

Example
Example 2.27: Find the moment generating function of X ∼ Poi (λ).
Make sure you specify the domain on which the moment generating
function is defined.
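
A sketch: with pmf f(x) = e^{−λ} λ^x / x! for x = 0, 1, 2, . . . , the exponential series gives

M_X(t) = ∑_{x=0}^∞ e^{tx} e^{−λ} λ^x / x! = e^{−λ} ∑_{x=0}^∞ (λe^t)^x / x! = e^{−λ} e^{λe^t} = e^{λ(e^t − 1)},

and since the exponential series converges everywhere, the mgf is defined for all t ∈ ℝ.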

Properties of the mgf
a) M_X(0) = 1.

b) Suppose the derivatives M_X^{(k)}(t), k = 1, 2, . . . , exist for t ∈ (−h, h) for some h > 0. Then the Maclaurin series of M_X(t) is

M_X(t) = ∑_{k=0}^∞ (M_X^{(k)}(0) / k!) t^k.

c) If the mgf exists, then

E(X^k) = (d^k M_X(t) / dt^k) |_{t=0} = M_X^{(k)}(0).

d) Putting (b) and (c) together, we have

M_X(t) = ∑_{k=0}^∞ (E(X^k) / k!) t^k.
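
As a quick illustration of (c): differentiating the Poisson mgf from Example 2.27 once gives M_X'(t) = λe^t e^{λ(e^t − 1)}, so E(X) = M_X'(0) = λ, recovering the Poisson mean without any series manipulation.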

Example
Example 2.28: A discrete random variable X has the pmf
f(x) = (1/2)^{x+1} · 1_{{0,1,2,...}}(x).
Derive the mgf of X and use it to calculate its mean and variance.
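
A sketch of the computation: by the geometric series (valid when e^t/2 < 1, i.e. t < ln 2),

M_X(t) = ∑_{x=0}^∞ e^{tx} (1/2)^{x+1} = (1/2) ∑_{x=0}^∞ (e^t/2)^x = (1/2) / (1 − e^t/2) = 1 / (2 − e^t),   t < ln 2.

Differentiating,

M_X'(t) = e^t / (2 − e^t)²,   M_X''(t) = e^t (2 + e^t) / (2 − e^t)³,

so E(X) = M_X'(0) = 1, E(X²) = M_X''(0) = 3, and Var(X) = 3 − 1² = 2.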

mgf of a linear transformation
Theorem: Suppose the random variable X has mgf M_X(t) defined for t ∈ (−h, h) for some h > 0. Let Y = aX + b where a, b ∈ ℝ and a ≠ 0. Then the mgf of Y is

M_Y(t) = e^{bt} M_X(at),   |t| < h/|a|.
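
The proof is essentially one line:

M_Y(t) = E(e^{t(aX + b)}) = e^{bt} E(e^{(at)X}) = e^{bt} M_X(at),

which is defined whenever |at| < h, i.e. |t| < h/|a|.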

Uniqueness of the mgf

Theorem: Suppose the random variable X has m.g.f. M_X(t) and the random variable Y has m.g.f. M_Y(t). Suppose also that M_X(t) = M_Y(t) for all t ∈ (−h, h) for some h > 0. Then X and Y have the same distribution, that is,

P(X ≤ s) = F_X(s) = F_Y(s) = P(Y ≤ s)   ∀ s ∈ ℝ.

Takeaway:
If X and Y both have an mgf, then to show that they have the same distribution, all you need to do is check that M_X(t) = M_Y(t).
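
For example, if some random variable W is known to have mgf M_W(t) = e^{λ(e^t − 1)} for all t ∈ ℝ, then by uniqueness and Example 2.27, W ∼ Poi(λ); no computation with the pmf or cdf is required.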

Example

Example 2.29: Consider X ∼ U(θ1, θ2). Find the mgf of the random variable Y = 5X + 3.
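
A sketch, assuming U(θ1, θ2) has pdf f(x) = 1/(θ2 − θ1) on (θ1, θ2): for t ≠ 0,

M_X(t) = ∫_{θ1}^{θ2} e^{tx} / (θ2 − θ1) dx = (e^{θ2 t} − e^{θ1 t}) / (t(θ2 − θ1)),

and M_X(0) = 1, so the mgf exists for all t ∈ ℝ. By the linear transformation theorem with a = 5 and b = 3,

M_Y(t) = e^{3t} M_X(5t) = e^{3t} (e^{5θ2 t} − e^{5θ1 t}) / (5t(θ2 − θ1)) for t ≠ 0, with M_Y(0) = 1, again for all t ∈ ℝ.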

