Moment Generating Function
Example 1
Suppose $Y$ has a Poisson distribution with probability function $p(y) = \dfrac{\lambda^y e^{-\lambda}}{y!}$ for $y = 0, 1, 2, \ldots$, where $\lambda$ represents the rate at which something happens. Find the moment generating function and use it to find the mean and variance of $Y$.
Solution:
First find
$$m_Y(t) = E\left(e^{tY}\right) = \sum_{y=0}^{\infty} e^{ty}\, \frac{\lambda^y e^{-\lambda}}{y!} = e^{-\lambda} \sum_{y=0}^{\infty} \frac{\left(\lambda e^t\right)^y}{y!} = e^{-\lambda} e^{\lambda e^t},$$
since $\sum_{y=0}^{\infty} \dfrac{\left(\lambda e^t\right)^y}{y!}$ is the power series for $e^{\lambda e^t}$. Thus
$$m_Y(t) = e^{\lambda\left(e^t - 1\right)}.$$
Find derivatives of $m_Y(t)$ to use for computing $\mu$ and $\sigma^2$.
$$m_Y'(t) = \lambda e^t\, e^{\lambda\left(e^t - 1\right)}$$
$$m_Y''(t) = \lambda e^t\, e^{\lambda\left(e^t - 1\right)} + \left(\lambda e^t\right)^2 e^{\lambda\left(e^t - 1\right)}$$
$$E(Y) = m'(0) = \lambda e^0\, e^{\lambda\left(e^0 - 1\right)} = \lambda$$
$$E\left(Y^2\right) = m''(0) = \lambda + \lambda^2$$
$$V(Y) = E\left(Y^2\right) - \left[E(Y)\right]^2 = \lambda + \lambda^2 - \lambda^2 = \lambda$$
The Poisson distribution has mean $\lambda$ and variance $\lambda$.
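These computations are easy to verify symbolically. Below is a minimal Python sketch using the sympy library (the symbol names are our own, not part of the original notes):

```python
import sympy as sp

t, lam = sp.symbols('t lam', positive=True)
m = sp.exp(lam * (sp.exp(t) - 1))        # Poisson mgf m_Y(t)

mean = sp.diff(m, t).subs(t, 0)          # E(Y) = m'(0)
second = sp.diff(m, t, 2).subs(t, 0)     # E(Y^2) = m''(0)
var = sp.simplify(second - mean**2)      # V(Y) = E(Y^2) - [E(Y)]^2

print(mean, var)                         # both reduce to lam
```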
Example 2
Suppose $Y$ has a geometric distribution with parameter $p$. Show that the moment generating function for $Y$ is
$$m_Y(t) = \frac{p e^t}{1 - q e^t},$$
where $q = 1 - p$.
Solution:
The probability function is $p(y) = q^{y-1} p$ for $y = 1, 2, \ldots$, so
$$m_Y(t) = E\left(e^{tY}\right) = \sum_{y=1}^{\infty} e^{ty}\, q^{y-1} p = \frac{p}{q} \sum_{y=1}^{\infty} \left(q e^t\right)^y = \frac{p}{q} \cdot \frac{q e^t}{1 - q e^t} = \frac{p e^t}{1 - q e^t}.$$
Note: the sum is a geometric series with common ratio $q e^t$, which is less than 1 if $t < -\ln q$.
Now that we know the moment generating function, it is a simple matter to find the mean
and variance of the geometric distribution.
For the mean we have
$$m_Y'(t) = \frac{\left(1 - q e^t\right) p e^t - p e^t\left(-q e^t\right)}{\left(1 - q e^t\right)^2} = \frac{p e^t}{\left(1 - q e^t\right)^2},$$
so
$$m'(0) = \frac{p e^0}{\left(1 - q e^0\right)^2} = \frac{p}{(1 - q)^2} = \frac{p}{p^2} = \frac{1}{p}.$$
For the variance we have
$$m_Y''(t) = \frac{\left(1 - q e^t\right)^2 p e^t - p e^t \cdot 2\left(1 - q e^t\right)\left(-q e^t\right)}{\left(1 - q e^t\right)^4} = \frac{p e^t\left(1 + q e^t\right)}{\left(1 - q e^t\right)^3}$$
and
$$m''(0) = \frac{p e^0\left(1 + q e^0\right)}{\left(1 - q e^0\right)^3} = \frac{p(1 + q)}{p^3} = \frac{1 + q}{p^2},$$
so
$$\operatorname{Var}(Y) = m''(0) - \left[m'(0)\right]^2 = \frac{1 + q}{p^2} - \frac{1}{p^2} = \frac{q}{p^2}.$$
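The same kind of symbolic check works here; a short sympy sketch of our own:

```python
import sympy as sp

t, p = sp.symbols('t p', positive=True)
q = 1 - p
m = p * sp.exp(t) / (1 - q * sp.exp(t))    # geometric mgf m_Y(t)

mean = sp.simplify(sp.diff(m, t).subs(t, 0))              # 1/p
var = sp.simplify(sp.diff(m, t, 2).subs(t, 0) - mean**2)  # (1 - p)/p**2, i.e. q/p^2
print(mean, var)
```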
Example 3
Find the moment generating function for a random variable with a standard normal distribution. That is, find the moment generating function for $Z \sim N(0, 1)$. Note that $Z \sim N(0, 1)$ is read: $Z$ is distributed as a normal random variable with $\mu = 0$ and $\sigma^2 = 1$.
Solution:
$$m_Z(t) = E\left(e^{tZ}\right) = \int_{-\infty}^{\infty} e^{tz}\, \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\left(z^2 - 2tz + t^2 - t^2\right)/2}\, dz$$
$$= e^{t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-(z - t)^2/2}\, dz = e^{t^2/2} \int_{-\infty}^{\infty} \left(\text{normal density with } \mu = t,\ \sigma^2 = 1\right) dz = e^{t^2/2} \cdot 1 = e^{t^2/2}$$
It is straightforward to verify that the mean and variance are 0 and 1, respectively.
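sympy can also evaluate the defining integral directly; a minimal sketch (our own check, relying on sympy's ability to resolve Gaussian integrals for real $t$):

```python
import sympy as sp

z, t = sp.symbols('z t', real=True)
phi = sp.exp(-z**2 / 2) / sp.sqrt(2 * sp.pi)          # N(0,1) density
mgf = sp.integrate(sp.exp(t * z) * phi, (z, -sp.oo, sp.oo))
print(sp.simplify(mgf))                               # exp(t**2/2)
```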
Example 4
Show that the moment generating function for a random variable $Y \sim N\left(\mu, \sigma^2\right)$ is
$$m_Y(t) = e^{\mu t + \sigma^2 t^2/2}.$$
Use the moment generating function to show that $E(Y) = \mu$.
Solution:
Following the outline of Example 3, we have
$$m_Y(t) = E\left(e^{tY}\right) = \int_{-\infty}^{\infty} e^{ty}\, \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-(y - \mu)^2/\left(2\sigma^2\right)}\, dy = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2\sigma^2}\left[(y - \mu)^2 - 2\sigma^2 ty\right]}\, dy$$
Consider the exponent:
$$-\frac{1}{2\sigma^2}\left[(y - \mu)^2 - 2\sigma^2 t y\right] = -\frac{1}{2\sigma^2}\left[\left(y - \left(\mu + \sigma^2 t\right)\right)^2 - 2\mu\sigma^2 t - \sigma^4 t^2\right]$$
So
$$m_Y(t) = e^{\mu t + \sigma^2 t^2/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2\sigma^2}\left[y - \left(\mu + \sigma^2 t\right)\right]^2}\, dy$$
(the integrand is a normal density function with mean $\mu + \sigma^2 t$ and variance $\sigma^2$)
$$= e^{\mu t + \sigma^2 t^2/2} \cdot 1 = e^{\mu t + \sigma^2 t^2/2}.$$
Now to find the expected value of Y :
$$m_Y'(t) = e^{\mu t + \sigma^2 t^2/2}\left(\mu + \sigma^2 t\right)$$
$$m'(0) = e^{0}(\mu + 0) = \mu.$$
The variance of Y is found by computing:
$$m_Y''(t) = e^{\mu t + \sigma^2 t^2/2}\,\sigma^2 + e^{\mu t + \sigma^2 t^2/2}\left(\mu + \sigma^2 t\right)^2$$
$$m''(0) = e^{0}\,\sigma^2 + e^{0}\,\mu^2 = \sigma^2 + \mu^2.$$
Then
$$\operatorname{Var}(Y) = m''(0) - \left[m'(0)\right]^2 = \sigma^2 + \mu^2 - \mu^2 = \sigma^2.$$
A more direct derivation can also be given:
Writing $Y = \mu + \sigma Z$ with $Z \sim N(0, 1)$,
$$m_Y(t) = E\left(e^{tY}\right) = E\left(e^{t(\mu + \sigma Z)}\right) = e^{\mu t}\, E\left(e^{(\sigma t)Z}\right) = e^{\mu t}\, m_Z(\sigma t) = e^{\mu t}\, e^{\sigma^2 t^2/2} = e^{\mu t + \sigma^2 t^2/2}.$$
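A numerical spot-check of this formula by simulation (a sketch; the parameter values and seed are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(0)
mu, sigma, t = 1.5, 2.0, 0.3                  # arbitrary test values
y = rng.normal(mu, sigma, size=1_000_000)

empirical = np.exp(t * y).mean()              # Monte Carlo estimate of E(e^{tY})
theoretical = np.exp(mu * t + sigma**2 * t**2 / 2)
print(empirical, theoretical)                 # agree up to simulation error
```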
Example 5
Find the moment generating function for a random variable with a gamma distribution.
Solution:
$$m_Y(t) = E\left(e^{tY}\right) = \int_0^{\infty} e^{ty}\, \frac{1}{\Gamma(\alpha)\beta^{\alpha}}\, y^{\alpha - 1} e^{-y/\beta}\, dy = \frac{1}{\Gamma(\alpha)\beta^{\alpha}} \int_0^{\infty} y^{\alpha - 1} e^{-y\left(\frac{1}{\beta} - t\right)}\, dy$$
Recall that $\frac{1}{\beta} - t = \frac{1 - \beta t}{\beta}$, so the integral is $\int_0^{\infty} y^{\alpha - 1} e^{-y/\left(\frac{\beta}{1 - \beta t}\right)}\, dy$. Since we have previously shown that $\int_0^{\infty} y^{\alpha - 1} e^{-y/\beta}\, dy = \Gamma(\alpha)\beta^{\alpha}$, we have
$$m_Y(t) = \frac{1}{\Gamma(\alpha)\beta^{\alpha}}\, \Gamma(\alpha)\left(\frac{\beta}{1 - \beta t}\right)^{\alpha} = \frac{1}{(1 - \beta t)^{\alpha}}, \quad \text{if } t < \frac{1}{\beta}.$$
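The result can be spot-checked symbolically for particular parameter values satisfying $t < 1/\beta$; a small sympy sketch (the sample values are our own):

```python
import sympy as sp

y = sp.symbols('y', positive=True)
alpha, beta, t = 3, sp.Rational(1, 2), sp.Rational(1, 4)   # sample values, t < 1/beta

density = y**(alpha - 1) * sp.exp(-y / beta) / (sp.gamma(alpha) * beta**alpha)
mgf = sp.integrate(sp.exp(t * y) * density, (y, 0, sp.oo))

print(mgf, (1 - beta * t) ** (-alpha))   # both equal (1 - 1/8)^(-3) = 512/343
```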
Recall that the chi-square distribution is a special case of the gamma distribution with $\beta = 2$ and $\alpha = \nu/2$. So it follows from Example 5 that if $Y \sim \chi^2(\nu)$, then
$$m_Y(t) = \left(\frac{1}{1 - 2t}\right)^{\nu/2}.$$
From this moment generating function we can find the mean and variance of the chi-square distribution with $\nu$ degrees of freedom.
We have
$$m_Y'(t) = \frac{\nu}{2}\,(1 - 2t)^{-\left(\frac{\nu}{2} + 1\right)}(2) = \nu\,(1 - 2t)^{-\left(\frac{\nu}{2} + 1\right)},$$
so $E(Y) = m'(0) = \nu$. Differentiating again,
$$m_Y''(t) = \nu\left(\frac{\nu}{2} + 1\right)(2)\,(1 - 2t)^{-\left(\frac{\nu}{2} + 2\right)} = \nu(\nu + 2)\,(1 - 2t)^{-\left(\frac{\nu}{2} + 2\right)},$$
so $m''(0) = \nu(\nu + 2) = \nu^2 + 2\nu$ and
$$\operatorname{Var}(Y) = m''(0) - \left[m'(0)\right]^2 = \nu^2 + 2\nu - \nu^2 = 2\nu.$$
The variance is twice the degrees of freedom.
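As before, the derivatives can be checked with sympy (a brief sketch of our own):

```python
import sympy as sp

t, nu = sp.symbols('t nu', positive=True)
m = (1 - 2*t) ** (-nu / 2)                  # chi-square mgf

mean = sp.simplify(sp.diff(m, t).subs(t, 0))               # nu
var = sp.simplify(sp.diff(m, t, 2).subs(t, 0) - mean**2)   # 2*nu
print(mean, var)
```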
Theorem: Consider $Y_1, Y_2, \ldots, Y_n$ independent random variables. Let $W = Y_1 + Y_2 + \cdots + Y_n$. Then
$$m_W(t) = \prod_{i=1}^{n} m_{Y_i}(t).$$
Proof:
$$m_W(t) = E\left(e^{tW}\right) = E\left(e^{t\sum_{i=1}^{n} Y_i}\right) = E\left(\prod_{i=1}^{n} e^{tY_i}\right) \quad \text{by laws of exponents}$$
$$= \prod_{i=1}^{n} E\left(e^{tY_i}\right) \quad \text{since the } Y_i \text{ are independent}$$
$$= \prod_{i=1}^{n} m_{Y_i}(t).$$
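A quick numerical illustration of this theorem for two independent variables (a sketch; the distributions, evaluation point, and seed are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(1)
t = 0.25
y1 = rng.exponential(scale=1.0, size=500_000)   # two independent variables
y2 = rng.poisson(lam=3.0, size=500_000)

lhs = np.exp(t * (y1 + y2)).mean()                     # mgf of the sum at t
rhs = np.exp(t * y1).mean() * np.exp(t * y2).mean()    # product of the mgfs
print(lhs, rhs)                                        # nearly equal
```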
Example 6
If $Y$ has a binomial distribution with parameters $(n, p)$, show that the moment generating function for $Y$ is $m_Y(t) = \left(p e^t + q\right)^n$, where $q = 1 - p$. Then use the moment generating function to find $E(Y)$ and $V(Y)$.
Solution:
Write $Y = \sum_{i=1}^{n} X_i$, where the $X_i$ are independent Bernoulli random variables with success probability $p$. Then
$$m_{X_i}(t) = E\left(e^{tX_i}\right) = e^{t \cdot 1}\, p + e^{t \cdot 0}\, q = p e^t + q,$$
so
$$m_Y(t) = m_{X_1}(t)\, m_{X_2}(t) \cdots m_{X_n}(t) = \left(p e^t + q\right)^n.$$
Differentiating,
$$m_Y'(t) = n\left(p e^t + q\right)^{n-1} p e^t, \qquad E(Y) = m'(0) = n(p + q)^{n-1} p = np,$$
$$m_Y''(t) = n(n - 1)\left(p e^t + q\right)^{n-2}\left(p e^t\right)^2 + n\left(p e^t + q\right)^{n-1} p e^t,$$
$$E\left(Y^2\right) = m''(0) = n(n - 1)(p + q)^{n-2} p^2 + n(p + q)^{n-1} p = n(n - 1)p^2 + np.$$
Therefore
$$V(Y) = E\left(Y^2\right) - \left[E(Y)\right]^2 = n(n - 1)p^2 + np - n^2 p^2 = np - np^2 = np(1 - p).$$
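A sympy check of these calculations (our own sketch):

```python
import sympy as sp

t, p, n = sp.symbols('t p n', positive=True)
m = (p * sp.exp(t) + 1 - p) ** n                  # binomial mgf with q = 1 - p

mean = sp.simplify(sp.diff(m, t).subs(t, 0))      # n*p
var = sp.simplify(sp.diff(m, t, 2).subs(t, 0) - mean**2)
print(mean, sp.factor(var))                       # n*p, and a factored form of n*p*(1 - p)
```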
The Uniqueness Theorem is very important in using moment generating functions to find
the probability distribution of a function of random variables.
Uniqueness Theorem: Suppose that random variables $X$ and $Y$ have moment generating functions given by $m_X(t)$ and $m_Y(t)$, respectively. If $m_X(t) = m_Y(t)$ for all values of $t$, then $X$ and $Y$ have the same probability distribution.
The proof of the Uniqueness Theorem is quite difficult and is not given here.
Example 7
Find the distribution of the random variable $Y$ for each of the following moment generating functions:

(a) $m(t) = \left(\dfrac{1}{1 - 2t}\right)^{8}$

(b) $m(t) = \left(\tfrac{1}{3}\, e^t + \tfrac{2}{3}\right)^{5}$

(c) $m(t) = \dfrac{e^t}{2 - e^t}$

(d) $m(t) = e^{2\left(e^t - 1\right)}$
Solution:
(a) Since $m(t)$ can be written in the form $(1 - 2t)^{-16/2}$, $Y$ must be a chi-square random variable with 16 degrees of freedom.

(b) Since $m(t)$ has the binomial form $\left(p e^t + q\right)^n$, $Y$ must be a binomial random variable with $n = 5$ and $p = 1/3$.

(c) Since $m(t)$ can be written in the form $\dfrac{\frac{1}{2}\, e^t}{1 - \frac{1}{2}\, e^t}$, $Y$ must be a geometric random variable with $p = 1/2$.
(d) Since $m(t)$ can be written in the form $e^{\lambda\left(e^t - 1\right)}$, $Y$ must be a Poisson random variable with $\lambda = 2$.
Theorem: If $W = aY + b$, then $m_W(t) = e^{tb}\, m_Y(at)$.
Proof:
$$m_W(t) = E\left(e^{tW}\right) = E\left(e^{t(aY + b)}\right) = e^{bt}\, E\left(e^{(at)Y}\right) = e^{bt}\, m_Y(at).$$
Theorem: If $W = aY + b$, then $E(W) = aE(Y) + b$ and $V(W) = a^2 V(Y)$.
Proof: Since $m_W(t) = e^{bt}\, m_Y(at)$,
$$m_W'(t) = a e^{bt}\, m_Y'(at) + b e^{bt}\, m_Y(at)$$
$$m_W'(0) = a\, m_Y'(0) + b\, m_Y(0) = aE(Y) + b\, E\left(e^{0 \cdot Y}\right) = aE(Y) + b(1) = aE(Y) + b.$$
Differentiating again,
$$m_W''(t) = a^2 e^{bt}\, m_Y''(at) + 2ab\, e^{bt}\, m_Y'(at) + b^2 e^{bt}\, m_Y(at)$$
$$m_W''(0) = a^2\, m_Y''(0) + 2ab\, m_Y'(0) + b^2\, m_Y(0) = a^2 E\left(Y^2\right) + 2ab\, E(Y) + b^2.$$
So
$$V(W) = E\left(W^2\right) - \left[E(W)\right]^2 = a^2 E\left(Y^2\right) + 2ab\, E(Y) + b^2 - \left[aE(Y) + b\right]^2$$
$$= a^2 E\left(Y^2\right) + 2ab\, E(Y) + b^2 - a^2\left[E(Y)\right]^2 - 2ab\, E(Y) - b^2 = a^2\left(E\left(Y^2\right) - \left[E(Y)\right]^2\right) = a^2 V(Y).$$
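The theorem can be illustrated symbolically for a concrete mgf; here is a sketch using the normal mgf from Example 4 (our own verification):

```python
import sympy as sp

t, a, b = sp.symbols('t a b', real=True)
mu, sigma = sp.symbols('mu sigma', positive=True)

mY = sp.exp(mu * t + sigma**2 * t**2 / 2)   # mgf of Y ~ N(mu, sigma^2)
mW = sp.exp(b * t) * mY.subs(t, a * t)      # m_W(t) = e^{bt} m_Y(at)

EW = sp.simplify(sp.diff(mW, t).subs(t, 0))               # a*mu + b
VW = sp.simplify(sp.diff(mW, t, 2).subs(t, 0) - EW**2)    # a**2*sigma**2
print(EW, VW)
```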
Of course, this theorem can be proven without resorting to moment generating functions,
but we present the proof to show that mgf's give the familiar results.
Theorem: If $Z \sim N(0, 1)$ and $Y = \mu + \sigma Z$, then $Y \sim N\left(\mu, \sigma^2\right)$.
Proof: We know that $m_Z(t) = e^{t^2/2}$. Applying the theorem proved above,
$$m_Y(t) = e^{t\mu}\, m_Z(\sigma t) = e^{t\mu}\, e^{\sigma^2 t^2/2} = e^{t\mu + \sigma^2 t^2/2},$$
the moment generating function of $N\left(\mu, \sigma^2\right)$. So $Y \sim N\left(\mu, \sigma^2\right)$ by the uniqueness theorem.
Theorem: If $Y_1 \sim \text{Poisson}(\lambda_1)$ and $Y_2 \sim \text{Poisson}(\lambda_2)$ and $Y_1$ and $Y_2$ are independent, then $W = Y_1 + Y_2$ is $\text{Poisson}(\lambda_1 + \lambda_2)$.
Proof: Since $Y_1$ and $Y_2$ are independent,
$$m_W(t) = m_{Y_1}(t)\, m_{Y_2}(t) = e^{\lambda_1\left(e^t - 1\right)}\, e^{\lambda_2\left(e^t - 1\right)} = e^{\left(\lambda_1 + \lambda_2\right)\left(e^t - 1\right)},$$
the moment generating function of $\text{Poisson}(\lambda_1 + \lambda_2)$. So $W \sim \text{Poisson}(\lambda_1 + \lambda_2)$ by the uniqueness theorem.
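A simulation illustrating this result, comparing the empirical distribution of $Y_1 + Y_2$ with the $\text{Poisson}(\lambda_1 + \lambda_2)$ probabilities (a sketch; the rates and seed are arbitrary):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
w = rng.poisson(1.5, 100_000) + rng.poisson(2.5, 100_000)   # Y1 + Y2

for k in range(8):
    # empirical relative frequency vs Poisson(4) pmf
    print(k, round(float((w == k).mean()), 4), round(float(stats.poisson.pmf(k, 4.0)), 4))
```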
Theorem: Let $W_1, W_2, \ldots, W_n$ be independent chi-square random variables, each with one degree of freedom. Let $W = \sum_{i=1}^{n} W_i$. Then $W \sim \chi^2_n$.
Proof:
$$m_W(t) = \prod_{i=1}^{n} m_{W_i}(t) = \left[(1 - 2t)^{-1/2}\right]^n = (1 - 2t)^{-n/2},$$
the moment generating function of a chi-square distribution with $n$ degrees of freedom. So $W \sim \chi^2_n$ by the uniqueness theorem.
We have already considered the mean and variance of a chi-square distribution with $n$ degrees of freedom as a special case of the gamma distribution (see the discussion following Example 5). Now we will give a more direct proof using the chi-square moment generating function.
Theorem: If $W \sim \chi^2_n$, then $E(W) = n$ and $V(W) = 2n$.
Proof: $m_W(t) = (1 - 2t)^{-n/2}$, so
$$m_W'(t) = -\frac{n}{2}\,(1 - 2t)^{-\frac{n}{2} - 1}(-2) = n\,(1 - 2t)^{-\frac{n}{2} - 1}$$
$$m_W'(0) = n\,(1 - 0)^{-\frac{n}{2} - 1} = n.$$
So $E(W) = n$.
$$m_W''(t) = n\left(\frac{n}{2} + 1\right)(2)\,(1 - 2t)^{-\frac{n}{2} - 2} = n(n + 2)\,(1 - 2t)^{-\frac{n}{2} - 2}$$
$$m_W''(0) = n(n + 2) = n^2 + 2n.$$
So
$$V(W) = E\left(W^2\right) - \left[E(W)\right]^2 = n^2 + 2n - n^2 = 2n.$$
Now consider $Z \sim N(0, 1)$ and $W = Z^2$. We know the distribution of $Z$, and now we want to show that the distribution of $W$ is chi-square with one degree of freedom. There are at least two ways to do this. The method of density functions is the harder way, and the method of moment generating functions is the easier way. We will show both in the sections that follow.
Method of Density Functions: Let $F_W(w)$ represent the cumulative distribution function of $W$ for $w > 0$.
$$F_W(w) = P\left(Z^2 \le w\right) = P\left(-\sqrt{w} \le Z \le \sqrt{w}\right) = \int_{-\sqrt{w}}^{\sqrt{w}} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz = 2\int_{0}^{\sqrt{w}} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz$$
$$f_W(w) = \frac{d}{dw}\, F_W(w) = \frac{d}{dw}\, 2\int_{0}^{\sqrt{w}} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz$$
Let $u = \sqrt{w}$. Then $\dfrac{du}{dw} = \dfrac{1}{2\sqrt{w}}$ and $F(u) = 2\displaystyle\int_{0}^{u} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz$.
$$\frac{dF}{du} = \frac{d}{du}\, 2\int_{0}^{u} \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz = \frac{2}{\sqrt{2\pi}}\, e^{-u^2/2}$$
By the chain rule,
$$f_W(w) = \frac{dF}{dw} = \frac{dF}{du} \cdot \frac{du}{dw} = \frac{2}{\sqrt{2\pi}}\, e^{-w/2} \cdot \frac{1}{2\sqrt{w}} = \frac{1}{\sqrt{2\pi}}\, w^{-1/2}\, e^{-w/2}.$$
Note that
$$\frac{1}{\sqrt{2\pi}}\, w^{-1/2}\, e^{-w/2} = \frac{1}{\Gamma\left(\frac{1}{2}\right) 2^{1/2}}\, w^{\frac{1}{2} - 1}\, e^{-w/2}, \quad \text{for } w > 0,$$
since $\Gamma\left(\frac{1}{2}\right) = \sqrt{\pi}$.
Note that $f_W(w)$ is the density function for a gamma random variable with $\alpha = \frac{1}{2}$ and $\beta = 2$. Therefore $W$ is a chi-square random variable with one degree of freedom.
Method of Moment Generating Functions:
$$m_W(t) = E\left(e^{tW}\right) = E\left(e^{tZ^2}\right) = \int_{-\infty}^{\infty} e^{tz^2}\, \frac{1}{\sqrt{2\pi}}\, e^{-z^2/2}\, dz = \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}}\, e^{-\frac{z^2}{2}(1 - 2t)}\, dz$$
$$= (1 - 2t)^{-1/2} \int_{-\infty}^{\infty} \frac{1}{\sqrt{2\pi}\,(1 - 2t)^{-1/2}}\, e^{-\frac{z^2}{2(1 - 2t)^{-1}}}\, dz$$
$$= (1 - 2t)^{-1/2} \int_{-\infty}^{\infty} \left(\text{normal pdf with } \mu = 0 \text{ and } \sigma^2 = (1 - 2t)^{-1}\right) dz$$
$$= (1 - 2t)^{-1/2}, \quad \text{for } t < \tfrac{1}{2}.$$
This is the moment generating function of a chi-square random variable with one degree of freedom, so by the uniqueness theorem $W \sim \chi^2_1$.
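A numerical spot-check of $E\left(e^{tZ^2}\right) = (1 - 2t)^{-1/2}$ (a sketch of ours; $t$ is kept well below $1/2$ so the simulation average is stable):

```python
import numpy as np

rng = np.random.default_rng(3)
z = rng.standard_normal(1_000_000)

for t in (0.05, 0.1, 0.2):
    print(t, np.exp(t * z**2).mean(), (1 - 2*t) ** -0.5)   # estimate vs theory
```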
Now we know that if $Z \sim N(0, 1)$, then $Z^2 \sim \chi^2_1$. We have previously shown that the sum of $n$ independent chi-square random variables with one degree of freedom is a chi-square random variable with $n$ degrees of freedom. These results lead to the following:

Given $Y_1, Y_2, \ldots, Y_n$ independent random variables with $Y_i \sim N\left(\mu_i, \sigma_i^2\right)$ and $Z_i = \dfrac{Y_i - \mu_i}{\sigma_i} \sim N(0, 1)$, then $\sum Z_i^2 \sim \chi^2_n$.
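The following simulation illustrates this last result; the means, standard deviations, and seed are arbitrary choices of ours:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
mus = np.array([0.0, 1.0, -2.0, 0.5, 3.0])     # arbitrary mu_i
sigmas = np.array([1.0, 2.0, 0.5, 1.5, 3.0])   # arbitrary sigma_i
n = len(mus)

y = rng.normal(mus, sigmas, size=(200_000, n))
w = (((y - mus) / sigmas) ** 2).sum(axis=1)    # sum of squared z-scores

print(w.mean(), w.var())                       # approximately n and 2n
print(stats.kstest(w, stats.chi2(df=n).cdf))   # consistent with chi-square(n)
```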
The beauty of moment generating functions is that they give many results with relative
ease. Proofs using moment generating functions are often much easier than showing the
same results using density functions (or in some other way).
Multivariate Moment Generating Functions
Following are several results concerning multivariate moment generating functions:
$$m_{U,V}(s, t) = E\left(e^{sU + tV}\right)$$
$$m_{U,V}(0, t) = E\left(e^{tV}\right) = m_V(t)$$
$$m_{U,V}(s, 0) = E\left(e^{sU}\right) = m_U(s)$$
The following theorem will be important for proving results in sections that follow:
Theorem: $U$ and $V$ are independent if and only if $m_{U,V}(s, t) = m_U(s)\, m_V(t)$.
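A numerical illustration of the factorization for an independent pair (a sketch; the distributions and evaluation point are arbitrary choices of ours):

```python
import numpy as np

rng = np.random.default_rng(5)
s, t = 0.2, 0.3
u = rng.standard_normal(1_000_000)               # U and V independent by construction
v = rng.exponential(scale=1.0, size=1_000_000)

joint = np.exp(s * u + t * v).mean()             # estimate of m_{U,V}(s, t)
product = np.exp(s * u).mean() * np.exp(t * v).mean()
print(joint, product)                            # nearly equal, as the theorem requires
```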