Moment-generating function

In probability theory and statistics, the moment-generating function of a real-valued random variable is an
alternative specification of its probability distribution. Thus, it provides the basis of an alternative route to
analytical results compared with working directly with probability density functions or cumulative
distribution functions. There are particularly simple results for the moment-generating functions of
distributions defined by the weighted sums of random variables. However, not all random variables have
moment-generating functions.

As its name implies, the moment-generating function can be used to compute a distribution’s moments: the
nth moment about 0 is the nth derivative of the moment-generating function, evaluated at 0.

In addition to real-valued distributions (univariate distributions), moment-generating functions can be defined for vector- or matrix-valued random variables, and can even be extended to more general cases.

The moment-generating function of a real-valued distribution does not always exist, unlike the
characteristic function. There are relations between the behavior of the moment-generating function of a
distribution and properties of the distribution, such as the existence of moments.

Definition
Let $X$ be a random variable with CDF $F_X$. The moment generating function (mgf) of $X$ (or $F_X$), denoted by $M_X(t)$, is

$$M_X(t) = \operatorname{E}\left[e^{tX}\right],$$

provided this expectation exists for $t$ in some neighborhood of 0. That is, there is an $h > 0$ such that for all $t$ in $-h < t < h$, $\operatorname{E}\left[e^{tX}\right]$ exists. If the expectation does not exist in a neighborhood of 0, we say that the moment generating function does not exist.[1]

In other words, the moment-generating function of X is the expectation of the random variable $e^{tX}$. More generally, when $\mathbf{X} = (X_1, \ldots, X_n)^{\mathrm{T}}$ is an $n$-dimensional random vector and $\mathbf{t}$ is a fixed vector, one uses $\mathbf{t} \cdot \mathbf{X} = \mathbf{t}^{\mathrm{T}}\mathbf{X}$ instead of $tX$:

$$M_{\mathbf{X}}(\mathbf{t}) = \operatorname{E}\left[e^{\mathbf{t}^{\mathrm{T}}\mathbf{X}}\right].$$

$M_X(0)$ always exists and is equal to 1. However, a key problem with moment-generating functions is that moments and the moment-generating function may not exist, as the integrals need not converge absolutely. By contrast, the characteristic function or Fourier transform always exists (because it is the integral of a bounded function on a space of finite measure), and for some purposes may be used instead.
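For instance, the defining expectation can be evaluated symbolically. The following minimal sketch (assuming SymPy is available) recovers the moment-generating function of a standard normal variable, $e^{t^2/2}$, directly from the definition:

```python
# A minimal sketch (assuming SymPy): compute M_X(t) = E[e^{tX}] directly
# from the definition for a standard normal random variable.
import sympy as sp

x, t = sp.symbols('x t', real=True)

# Standard normal density f(x) = exp(-x^2/2) / sqrt(2*pi)
pdf = sp.exp(-x**2 / 2) / sp.sqrt(2 * sp.pi)

# M_X(t) = integral of e^{tx} f(x) dx over the real line
mgf = sp.integrate(sp.exp(t * x) * pdf, (x, -sp.oo, sp.oo))
print(sp.simplify(mgf))  # exp(t**2/2), the known N(0,1) moment-generating function
```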

The moment-generating function is so named because it can be used to find the moments of the distribution.[2] The series expansion of $e^{tX}$ is

$$e^{tX} = 1 + tX + \frac{t^2 X^2}{2!} + \frac{t^3 X^3}{3!} + \cdots.$$

Hence

$$M_X(t) = \operatorname{E}\left[e^{tX}\right] = 1 + t m_1 + \frac{t^2 m_2}{2!} + \frac{t^3 m_3}{3!} + \cdots + \frac{t^n m_n}{n!} + \cdots,$$

where $m_n$ is the $n$th moment. Differentiating $M_X(t)$ $i$ times with respect to $t$ and setting $t = 0$, we obtain the $i$th moment about the origin, $m_i$; see Calculations of moments below.

If $X$ is a continuous random variable, the following relation between its moment-generating function $M_X(t)$ and the two-sided Laplace transform of its probability density function $f_X(x)$ holds:

$$M_X(t) = \mathcal{L}\{f_X\}(-t),$$

since the PDF's two-sided Laplace transform is given as

$$\mathcal{L}\{f_X\}(s) = \int_{-\infty}^{\infty} e^{-sx} f_X(x)\,dx,$$

and the moment-generating function's definition expands (by the law of the unconscious statistician) to

$$M_X(t) = \operatorname{E}\left[e^{tX}\right] = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx.$$
This is consistent with the characteristic function of $X$ being a Wick rotation of $M_X(t)$ when the moment generating function exists, as the characteristic function of a continuous random variable $X$ is the Fourier transform of its probability density function $f_X(x)$, and in general when a function $f(x)$ is of exponential order, the Fourier transform of $f$ is a Wick rotation of its two-sided Laplace transform in the region of convergence. See the relation of the Fourier and Laplace transforms for further information.
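This Wick-rotation relationship can be checked numerically. A short sketch (assuming NumPy) estimates the characteristic function of a standard normal by Monte Carlo and compares it with $M_X(it) = e^{-t^2/2}$:

```python
# Sketch (NumPy assumed): for a standard normal X, the characteristic function
# E[e^{itX}] equals the moment-generating function evaluated at it, i.e.
# phi(t) = M(it) = exp(-t^2/2). Checked by Monte Carlo.
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)
t = 1.5

cf_mc = np.mean(np.exp(1j * t * x))   # E[e^{itX}], complex-valued
wick = np.exp((1j * t)**2 / 2)        # M(it), with M(s) = e^{s^2/2}
print(cf_mc, wick)                    # both ~ exp(-1.125) ≈ 0.3247
```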

Examples
Here are some examples of the moment-generating function and the characteristic function for comparison.
It can be seen that the characteristic function is a Wick rotation of the moment-generating function $M_X(t)$ when the latter exists.
The moment-generating function $M_X(t)$ and characteristic function $\varphi(t)$ for each distribution:

Degenerate $\delta_a$: MGF $e^{ta}$; CF $e^{ita}$

Bernoulli ($P(X=1)=p$): MGF $1-p+pe^{t}$; CF $1-p+pe^{it}$

Geometric ($P(X=k)=(1-p)^{k-1}p$): MGF $\frac{pe^{t}}{1-(1-p)e^{t}}$ for $t<-\ln(1-p)$; CF $\frac{pe^{it}}{1-(1-p)e^{it}}$

Binomial $B(n,p)$: MGF $\left(1-p+pe^{t}\right)^{n}$; CF $\left(1-p+pe^{it}\right)^{n}$

Negative binomial $\mathrm{NB}(r,p)$ (failures before the $r$th success): MGF $\left(\frac{p}{1-(1-p)e^{t}}\right)^{r}$ for $t<-\ln(1-p)$; CF $\left(\frac{p}{1-(1-p)e^{it}}\right)^{r}$

Poisson $\mathrm{Pois}(\lambda)$: MGF $e^{\lambda(e^{t}-1)}$; CF $e^{\lambda(e^{it}-1)}$

Uniform (continuous) $U(a,b)$: MGF $\frac{e^{tb}-e^{ta}}{t(b-a)}$; CF $\frac{e^{itb}-e^{ita}}{it(b-a)}$

Uniform (discrete) $DU(a,b)$: MGF $\frac{e^{at}-e^{(b+1)t}}{(b-a+1)\left(1-e^{t}\right)}$; CF $\frac{e^{ait}-e^{(b+1)it}}{(b-a+1)\left(1-e^{it}\right)}$

Laplace $L(\mu,b)$: MGF $\frac{e^{t\mu}}{1-b^{2}t^{2}}$ for $|t|<1/b$; CF $\frac{e^{it\mu}}{1+b^{2}t^{2}}$

Normal $N(\mu,\sigma^{2})$: MGF $e^{t\mu+\frac{1}{2}\sigma^{2}t^{2}}$; CF $e^{it\mu-\frac{1}{2}\sigma^{2}t^{2}}$

Chi-squared $\chi^{2}_{k}$: MGF $(1-2t)^{-k/2}$ for $t<1/2$; CF $(1-2it)^{-k/2}$

Noncentral chi-squared $\chi^{2}_{k}(\lambda)$: MGF $e^{\lambda t/(1-2t)}(1-2t)^{-k/2}$; CF $e^{i\lambda t/(1-2it)}(1-2it)^{-k/2}$

Gamma $\Gamma(k,\theta)$ (shape $k$, scale $\theta$): MGF $(1-t\theta)^{-k}$ for $t<\frac{1}{\theta}$; CF $(1-it\theta)^{-k}$

Exponential $\mathrm{Exp}(\lambda)$: MGF $\left(1-t\lambda^{-1}\right)^{-1}$ for $t<\lambda$; CF $\left(1-it\lambda^{-1}\right)^{-1}$

Beta: MGF $1+\sum_{k=1}^{\infty}\left(\prod_{r=0}^{k-1}\frac{\alpha+r}{\alpha+\beta+r}\right)\frac{t^{k}}{k!}$ (see Confluent hypergeometric function); CF ${}_{1}F_{1}(\alpha;\alpha+\beta;it)$

Multivariate normal $N(\boldsymbol{\mu},\boldsymbol{\Sigma})$: MGF $e^{\mathbf{t}^{\mathrm{T}}\boldsymbol{\mu}+\frac{1}{2}\mathbf{t}^{\mathrm{T}}\boldsymbol{\Sigma}\mathbf{t}}$; CF $e^{i\mathbf{t}^{\mathrm{T}}\boldsymbol{\mu}-\frac{1}{2}\mathbf{t}^{\mathrm{T}}\boldsymbol{\Sigma}\mathbf{t}}$

Cauchy $\mathrm{Cauchy}(\mu,\theta)$: MGF does not exist; CF $e^{it\mu-\theta|t|}$

Multivariate Cauchy:[3] MGF does not exist; CF $e^{i\mathbf{t}^{\mathrm{T}}\boldsymbol{\mu}-\sqrt{\mathbf{t}^{\mathrm{T}}\boldsymbol{\Sigma}\mathbf{t}}}$

Calculation
Since the moment-generating function is the expectation of a function of the random variable, it can be written as:

For a discrete probability mass function, $M_X(t)=\sum_{i=0}^{\infty} e^{t x_i}\, p_i$

For a continuous probability density function, $M_X(t)=\int_{-\infty}^{\infty} e^{tx} f(x)\,dx$

In the general case: $M_X(t)=\int_{-\infty}^{\infty} e^{tx}\,dF(x)$, using the Riemann–Stieltjes integral, where $F$ is the cumulative distribution function. This is simply the Laplace–Stieltjes transform of $F$, but with the sign of the argument reversed.

Note that for the case where $X$ has a continuous probability density function $f(x)$, $M_X(-t)$ is the two-sided Laplace transform of $f(x)$:

$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f(x)\,dx = \int_{-\infty}^{\infty} \left(1 + tx + \frac{t^2 x^2}{2!} + \cdots + \frac{t^n x^n}{n!} + \cdots\right) f(x)\,dx = 1 + t m_1 + \frac{t^2 m_2}{2!} + \cdots + \frac{t^n m_n}{n!} + \cdots,$$

where $m_n$ is the $n$th moment.
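As a sanity check of the continuous case, the expectation $\operatorname{E}\left[e^{tX}\right]$ can be estimated by simulation and compared with a closed form. A minimal sketch (assuming NumPy), using the exponential distribution with rate $\lambda$, whose MGF is $\lambda/(\lambda - t)$ for $t < \lambda$:

```python
# Sketch (NumPy assumed): Monte Carlo estimate of M_X(t) = E[e^{tX}] for an
# Exp(lambda) variable, compared with the closed form lambda / (lambda - t).
import numpy as np

rng = np.random.default_rng(0)
lam, t = 2.0, 0.5                      # requires t < lambda for convergence
samples = rng.exponential(scale=1/lam, size=1_000_000)

mc_estimate = np.mean(np.exp(t * samples))
closed_form = lam / (lam - t)
print(mc_estimate, closed_form)        # both ~ 1.3333
```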

Linear transformations of random variables

If random variable $X$ has moment generating function $M_X(t)$, then $\alpha X + \beta$ has moment generating function

$$M_{\alpha X+\beta}(t) = \operatorname{E}\left[e^{(\alpha X+\beta)t}\right] = e^{\beta t}\operatorname{E}\left[e^{\alpha t X}\right] = e^{\beta t} M_X(\alpha t).$$
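A quick Monte Carlo illustration of this property (assuming NumPy), taking $X$ standard normal so that $M_X(s) = e^{s^2/2}$:

```python
# Sketch (NumPy assumed): check M_{aX+b}(t) = e^{bt} M_X(at) by simulation
# for X ~ N(0,1), whose MGF is M_X(s) = e^{s^2/2}.
import numpy as np

rng = np.random.default_rng(3)
a, b, t = 2.0, 1.0, 0.4
x = rng.standard_normal(1_000_000)

lhs = np.mean(np.exp(t * (a * x + b)))         # E[e^{t(aX+b)}] by simulation
rhs = np.exp(b * t) * np.exp((a * t)**2 / 2)   # e^{bt} M_X(at)
print(lhs, rhs)                                # both ~ e^{0.72} ≈ 2.054
```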

Linear combination of independent random variables

If $S_n = \sum_{i=1}^{n} a_i X_i$, where the $X_i$ are independent random variables and the $a_i$ are constants, then the probability density function for $S_n$ is the convolution of the probability density functions of each of the $X_i$, and the moment-generating function for $S_n$ is given by

$$M_{S_n}(t) = M_{X_1}(a_1 t)\, M_{X_2}(a_2 t) \cdots M_{X_n}(a_n t).$$
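A short simulation sketch (assuming NumPy) illustrating this factorization for a weighted sum of two independent standard normals:

```python
# Sketch (NumPy assumed): for S = a1*X1 + a2*X2 with X1, X2 independent,
# E[e^{tS}] factors as M_{X1}(a1 t) * M_{X2}(a2 t). Checked by Monte Carlo
# with two independent standard normals, M(s) = e^{s^2/2}.
import numpy as np

rng = np.random.default_rng(1)
a1, a2, t = 2.0, -1.0, 0.3
x1 = rng.standard_normal(1_000_000)
x2 = rng.standard_normal(1_000_000)

mc = np.mean(np.exp(t * (a1 * x1 + a2 * x2)))
product = np.exp((a1 * t)**2 / 2) * np.exp((a2 * t)**2 / 2)
print(mc, product)                     # both ~ e^{0.225} ≈ 1.252
```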

Vector-valued random variables

For vector-valued random variables $\mathbf{X}$ with real components, the moment-generating function is given by

$$M_{\mathbf{X}}(\mathbf{t}) = \operatorname{E}\left[e^{\langle \mathbf{t}, \mathbf{X}\rangle}\right],$$

where $\mathbf{t}$ is a vector and $\langle \cdot, \cdot \rangle$ is the dot product.
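For example, for a multivariate normal $N(\boldsymbol{\mu}, \boldsymbol{\Sigma})$ the vector MGF has the closed form $e^{\mathbf{t}^{\mathrm{T}}\boldsymbol{\mu}+\frac{1}{2}\mathbf{t}^{\mathrm{T}}\boldsymbol{\Sigma}\mathbf{t}}$ (see the table above). A minimal Monte Carlo check, assuming NumPy:

```python
# Sketch (NumPy assumed): Monte Carlo check of the vector MGF
# M_X(t) = E[e^{<t,X>}] against exp(t'mu + t'Sigma t / 2) for a 2-D Gaussian.
import numpy as np

rng = np.random.default_rng(2)
mu = np.array([0.5, -1.0])
Sigma = np.array([[1.0, 0.3], [0.3, 2.0]])
tvec = np.array([0.2, 0.1])

X = rng.multivariate_normal(mu, Sigma, size=1_000_000)
mc = np.mean(np.exp(X @ tvec))
closed = np.exp(tvec @ mu + tvec @ Sigma @ tvec / 2)
print(mc, closed)                      # both ~ e^{0.036} ≈ 1.037
```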

Important properties
Moment generating functions are positive and log-convex, with M(0) = 1.

An important property of the moment-generating function is that it uniquely determines the distribution. In other words, if $X$ and $Y$ are two random variables and

$$M_X(t) = M_Y(t)$$

for all values of t, then

$$F_X(x) = F_Y(x)$$

for all values of x (or equivalently, X and Y have the same distribution). This statement is not equivalent to the statement "if two distributions have the same moments, then they are identical at all points." This is because in some cases, the moments exist and yet the moment-generating function does not, because the limit

$$\lim_{n \to \infty} \sum_{k=0}^{n} \frac{t^k m_k}{k!}$$

may not exist. The log-normal distribution is an example of when this occurs.
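A small numeric illustration (standard library only): for the standard log-normal, every moment $m_n = e^{n^2/2}$ exists, yet the series terms $t^n m_n / n!$ grow without bound for any $t > 0$, so the limit above cannot exist. Working with logarithms of the terms avoids floating-point overflow:

```python
# Sketch (standard library only): for the standard log-normal, m_n = e^{n^2/2}.
# The log of the series term t^n m_n / n! is n*log(t) + n^2/2 - log(n!),
# which grows like n^2/2, so the moment series diverges for every t > 0.
import math

t = 0.1
for n in range(0, 101, 10):
    log_term = n * math.log(t) + n**2 / 2 - math.lgamma(n + 1)
    print(n, log_term)   # increases without bound: the MGF does not exist
```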

Calculations of moments

The moment-generating function is so called because if it exists on an open interval around t = 0, then it is the exponential generating function of the moments of the probability distribution:

$$m_n = M_X^{(n)}(0) = \left.\frac{d^n M_X}{dt^n}\right|_{t=0}.$$

That is, with n being a nonnegative integer, the nth moment about 0 is the nth derivative of the moment generating function, evaluated at t = 0.
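A brief symbolic sketch (assuming SymPy), recovering the first few moments of the standard normal from derivatives of its MGF $e^{t^2/2}$:

```python
# Sketch (SymPy assumed): the nth derivative of M_X at t = 0 gives the nth
# moment; for N(0,1) this recovers 1, 0, 1, 0, 3 (the standard normal moments).
import sympy as sp

t = sp.symbols('t', real=True)
M = sp.exp(t**2 / 2)                   # MGF of N(0,1)

moments = [sp.diff(M, t, n).subs(t, 0) for n in range(5)]
print(moments)                         # [1, 0, 1, 0, 3]
```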

Other properties
Jensen's inequality provides a simple lower bound on the moment-generating function:

$$M_X(t) \geq e^{\mu t},$$

where $\mu$ is the mean of X.

The moment-generating function can be used in conjunction with Markov's inequality to bound the upper tail of a real random variable X. This statement is also called the Chernoff bound. Since $x \mapsto e^{xt}$ is monotonically increasing for $t > 0$, we have

$$P(X \geq a) = P\left(e^{tX} \geq e^{ta}\right) \leq e^{-at}\operatorname{E}\left[e^{tX}\right] = e^{-at} M_X(t)$$

for any $t > 0$ and any a, provided $M_X(t)$ exists. For example, when X is a standard normal distribution and $a > 0$, we can choose $t = a$ and recall that $M_X(t) = e^{t^2/2}$. This gives $P(X \geq a) \leq e^{-a^2/2}$, which is within a factor of 1+a of the exact value.
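A short numeric comparison of this bound with the exact tail probability, assuming SciPy for the normal survival function:

```python
# Sketch (SciPy assumed): compare the Chernoff bound e^{-a^2/2} for the
# standard normal upper tail with the exact value P(X >= a) = 1 - Phi(a).
import numpy as np
from scipy.stats import norm

for a in (1.0, 2.0, 3.0):
    bound = np.exp(-a**2 / 2)
    exact = norm.sf(a)        # survival function: exact upper-tail probability
    print(f"a={a}: bound={bound:.4f}, exact={exact:.4f}")
```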
Various lemmas, such as Hoeffding's lemma or Bennett's inequality, provide bounds on the moment-generating function in the case of a zero-mean, bounded random variable.

When $X$ is non-negative, the moment generating function gives a simple, useful bound on the moments:

$$\operatorname{E}\left[X^n\right] \leq \left(\frac{n}{te}\right)^n M_X(t),$$

for any $X \geq 0$ and $t > 0$.

This follows from the inequality $1 + x \leq e^x$, into which we can substitute $x' = tx/n - 1$ to obtain $tx/n \leq e^{tx/n - 1}$ for any $x, t > 0$. Now, if $t > 0$ and $x, n \geq 0$, this can be rearranged to $x^n \leq (n/(te))^n e^{tx}$. Taking the expectation on both sides gives the bound on $\operatorname{E}\left[X^n\right]$ in terms of $\operatorname{E}\left[e^{tX}\right]$.

As an example, consider $X \sim \chi^2_k$ with $k$ degrees of freedom. Then from the examples above, $M_X(t) = (1 - 2t)^{-k/2}$. Picking $t = n/(2n + k)$ and substituting into the bound:

$$\operatorname{E}\left[X^n\right] \leq \left(1 + \frac{2n}{k}\right)^{k/2} e^{-n} (k + 2n)^n.$$

We know that in this case the correct bound is $\operatorname{E}\left[X^n\right] \leq 2^n\, \frac{\Gamma(n + k/2)}{\Gamma(k/2)}$. To compare the bounds, we can consider the asymptotics for large $k$. Here the moment-generating function bound is $k^n\left(1 + \frac{n^2}{k} + O\left(\frac{1}{k^2}\right)\right)$, where the true bound is $k^n\left(1 + \frac{n^2 - n}{k} + O\left(\frac{1}{k^2}\right)\right)$. The moment-generating function bound is thus very strong in this case.
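A quick numeric check of the two bounds (assuming SciPy for the gamma function), here with $k = 10$ and $n = 3$:

```python
# Sketch (SciPy assumed): numeric check of the moment bound
# E[X^n] <= (1 + 2n/k)^{k/2} e^{-n} (k + 2n)^n against the exact chi-squared
# moment 2^n Gamma(n + k/2) / Gamma(k/2).
import numpy as np
from scipy.special import gamma

k, n = 10, 3
bound = (1 + 2 * n / k) ** (k / 2) * np.exp(-n) * (k + 2 * n) ** n
exact = 2 ** n * gamma(n + k / 2) / gamma(k / 2)
print(bound, exact)   # ~2138 vs 1680: the bound holds and is reasonably tight
```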

Relation to other functions


Related to the moment-generating function are a number of other transforms that are common in probability
theory:

Characteristic function
The characteristic function $\varphi_X(t)$ is related to the moment-generating function via $\varphi_X(t) = M_{iX}(t) = M_X(it)$: the characteristic function is the moment-generating function of iX, or the moment generating function of X evaluated on the imaginary axis. This function can also be viewed as the Fourier transform of the probability density function, which can therefore be deduced from it by inverse Fourier transform.
Cumulant-generating function
The cumulant-generating function is defined as the logarithm of the moment-generating
function; some instead define the cumulant-generating function as the logarithm of the
characteristic function, while others call this latter the second cumulant-generating
function.
Probability-generating function
The probability-generating function is defined as $G(z) = \operatorname{E}\left[z^X\right]$. This immediately implies that $G(e^t) = \operatorname{E}\left[e^{tX}\right] = M_X(t)$.
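A one-line symbolic confirmation of this identity for the Poisson distribution (assuming SymPy), whose PGF is $G(z) = e^{\lambda(z-1)}$:

```python
# Sketch (SymPy assumed): for Poisson(lambda), G(z) = e^{lambda(z-1)}, and
# substituting z = e^t reproduces the MGF e^{lambda(e^t - 1)} from the table.
import sympy as sp

t, z, lam = sp.symbols('t z lambda', positive=True)
G = sp.exp(lam * (z - 1))              # PGF of Poisson(lambda)
M = sp.exp(lam * (sp.exp(t) - 1))      # MGF of Poisson(lambda)
print(sp.simplify(G.subs(z, sp.exp(t)) - M))  # 0: G(e^t) = M(t)
```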

See also
Characteristic function (probability theory)
Entropic value at risk
Factorial moment generating function
Rate function
Hamburger moment problem

References

Citations
1. Casella, George; Berger, Roger L. (1990). Statistical Inference. Wadsworth & Brooks/Cole.
p. 61. ISBN 0-534-11958-1.
2. Bulmer, M. G. (1979). Principles of Statistics. Dover. pp. 75–79. ISBN 0-486-63760-3.
3. Kotz et al., p. 37, using 1 as the number of degrees of freedom to recover the Cauchy distribution.

Sources
Casella, George; Berger, Roger (2002). Statistical Inference (2nd ed.). pp. 59–68. ISBN 978-
0-534-24312-8.
