
Method of moments

Odoi Makuru
August 27, 2023
Introduction
• The method of moments is a very simple procedure for finding an estimator for
one or more population parameters
• Recall that the kth moment of a random variable, taken about the origin, is

µ′_k = E[X^k]

• The corresponding kth sample moment is the average (a one-line computation is shown below)

m′_k = (1/n) Σ_{i=1}^n X_i^k

The method of moments is based on the intuitively appealing idea that sample
moments should provide good estimates of the corresponding population
moments
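As a small illustration of the sample moment m′_k, here is a minimal sketch assuming NumPy is available; the sample values are made up for demonstration.

```python
import numpy as np

def sample_moment(x, k):
    """k-th sample moment about the origin: (1/n) * sum(x_i ** k)."""
    return np.mean(np.asarray(x) ** k)

print(sample_moment([1.0, 2.0, 3.0], k=2))   # (1 + 4 + 9) / 3 = 4.666...
```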
Definition
Choose as estimates those values of the parameters that are solutions of the
equations
µ′_k = m′_k, for k = 1, 2, ..., t,

where t is the number of parameters to be estimated and µ′_k is a function of the population parameters
The method of moments procedure

Suppose there are t parameters to be estimated, say θ = (θ_1, ..., θ_t).

a) Find t population moments, µ′_k, for k = 1, 2, ..., t. Each µ′_k will contain one or more of the parameters θ_1, ..., θ_t.
b) Find the corresponding t sample moments, m′_k, for k = 1, 2, ..., t. The number of sample moments should equal the number of parameters to be estimated.
c) Solve the system of equations µ′_k = m′_k, for k = 1, 2, ..., t, for the parameters θ = (θ_1, ..., θ_t); the solution θ̂ = (θ̂_1, ..., θ̂_t) is the moment estimator of θ (a numerical sketch of these steps follows below).
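As a minimal sketch of steps a)–c), assuming NumPy and SciPy are available, the moment equations can also be solved numerically. The theoretical moments used here are those of the gamma(α, β) distribution derived in Example 1; the function name and the simulated sample are illustrative, not from the lecture.

```python
import numpy as np
from scipy.optimize import fsolve

def method_of_moments(sample, theoretical_moments, theta0):
    """Solve mu'_k(theta) = m'_k for k = 1, ..., t, where t = len(theta0)."""
    t = len(theta0)
    # step b): sample moments m'_k = (1/n) * sum(X_i ** k)
    m = np.array([np.mean(np.asarray(sample) ** (k + 1)) for k in range(t)])
    # step c): solve mu'_k(theta) - m'_k = 0 numerically
    return fsolve(lambda theta: np.asarray(theoretical_moments(theta)) - m, theta0)

# step a): population moments of a gamma(alpha, beta) distribution (see Example 1):
#          mu'_1 = alpha * beta,  mu'_2 = alpha * beta^2 + alpha^2 * beta^2
gamma_moments = lambda th: [th[0] * th[1], th[0] * th[1]**2 + th[0]**2 * th[1]**2]

rng = np.random.default_rng(0)
x = rng.gamma(shape=2.0, scale=3.0, size=500)   # illustrative sample, alpha = 2, beta = 3
print(method_of_moments(x, gamma_moments, [1.0, 1.0]))   # should be near (2, 3)
```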

Method of moments
Example 1
Let X_1, ..., X_n be a random sample from a gamma probability distribution with parameters α and β. Find moment estimators for the unknown parameters α and β.

note 1: expectation
For a probability function f(x),

E[X] = Σ_x x f(x)      (discrete)
E[X] = ∫ x f(x) dx     (continuous)

note 2: variance
The variance of a random variable X is defined by

Var(X) = E[(X − µ)^2]
Solution to example 1

For the gamma distribution


µ′_1 = E[X] = αβ   and   µ′_2 = E[X^2] = αβ^2 + α^2β^2

Because there are two parameters, we need to find the first two moment estimators. Equating sample moments to distribution (theoretical) moments, we have

µ′_1 = m′_1  ⟹  (1/n) Σ_{i=1}^n X_i = X̄ = αβ

and

µ′_2 = m′_2  ⟹  (1/n) Σ_{i=1}^n X_i^2 = αβ^2 + α^2β^2

Method of moments
Solving for α and β we obtain the estimates as

x̄ = αβ  ⟹  α = x̄/β

and

(1/n) Σ_{i=1}^n x_i^2 = αβ^2 + α^2β^2 = (x̄/β)β^2 + (x̄/β)^2 β^2 = x̄β + x̄^2,

which implies that

β = [ (1/n) Σ_{i=1}^n x_i^2 − x̄^2 ] / x̄
Therefore, the method of moments estimators for α and β are

α̂ = X̄ / β̂

and

β̂ = [ (1/n) Σ_{i=1}^n X_i^2 − X̄^2 ] / X̄ = Σ_{i=1}^n (X_i − X̄)^2 / (n X̄)

So,

α̂ = X̄ / β̂ = X̄^2 / [ (1/n) Σ_{i=1}^n X_i^2 − X̄^2 ] = n X̄^2 / Σ_{i=1}^n (X_i − X̄)^2

Thus, we can use these values in the gamma probability density function to answer
questions concerning the probabilistic behavior of the random variable X
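The closed-form estimators above can be evaluated directly. Here is a minimal sketch, assuming NumPy is available; the simulated sample with α = 2 and β = 3 is illustrative and not part of the lecture.

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.gamma(shape=2.0, scale=3.0, size=1000)   # illustrative sample: alpha = 2, beta = 3

xbar = x.mean()                                  # first sample moment
s2 = np.mean((x - xbar) ** 2)                    # (1/n) * sum (x_i - xbar)^2

beta_hat = s2 / xbar                             # beta-hat = sum (x_i - xbar)^2 / (n * xbar)
alpha_hat = xbar / beta_hat                      # alpha-hat = xbar / beta-hat

print(alpha_hat, beta_hat)                       # should be near 2 and 3
```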
Example 2
Let the distribution of X be N(µ, σ 2 )
a) For a given sample of size n, use the method of moments to estimate µ
and σ 2
b) The following data are from a normal distribution with mean 2 and a standard deviation of 1.5
3.163 1.883 3.252 3.716 -0.049 -0.653 0.057 2.987
4.098 1.670 1.396 2.332 1.838 3.024 2.706 0.231
3.830 3.349 -0.230 1.496
Obtain the method of moments estimates of the true mean and the true
variance
Solution to example 2 a)

For the normal distribution, E[X] = µ, and because Var(X) = E[X^2] − µ^2, we have the second moment as E[X^2] = σ^2 + µ^2.

Equating sample moments to distribution moments we have

(1/n) Σ_{i=1}^n X_i = µ′_1 = µ

and

µ′_2 = (1/n) Σ_{i=1}^n X_i^2 = σ^2 + µ^2

Solving for µ and σ 2 , we obtain the moment estimators as

Method of moments
µ̂ = X̄

and

σ̂^2 = (1/n) Σ_{i=1}^n (X_i − X̄)^2 = (1/n) Σ_{i=1}^n X_i^2 − X̄^2

Aside

(1/n) Σ_{i=1}^n (X_i − X̄)^2 = (1/n) Σ_{i=1}^n (X_i − X̄)(X_i − X̄) = (1/n) Σ_{i=1}^n (X_i^2 − 2 X_i X̄ + X̄^2)
= (1/n) Σ_{i=1}^n X_i^2 − 2 X̄ (1/n) Σ_{i=1}^n X_i + X̄^2 = (1/n) Σ_{i=1}^n X_i^2 − 2 X̄^2 + X̄^2
= (1/n) Σ_{i=1}^n X_i^2 − X̄^2
Solution to example 2 b)

Because we know that the estimator of the mean is

µ̂ = X̄

and the estimator of the variance is

σ̂^2 = (1/n) Σ_{i=1}^n X_i^2 − X̄^2,

from the data the estimates are µ̂ = 2.005, and σ̂ 2 = 6.12 − (2.005)2 = 2.1
Notice that the true mean is 2 and the true variance is 2.25, which we used to simulate
the data
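A quick numerical check of these estimates, assuming NumPy is available, using the twenty observations listed above:

```python
import numpy as np

# the twenty observations from Example 2 b)
x = np.array([3.163, 1.883, 3.252, 3.716, -0.049, -0.653, 0.057, 2.987,
              4.098, 1.670, 1.396, 2.332, 1.838, 3.024, 2.706, 0.231,
              3.830, 3.349, -0.230, 1.496])

mu_hat = x.mean()                            # method of moments estimate of the mean
sigma2_hat = np.mean(x ** 2) - mu_hat ** 2   # (1/n) * sum x_i^2 - xbar^2

print(round(mu_hat, 3), round(sigma2_hat, 2))   # approximately 2.005 and 2.1
```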

Method of moments
Lecture 2 ends here

Reading assignment
Do all examples in Section 5.2 and do Exercise 5.2 of Kandethody M. Ramachandran & Chris P. Tsokos (2009), Mathematical Statistics with Applications, ISBN-13: 978-0-12-374848-5

Method of moments
References

A. Hayter.
Probability and Statistics for Engineers and Scientists.
Brooks/Cole, Cengage Learning, 2012.
K. M. Ramachandran and C. P. Tsokos.
Mathematical Statistics with Applications.
Elsevier Academic Press, 2009.
D. D. Wackerly, W. Mendenhall, and R. L. Scheaffer.
Mathematical Statistics with Applications.
Brooks/Cole, Cengage Learning, 2008.

