Minimum Variance Unbiased Estimators
The Cramér–Rao bound gives a lower bound for the variance of unbiased
estimators. In some sense this is helpful only if we can find an unbiased
estimator whose variance equals the bound; in that case we know it is the
minimum variance unbiased estimator. If not, there are two possibilities:
either we missed the minimum variance unbiased estimator, or the minimum
variance unbiased estimator has variance strictly larger than the bound.
In many cases a minimum variance unbiased estimator does not even exist.
To demonstrate some of these possibilities, consider the following
examples. We have already seen an example where the bound does not apply.
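For reference, the inequality in question can be stated as follows (using the notation of the earlier statement of the bound, and assuming the usual regularity conditions hold): if W(X) is an unbiased estimator of τ(θ) based on data with density f(x | θ), then

Varθ(W(X)) ≥ [τ′(θ)]² / Eθ[(∂ log f(X | θ)/∂θ)²],

where the denominator is the Fisher information. Equality can hold, but need not.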
Example 3.1. Suppose X has a binomial distribution with parameters 1
and θ, and consider estimating √θ. Any estimator for √θ can be written as

W(X) = α(1 − X) + βX,

so that its expectation is

Eθ[W(X)] = α(1 − θ) + βθ.

There are no α and β that make this equal to √θ for all θ, and so there is
no unbiased estimator for √θ, let alone one that achieves the Cramér–Rao
bound.
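To see why no such α and β exist (a short verification added here): setting α(1 − θ) + βθ = √θ at θ = 0 forces α = 0, and at θ = 1 forces β = 1, but then the left-hand side equals θ, which differs from √θ at, say, θ = 1/4, where θ = 1/4 while √θ = 1/2. More generally, Eθ[W(X)] is a polynomial of degree at most one in θ, while √θ is not a polynomial on [0, 1].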
Finally, recall that the Cramér–Rao bound is attained exactly when W(x)
minus the quantity it estimates is proportional to the score, the derivative
of the log of the density with respect to θ. Setting this derivative equal
to zero, combined with a negative second derivative, maximizes the log of
the density, or the log likelihood; at that maximizer the score vanishes,
and so under these conditions the minimum variance unbiased estimator W(X)
is equal to the maximum likelihood estimator.
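As an illustration of this attainment argument (an added example, not part of the original notes): let X1, …, Xn be i.i.d. N(θ, σ²) with σ² known. The score is

∂ log f(x | θ)/∂θ = Σ (xi − θ)/σ² = (n/σ²)(x̄ − θ),

so x̄ − θ is proportional to the score with factor σ²/n, and the attainment condition holds for W(X) = X̄. Setting the score equal to zero gives the maximum likelihood estimator θ̂ = x̄, with second derivative −n/σ² < 0 confirming a maximum, and Varθ(X̄) = σ²/n equals the Cramér–Rao bound; hence X̄ is the minimum variance unbiased estimator and coincides with the maximum likelihood estimator.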
Problems