
CH. 7 Sampling Distributions and Point Estimation of Parameters

Introduction
  Parameter estimation, sampling distribution, statistic, point estimator, point estimate
Sampling distribution and the Central Limit Theorem
General Concepts of Point Estimation
  Unbiased estimators
  Variance of a point estimator
  Standard error of an estimator
  Mean squared error of an estimator
Methods of point estimation
  Method of moments
  Method of maximum likelihood

7-1 Introduction
The field of statistical inference consists of those
methods used to make decisions or to draw conclusions
about a population.
These methods utilize the information contained in a
sample from the population in drawing conclusions.
Statistical inference may be divided into two major
areas:
Parameter estimation
Hypothesis testing

7-1 Introduction
Parameter estimation examples
Estimate the mean fill volume of soft drink cans: Soft drink cans
are filled by automated filling machines. The fill volume may
vary because of differences in soft drinks, automated machines,
and measurement procedures.
Estimate the mean diameter of bottle openings of a specific
bottle manufactured in a plant: The diameter of bottle openings
may vary because of machines, molds and measurements.

7-1 Introduction
Hypothesis testing example
Two machine types, m1 and m2, are used for filling soft drink cans.
You have a hypothesis that m1 results in a larger fill volume of soft drink cans than does m2.
Construct the statistical hypothesis as follows:
the mean fill volume using machine m1 is larger than the mean fill volume using machine m2.
Then try to draw conclusions about the stated hypothesis.

7-1 Introduction

Definition
Since a statistic is a random variable, it has a probability distribution.
The probability distribution of a statistic is called a sampling distribution.

7-1 Introduction

The sample mean X̄ is a point estimator of the population mean μ; the value it takes on a particular sample is a point estimate.

x1 = 25, x2 = 30, x3 = 29, x4 = 31

x̄ = (25 + 30 + 29 + 31) / 4 = 28.75

Point estimate of μ: 28.75

7-1 Introduction

The sample variance S² is a point estimator of the population variance σ²; its computed value s² is a point estimate.

x1 = 25, x2 = 30, x3 = 29, x4 = 31

s² = Σ_{i=1}^{n} (x_i − x̄)² / (n − 1) = 6.9

Point estimate of σ²: 6.9
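The two point estimates above can be reproduced in a few lines (a minimal Python sketch):

```python
# Point estimates of the population mean and variance from the four
# observations on this slide.
xs = [25, 30, 29, 31]
n = len(xs)

xbar = sum(xs) / n                                # point estimate of mu
s2 = sum((x - xbar) ** 2 for x in xs) / (n - 1)   # point estimate of sigma^2

print(xbar)          # 28.75
print(round(s2, 1))  # 6.9
```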


7.2 Sampling Distributions and the Central Limit Theorem

Statistical inference is concerned with making decisions about a population based on the information contained in a random sample from that population.

Definitions:

7.2 Sampling Distributions and the Central Limit Theorem

The probability distribution of X̄ is called the sampling distribution of the mean.

Suppose that a random sample of size n is taken from a normal population with mean μ and variance σ².

Each observation X1, X2, ..., Xn is normally and independently distributed with mean μ and variance σ².

Linear functions of independent, normally distributed random variables are also normally distributed (reproductive property of the normal distribution).

The sample mean

X̄ = (X1 + X2 + ... + Xn) / n

has a normal distribution with mean

μ_X̄ = (μ + μ + ... + μ) / n = μ

and variance

σ²_X̄ = (σ² + σ² + ... + σ²) / n² = σ² / n

That is, X̄ ~ N(μ, σ²/n).
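A quick Monte Carlo sketch of this result (parameter values are illustrative): the simulated mean and variance of X̄ should land close to μ and σ²/n.

```python
# Simulate many sample means from a normal population and compare the
# empirical mean and variance of Xbar with mu and sigma^2/n.
import random
import statistics

random.seed(1)
mu, sigma, n = 10.0, 2.0, 25
reps = 20000

xbars = [statistics.fmean(random.gauss(mu, sigma) for _ in range(n))
         for _ in range(reps)]

mean_xbar = statistics.fmean(xbars)
var_xbar = statistics.variance(xbars)

print(mean_xbar)  # close to mu = 10
print(var_xbar)   # close to sigma^2/n = 4/25 = 0.16
```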

7.2 Sampling Distributions and the Central Limit Theorem

Central Limit Theorem: If X1, X2, ..., Xn is a random sample of size n taken from a population with mean μ and variance σ², and X̄ is the sample mean, then the limiting distribution of Z = (X̄ − μ) / (σ/√n), as n → ∞, is the standard normal distribution.

7.2 Sampling Distributions and the Central Limit Theorem

Distributions of average scores from throwing dice.

Practically:
If n ≥ 30, the normal approximation will be satisfactory regardless of the shape of the population.
If n < 30, the central limit theorem will work if the distribution of the population is not severely nonnormal (continuous, unimodal, symmetric).

7.2 Sampling Distributions and the Central Limit Theorem

Example 7-1

7.2 Sampling Distributions and the Central Limit Theorem

Figure 7-2 Probability for Example 7-1

7.2 Sampling Distributions and the Central Limit Theorem

Example 7-2
X has a continuous distribution:

f(x) = 1/2 for 4 ≤ x ≤ 6, and 0 otherwise

Find the distribution of the sample mean of a random sample of size n = 40.

μ = 5
σ² = (6 − 4)² / 12 = 1/3

By the CLT, X̄ is approximately normally distributed with

μ_X̄ = 5
σ²_X̄ = 1/[3(40)] = 1/120
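The approximation in Example 7-2 can be checked by simulation (a hedged sketch; the number of replications is an arbitrary choice):

```python
# For X ~ Uniform(4, 6), mu = 5 and sigma^2 = (6-4)^2/12 = 1/3, so by
# the CLT the mean of n = 40 observations is approximately N(5, 1/120).
import random
import statistics

random.seed(2)
n, reps = 40, 20000

xbars = [statistics.fmean(random.uniform(4, 6) for _ in range(n))
         for _ in range(reps)]

mean_xbar = statistics.fmean(xbars)
var_xbar = statistics.variance(xbars)

print(mean_xbar)  # close to 5
print(var_xbar)   # close to 1/120 ~ 0.00833
```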

7.2 Sampling Distributions and the Central Limit Theorem

Two independent populations:
The 1st population has mean μ1 and variance σ1².
The 2nd population has mean μ2 and variance σ2².

The statistic X̄1 − X̄2 has the following mean and variance:

μ_{X̄1 − X̄2} = μ_X̄1 − μ_X̄2 = μ1 − μ2

σ²_{X̄1 − X̄2} = σ²_X̄1 + σ²_X̄2 = σ1²/n1 + σ2²/n2

7.2 Sampling Distributions and the Central Limit Theorem

Approximate Sampling Distribution of a Difference in Sample Means

7.2 Sampling Distributions and the Central Limit Theorem

Example 7-3
Two independent and approximately normal populations:
X1: life of a component with the old process, μ1 = 5000, σ1 = 40
X2: life of a component with the improved process, μ2 = 5050, σ2 = 30
n1 = 16, n2 = 25

What is the probability that the difference in the two sample means X̄2 − X̄1 is at least 25 hours?

μ_X̄1 = μ1 = 5000
μ_X̄2 = μ2 = 5050

σ²_X̄1 = σ1²/n1 = 40²/16 = 100
σ²_X̄2 = σ2²/n2 = 30²/25 = 36

μ_{X̄2 − X̄1} = μ2 − μ1 = 50
σ²_{X̄2 − X̄1} = σ²_X̄2 + σ²_X̄1 = 136

P(X̄2 − X̄1 ≥ 25) = P(Z ≥ (25 − 50)/√136) = P(Z ≥ −2.14) = 0.9838
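The probability in Example 7-3 can be recomputed with the standard normal CDF, Φ(z) = (1 + erf(z/√2))/2, using only the Python standard library:

```python
# Reproduce P(Xbar2 - Xbar1 >= 25) from Example 7-3 with the standard
# normal CDF built from math.erf.
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

var_diff = 40**2 / 16 + 30**2 / 25   # 100 + 36 = 136
z = (25 - 50) / math.sqrt(var_diff)  # about -2.14

p = 1.0 - phi(z)  # close to 0.984; the slide's 0.9838 uses z rounded to -2.14
print(p)
```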

7-3 General Concepts of Point Estimation


We may have several different choices for the point
estimator of a parameter. Ex: to estimate the mean of a
population
Sample mean
Sample median
The average of the smallest and largest observations in the sample

Which point estimator is the best one?


Need to examine their statistical properties and develop
some criteria for comparing estimators
For instance, an estimator should be close to the true value
of the unknown parameter

7-3 General Concepts of Point Estimation

7-3.1 Unbiased Estimators
Definition
The point estimator Θ̂ is an unbiased estimator of the parameter θ if E(Θ̂) = θ. Otherwise, the estimator is biased, with

bias = E(Θ̂) − θ

When an estimator is unbiased, the bias is zero.

7-3 General Concepts of Point Estimation

Are X̄ and S² unbiased estimators of μ and σ²?

E(X̄) = E[(1/n)(X1 + X2 + ... + Xn)] = (1/n)(nμ) = μ

Unbiased!

E(S²) = E[ Σ_{i=1}^{n} (Xi − X̄)² / (n − 1) ]
      = (1/(n − 1)) E[ Σ_{i=1}^{n} (Xi − X̄)² ]
      = (1/(n − 1)) E[ Σ (Xi² − 2X̄Xi + X̄²) ]
      = (1/(n − 1)) E[ Σ Xi² − 2nX̄² + nX̄² ]
      = (1/(n − 1)) E[ Σ Xi² − nX̄² ]
      = (1/(n − 1)) [ Σ E(Xi²) − n E(X̄²) ]

7-3 General Concepts of Point Estimation

Example 7-1 (continued)

V(Xi) = σ² = E(Xi²) − μ²  ⟹  E(Xi²) = σ² + μ²

V(X̄) = σ²/n = E(X̄²) − (E(X̄))² = E(X̄²) − μ²  ⟹  E(X̄²) = σ²/n + μ²

Substituting these into the last line of the previous slide:

E(S²) = (1/(n − 1)) [ n(σ² + μ²) − n(σ²/n + μ²) ] = (1/(n − 1)) (n − 1)σ² = σ²

Unbiased!
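A simulation sketch of the unbiasedness of S² (the values of μ, σ, and n are illustrative): averaging S² over many samples should land close to σ².

```python
# Average the sample variance S^2 (divisor n-1) over many samples;
# the average should be close to sigma^2.
import random
import statistics

random.seed(3)
mu, sigma, n, reps = 0.0, 3.0, 5, 40000

s2_values = [statistics.variance([random.gauss(mu, sigma) for _ in range(n)])
             for _ in range(reps)]

mean_s2 = statistics.fmean(s2_values)
print(mean_s2)  # close to sigma^2 = 9
```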

7-3 General Concepts of Point Estimation

There is not a unique unbiased estimator.
n = 10 data: 12.8 9.4 8.7 11.6 13.1 9.8 14.1 8.5 12.1 10.3
There are several unbiased estimators of μ:

Sample mean (11.04)
Sample median (10.95)
The average of the smallest and largest observations in the sample (11.3)
A single observation from the population (12.8)

We cannot rely on the property of unbiasedness alone to select the estimator; we need a method to select among unbiased estimators.

7-3 General Concepts of Point Estimation

7-3.2 Variance of a Point Estimator
Definition

The sampling distributions of two unbiased estimators Θ̂1 and Θ̂2.

7-3 General Concepts of Point Estimation

7-3.2 Variance of a Point Estimator

V(Xi) = σ²
V(X̄) = σ²/n

V(X̄) < V(Xi) for n ≥ 2

The sample mean is a better estimator of μ than a single observation Xi.
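A small simulation illustrating V(X̄) < V(Xi) (parameter values are illustrative):

```python
# Compare the empirical variance of single observations with the
# empirical variance of sample means of size n.
import random
import statistics

random.seed(4)
sigma, n, reps = 2.0, 10, 20000

singles = [random.gauss(0, sigma) for _ in range(reps)]
xbars = [statistics.fmean(random.gauss(0, sigma) for _ in range(n))
         for _ in range(reps)]

var_single = statistics.variance(singles)
var_mean = statistics.variance(xbars)

print(var_single)  # close to sigma^2 = 4
print(var_mean)    # close to sigma^2/n = 0.4
```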

7-3 General Concepts of Point Estimation

7-3.3 Standard Error: Reporting a Point Estimate
Definition
The standard error of an estimator Θ̂ is its standard deviation, σ_Θ̂ = √V(Θ̂). If the standard error involves unknown parameters that can be estimated, substituting those estimates into σ_Θ̂ produces the estimated standard error.

7-3 General Concepts of Point Estimation


7-3.3 Standard Error: Reporting a Point Estimate


7-3 General Concepts of Point Estimation

Example 7-5

7-3 General Concepts of Point Estimation

Example 7-5 (continued)

The probability that the true mean μ is within x̄ ± 2σ_X̄ is 0.9545.
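The 0.9545 figure is the probability that a normal random variable falls within 2 standard deviations of its mean, Φ(2) − Φ(−2); a one-line check with the standard library:

```python
# Phi(2) - Phi(-2) for the standard normal equals erf(2 / sqrt(2)).
import math

p = math.erf(2.0 / math.sqrt(2.0))
print(round(p, 4))  # 0.9545
```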

7-3 General Concepts of Point Estimation

7-3.4 Mean Square Error of an Estimator
There may be cases where we need to use a biased estimator, so we need another comparison measure:

Definition

7-3 General Concepts of Point Estimation

7-3.4 Mean Square Error of an Estimator

MSE(Θ̂) = E(Θ̂ − θ)²
        = E[Θ̂ − E(Θ̂) + E(Θ̂) − θ]²
        = V(Θ̂) + (bias)²

The MSE of Θ̂ is equal to the variance of the estimator plus the squared bias.
If Θ̂ is an unbiased estimator of θ, the MSE of Θ̂ is equal to the variance of Θ̂.

7-3 General Concepts of Point Estimation

7-3.4 Mean Square Error of an Estimator

V(Θ̂) + (bias)²
= E[Θ̂ − E(Θ̂)]² + [E(Θ̂) − θ]²
= E(Θ̂²) − E²(Θ̂) + E²(Θ̂) − 2θE(Θ̂) + θ²
= E(Θ̂²) − 2θE(Θ̂) + θ²
= E(Θ̂ − θ)²
= MSE(Θ̂)
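The identity MSE = variance + bias² can be checked empirically; a sketch using the biased divisor-n variance estimator as an illustrative example:

```python
# Estimate MSE, variance, and bias of the divisor-n variance estimator
# by Monte Carlo, and check MSE ~ variance + bias^2.
import random
import statistics

random.seed(5)
mu, sigma2, n, reps = 0.0, 4.0, 8, 40000
sigma = sigma2 ** 0.5

ests = []
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = statistics.fmean(xs)
    ests.append(sum((x - xbar) ** 2 for x in xs) / n)  # divisor n, biased

mse = statistics.fmean((e - sigma2) ** 2 for e in ests)
var_est = statistics.variance(ests)
bias = statistics.fmean(ests) - sigma2

print(mse)                   # approximately equals the next line
print(var_est + bias ** 2)
```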

7-3 General Concepts of Point Estimation

7-3.4 Mean Square Error of an Estimator

7-3 General Concepts of Point Estimation

7-3.4 Mean Square Error of an Estimator

Θ̂1 is biased but has small variance.
Θ̂2 is unbiased but has large variance.

An estimate based on Θ̂1 would more likely be close to the true value of θ than would an estimate based on Θ̂2.

7-3 General Concepts of Point Estimation

7-3.4 Mean Square Error of an Estimator
Exercise: Calculate the MSE of the following estimators.

Θ̂1 = X̄
Θ̂2 = Xi (a single observation)

7-4 Methods of Point Estimation

How can good estimators be obtained?
7-4.1 Method of Moments
The general idea: equate population moments, which are functions of the unknown parameters, to the corresponding sample moments, and solve for the parameters.

7-4 Methods of Point Estimation

Example 7-7

E(X) =
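As a hedged sketch of the method of moments (the exponential population is an illustrative assumption, not necessarily the model of Example 7-7): matching the first population moment E(X) = 1/λ to the sample mean x̄ gives the moment estimator λ̂ = 1/x̄.

```python
# Method-of-moments sketch for an assumed Exponential(lam) population:
# E(X) = 1/lam, so matching E(X) to the sample mean gives
# lam_hat = 1 / xbar.
import random
import statistics

random.seed(6)
lam = 2.0
xs = [random.expovariate(lam) for _ in range(50000)]

lam_hat = 1.0 / statistics.fmean(xs)
print(lam_hat)  # close to lam = 2.0
```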

7-4 Methods of Point Estimation

7-4.2 Method of Maximum Likelihood
Suppose X is a random variable with probability distribution f(x; θ), where θ is a single unknown parameter. For observations x1, x2, ..., xn of a random sample of size n, the likelihood function is L(θ) = f(x1; θ) · f(x2; θ) · ... · f(xn; θ), and the maximum likelihood estimator of θ is the value of θ that maximizes L(θ).

7-4 Methods of Point Estimation

Example 7-9

7-4 Methods of Point Estimation

Example 7-9 (continued)

Recall: if y = ln f(x), then dy/dx = f′(x) / f(x).
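As an illustrative sketch of maximizing a log-likelihood (the Bernoulli sample below is an assumption, not necessarily the data of Example 7-9): the closed-form MLE p̂ = Σxi/n should agree with a direct numeric search over the log-likelihood.

```python
# For an assumed Bernoulli(p) sample, compare the closed-form MLE
# p_hat = sum(x)/n with a grid search over the log-likelihood
# l(p) = sum(x) ln p + (n - sum(x)) ln(1 - p).
import math

xs = [1, 0, 1, 1, 0, 1, 1, 0, 1, 1]  # hypothetical 0/1 data
n, s = len(xs), sum(xs)

def loglik(p):
    return s * math.log(p) + (n - s) * math.log(1 - p)

grid = [i / 1000 for i in range(1, 1000)]
p_grid = max(grid, key=loglik)

print(s / n)   # closed-form MLE: 0.7
print(p_grid)  # grid search: 0.7
```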

7-4 Methods of Point Estimation

Example 7-12

Recall: if y = (ax + b)^m, then dy/dx = ma(ax + b)^(m−1).

7-4 Methods of Point Estimation

Example 7-12 (continued)

7-4 Methods of Point Estimation

Properties of the Maximum Likelihood Estimator

7-4 Methods of Point Estimation

Properties of the Maximum Likelihood Estimator
The MLE of σ² is

σ̂² = (1/n) Σ_{i=1}^{n} (Xi − X̄)²

E(σ̂²) = (n − 1)σ²/n

bias = E(σ̂²) − σ² = −σ²/n

The bias is negative: the MLE of σ² tends to underestimate σ².
The bias approaches zero as n increases, so the MLE of σ² is an asymptotically unbiased estimator of σ².
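A simulation sketch of this negative bias (illustrative values; the empirical bias should come out near −σ²/n):

```python
# Estimate the bias of the divisor-n MLE of sigma^2 by Monte Carlo.
import random
import statistics

random.seed(7)
mu, sigma2, n, reps = 0.0, 4.0, 5, 40000
sigma = sigma2 ** 0.5

mles = []
for _ in range(reps):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = statistics.fmean(xs)
    mles.append(sum((x - xbar) ** 2 for x in xs) / n)  # divisor n

bias = statistics.fmean(mles) - sigma2
print(bias)  # close to -sigma2/n = -0.8
```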

7-4 Methods of Point Estimation

The Invariance Property
If Θ̂1, Θ̂2, ..., Θ̂k are the maximum likelihood estimators of the parameters θ1, θ2, ..., θk, then the maximum likelihood estimator of any function h(θ1, θ2, ..., θk) of these parameters is the same function h(Θ̂1, Θ̂2, ..., Θ̂k) of the estimators.

7-4 Methods of Point Estimation

Example 7-13
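A hedged numeric sketch of the invariance property (whether this matches Example 7-13 exactly is an assumption): since the MLE of σ² under normal sampling is (1/n)Σ(xi − x̄)², invariance gives the MLE of σ as its square root.

```python
# Invariance sketch: the MLE of sigma is the square root of the MLE of
# sigma^2 = (1/n) * sum((x - xbar)^2).
import math
import statistics

xs = [10.2, 9.8, 10.5, 9.9, 10.1, 10.3]  # hypothetical data
n = len(xs)
xbar = statistics.fmean(xs)

sigma2_mle = sum((x - xbar) ** 2 for x in xs) / n
sigma_mle = math.sqrt(sigma2_mle)  # MLE of sigma by invariance

print(sigma2_mle)
print(sigma_mle)
```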

7-4 Methods of Point Estimation

Complications in Using Maximum Likelihood Estimation
It is not always easy to maximize the likelihood function, because the equation(s) obtained from dL(θ)/dθ = 0 may be difficult to solve.
It may not always be possible to use calculus methods directly to determine the maximum of L(θ).
