Lecture 4
Hypothesis Testing
• We may wish to test prior hypotheses about the
coefficients we estimate.
• We can use the estimates to test whether the data rejects
our hypothesis.
• An example might be that we wish to test whether an
elasticity is equal to one.
• We may wish to test the hypothesis that X has no impact
on the dependent variable Y.
• We may wish to construct a confidence interval for our
coefficients.
• A hypothesis takes the form of a statement of the true
value for a coefficient or for an expression involving the
coefficient.
• The hypothesis to be tested is called the null hypothesis.
• The hypothesis against which it is tested is called the
alternative hypothesis.
• Rejecting the null hypothesis does not imply accepting the
alternative
• We will now consider testing the simple hypothesis that
the slope coefficient is equal to some fixed value.
Setting up the hypothesis
Y_i = a + b X_i + u_i
• We wish to test the hypothesis that b = d, where d is some known
value (for example zero), against the hypothesis that b is not
equal to d. We write this as follows
H0 : b = d
Ha : b ≠ d
• To test the hypothesis we need to know the way that our
estimator is distributed.
• We start with the simple case where we assume that the
error term in the regression model is a normal random
variable with mean zero and variance σ². This is written
as u ~ N(0, σ²).
• Now recall that the OLS estimator can be written as

b̂ = b + ∑_{i=1}^N w_i u_i,   where w_i = (X_i − X̄) / ∑_{j=1}^N (X_j − X̄)²

• and the error variance is estimated from the residuals by

σ̂² = ∑_{i=1}^N û_i² / (N − 2)
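As a quick numerical check, the two representations of the slope estimator above can be compared on simulated data (a sketch with made-up numbers; all variable names are illustrative):

```python
import numpy as np

# Illustrative data (made up for this sketch)
rng = np.random.default_rng(0)
N = 50
X = rng.uniform(0, 10, N)
u = rng.normal(0, 2, N)          # u ~ N(0, sigma^2) with sigma = 2
a, b = 1.0, 0.5
Y = a + b * X + u

# Weights representation: b_hat = b + sum_i w_i u_i
Xc = X - X.mean()
w = Xc / (Xc ** 2).sum()
b_hat = b + (w * u).sum()

# Same number from the usual covariance formula
b_hat_direct = (Xc * (Y - Y.mean())).sum() / (Xc ** 2).sum()

# Error-variance estimator: sigma_hat^2 = sum of squared residuals / (N - 2)
a_hat = Y.mean() - b_hat_direct * X.mean()
resid = Y - a_hat - b_hat_direct * X
sigma2_hat = (resid ** 2).sum() / (N - 2)

print(b_hat, b_hat_direct, sigma2_hat)
```

The two slope computations agree exactly, and σ̂² should come out near the true value of 4 used in the simulation.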
• Return now to hypothesis testing. Under the null hypothesis
b = d. Hence it must be the case that

z = (b̂ − d) / √Var(b̂) ~ N(0, 1)
• We now replace the variance by its estimated value to obtain a
test statistic:

z* = (b̂ − d) / √( σ̂² / ∑_{i=1}^N (X_i − X̄)² )

• Because σ̂² is estimated rather than known, the statistic follows
the t distribution with N − 2 degrees of freedom:

z* ~ t_{N−2}
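A minimal sketch of the full test on simulated data, assuming SciPy is available for the t critical value (the data and parameter values are made up):

```python
import numpy as np
from scipy import stats

# Simulated data with true slope 0.5 (illustrative values)
rng = np.random.default_rng(1)
N = 50
X = rng.uniform(0, 10, N)
Y = 1.0 + 0.5 * X + rng.normal(0, 2, N)

# OLS fit
Xc = X - X.mean()
b_hat = (Xc * (Y - Y.mean())).sum() / (Xc ** 2).sum()
a_hat = Y.mean() - b_hat * X.mean()
resid = Y - a_hat - b_hat * X
sigma2_hat = (resid ** 2).sum() / (N - 2)

# Test H0: b = d against Ha: b != d
d = 0.0
se = np.sqrt(sigma2_hat / (Xc ** 2).sum())   # standard error of b_hat
z_star = (b_hat - d) / se

# Two-sided 5% critical value from the t distribution with N - 2 df
crit = stats.t.ppf(0.975, df=N - 2)
print(z_star, crit, abs(z_star) > crit)
```

Since the true slope here is 0.5 and the null sets d = 0, the statistic will typically exceed the critical value and the null is rejected.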
Number of obs = 51
------------------------------------------------------------------------------
         lbp |      Coef.   Std. Err.      t    P>|t|     [95% Conf. Interval]
-------------+----------------------------------------------------------------
   log price |  -.8421586   .1195669   -7.04   0.000    -1.082437    -.6018798
       _cons |    4.52206   .1600375   28.26   0.000     4.200453     4.843668
------------------------------------------------------------------------------
• The statistic for the hypothesis that the elasticity is equal to one
(H0: b = −1) is z* = (−.8421586 − (−1)) / .1195669 ≈ 1.32, which is
below the 5% critical value, so the null is not rejected.
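The arithmetic can be reproduced directly from the reported coefficient and standard error, taking the null as b = −1 (an elasticity of one in absolute value):

```python
# Reported values from the regression output above
b_hat = -0.8421586
se = 0.1195669

# H0: the elasticity equals one in absolute value, i.e. b = -1
d = -1.0
z_star = (b_hat - d) / se
print(round(z_star, 2))   # about 1.32, below 2.01 (5% critical value of t with 49 df)
```

With 51 observations and 2 estimated coefficients there are 49 degrees of freedom, so |z*| ≈ 1.32 < 2.01 and the null is not rejected at the 5% level.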
The Central Limit Theorem
• Let v_1, …, v_N be independent draws from a distribution with
mean µ and variance s². The central limit theorem states that

(1/√N) ∑_{i=1}^N (v_i − µ) ~ᵃ N(0, s²)

• where the symbol ~ᵃ reads “distributed asymptotically”,
i.e. as the sample size N tends to infinity.
• This extends to weighted sums. Let µ = 0. Then we also have that

(1/√N) ∑_{i=1}^N w_i v_i ~ᵃ N( 0, s² · plim_{N→∞} (1/N) ∑_{i=1}^N w_i² )

• We require the limit to be finite: plim_{N→∞} (1/N) ∑ w_i² < ∞.
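A small Monte Carlo sketch of the weighted CLT (the weights and the distribution of v_i are chosen arbitrarily for illustration): the variance of the scaled weighted sum should approach s² times the limit of (1/N)∑w_i².

```python
import numpy as np

rng = np.random.default_rng(2)
N, reps = 400, 20000

# v_i: iid, mean 0, variance s^2 = 1/3 (uniform on [-1, 1])
s2 = 1.0 / 3.0

# Fixed bounded weights, so (1/N) * sum(w_i^2) stays finite as required
w = 1.0 + 0.5 * np.sin(np.arange(N))
limit = (w ** 2).mean()                 # stands in for plim (1/N) sum w_i^2

v = rng.uniform(-1, 1, size=(reps, N))
T = (v * w).sum(axis=1) / np.sqrt(N)    # (1/sqrt(N)) sum_i w_i v_i, one per replication

print(T.mean(), T.var(), s2 * limit)    # mean near 0, variances close to each other
```

Across replications, T behaves like a draw from N(0, s² · (1/N)∑w_i²), matching the displayed result.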
Applying the CLT to the slope coefficient for OLS
• Recall that the OLS estimator can be written as

b̂ − b = ∑_{i=1}^N (X_i − X̄)(u_i − ū) / ∑_{i=1}^N (X_i − X̄)²  =  ∑_{i=1}^N w_i u_i

• Applying the CLT to this weighted sum gives

√N (b̂ − b) = [ (1/√N) ∑_{i=1}^N (X_i − X̄) u_i ] / [ (1/N) ∑_{i=1}^N (X_i − X̄)² ]
~ᵃ N( 0, σ² / plim_{N→∞} (1/N) ∑_{i=1}^N (X_i − X̄)² )
z* = (b̂ − d) / √( σ̂² / ∑_{i=1}^N (X_i − X̄)² )  ~ᵃ  N(0, 1)
• Note how the Ns cancel from the top and bottom. In fact the test
statistic is identical to the one we used under normality. The
only difference is that now we will use the critical values of the
Normal distribution. For a size of 5% these are +1.96 and −1.96.
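The size of the asymptotic test can be checked by simulation (a sketch with arbitrary parameter choices): under the null, rejecting whenever |z*| > 1.96 should happen close to 5% of the time.

```python
import numpy as np

rng = np.random.default_rng(3)
N, reps, d = 200, 5000, 0.5

X = rng.uniform(0, 10, N)       # keep the regressors fixed across replications
Xc = X - X.mean()
ssx = (Xc ** 2).sum()

rejections = 0
for _ in range(reps):
    Y = 1.0 + d * X + rng.normal(0, 2, N)   # true slope equals d, so H0 holds
    b_hat = (Xc * (Y - Y.mean())).sum() / ssx
    resid = Y - (Y.mean() - b_hat * X.mean()) - b_hat * X
    sigma2_hat = (resid ** 2).sum() / (N - 2)
    z_star = (b_hat - d) / np.sqrt(sigma2_hat / ssx)
    if abs(z_star) > 1.96:                  # Normal critical values at 5% size
        rejections += 1

rate = rejections / reps
print(rate)                                 # close to 0.05
```

The rejection rate is slightly above 5% at moderate N (the exact critical value comes from t_{N−2}, which is a little larger than 1.96), and converges to 5% as N grows.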
• The expression in the denominator is nothing
but the standard error of the estimator.