F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + ... + βkXk + u

H0: β2 = ... = βk = 0
H1: at least one β ≠ 0

This sequence describes two F tests of goodness of fit in a multiple regression model. The
first relates to the goodness of fit of the equation as a whole.

© Christopher Dougherty 1999–2006 1


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + ... + βkXk + u

H0: β2 = ... = βk = 0
H1: at least one β ≠ 0

We will consider the general case where there are k – 1 explanatory variables. For the F
test of goodness of fit of the equation as a whole, the null hypothesis, in words, is that the
model has no explanatory power at all.
© Christopher Dougherty 1999–2006 2
F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + ... + βkXk + u

H0: β2 = ... = βk = 0
H1: at least one β ≠ 0

Of course we hope to reject it and conclude that the model does have some explanatory
power.

© Christopher Dougherty 1999–2006 3


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + ... + βkXk + u

H0: β2 = ... = βk = 0
H1: at least one β ≠ 0

The model will have no explanatory power if it turns out that Y is unrelated to any of the
explanatory variables. Mathematically, therefore, the null hypothesis is that all the
coefficients 2, ..., k are zero.
© Christopher Dougherty 1999–2006 4
F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + ... + βkXk + u

H0: β2 = ... = βk = 0
H1: at least one β ≠ 0

The alternative hypothesis is that at least one of these β coefficients is different from zero.

© Christopher Dougherty 1999–2006 5


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + ... + βkXk + u

H0: β2 = ... = βk = 0
H1: at least one β ≠ 0

In the multiple regression model there is a difference between the roles of the F and t tests.
The F test tests the joint explanatory power of the variables, while the t tests test their
explanatory power individually.
© Christopher Dougherty 1999–2006 6
F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + ... + βkXk + u

H0: β2 = ... = βk = 0
H1: at least one β ≠ 0

In the simple regression model the F test was equivalent to the (two-sided) t test on the
slope coefficient because the ‘group’ consisted of just one variable.

© Christopher Dougherty 1999–2006 7


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + ... + βkXk + u

H0: β2 = ... = βk = 0
H1: at least one β ≠ 0

F(k − 1, n − k) = [ESS/(k − 1)] / [RSS/(n − k)]
               = [(ESS/TSS)/(k − 1)] / [(RSS/TSS)/(n − k)]
               = [R²/(k − 1)] / [(1 − R²)/(n − k)]

The F statistic for the test was defined in the last sequence in Chapter 2. ESS is the
explained sum of squares and RSS is the residual sum of squares.

© Christopher Dougherty 1999–2006 8


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + ... + βkXk + u

H0: β2 = ... = βk = 0
H1: at least one β ≠ 0

F(k − 1, n − k) = [ESS/(k − 1)] / [RSS/(n − k)]
               = [(ESS/TSS)/(k − 1)] / [(RSS/TSS)/(n − k)]
               = [R²/(k − 1)] / [(1 − R²)/(n − k)]

It can be expressed in terms of R2 by dividing the numerator and denominator by TSS, the
total sum of squares.

© Christopher Dougherty 1999–2006 9


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + ... + βkXk + u

H0: β2 = ... = βk = 0
H1: at least one β ≠ 0

F(k − 1, n − k) = [ESS/(k − 1)] / [RSS/(n − k)]
               = [(ESS/TSS)/(k − 1)] / [(RSS/TSS)/(n − k)]
               = [R²/(k − 1)] / [(1 − R²)/(n − k)]

ESS / TSS is the definition of R2. RSS / TSS is equal to (1 – R2). (See the last sequence in
Chapter 2.)
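
As a quick numerical check of this identity, here is a minimal sketch in Python. The sums of squares and sample size are taken from the Stata regression shown on the following slides; everything else is illustrative.

# Verify that F computed from ESS and RSS equals F computed from R-squared.
# Values are from the regression reported later in this sequence.
ESS = 1181.36981          # explained sum of squares
RSS = 2023.61353          # residual sum of squares
TSS = ESS + RSS           # total sum of squares
n, k = 540, 4             # observations; parameters (intercept + 3 slopes)

R2 = ESS / TSS

F_from_ss = (ESS / (k - 1)) / (RSS / (n - k))
F_from_R2 = (R2 / (k - 1)) / ((1 - R2) / (n - k))
print(F_from_ss, F_from_R2)   # both are approximately 104.3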

© Christopher Dougherty 1999–2006 10


F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u

The educational attainment model will be used as an example. We will suppose that S depends on ASVABC, the ability score, and on SM and SF, the highest grade completed by the mother and the father of the respondent, respectively.
© Christopher Dougherty 1999–2006 11
F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0

The null hypothesis for the F test of goodness of fit is that all three slope coefficients are
equal to zero. The alternative hypothesis is that at least one of them is non-zero.

© Christopher Dougherty 1999–2006 12


F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------

Here is the regression output using Data Set 21.
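
The output above is from Stata. To reproduce the regression elsewhere, a sketch along the following lines would work in Python with pandas and statsmodels; the file name eaef21.csv and its column names are assumptions for illustration only.

# Hypothetical reproduction of the Stata command ". reg S ASVABC SM SF".
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("eaef21.csv")                      # hypothetical data file
results = smf.ols("S ~ ASVABC + SM + SF", data=df).fit()

print(results.summary())                            # coefficients, t statistics, R-squared
print(results.fvalue, results.f_pvalue)             # the F statistic and its p-value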

© Christopher Dougherty 1999–2006 13


F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(k − 1, n − k) = [ESS/(k − 1)] / [RSS/(n − k)];  F(3, 536) = (1181/3) / (2024/536) = 104.3
In this example, k – 1, the number of explanatory variables, is equal to 3 and n – k, the
number of degrees of freedom, is equal to 536.

© Christopher Dougherty 1999–2006 14


F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(k − 1, n − k) = [ESS/(k − 1)] / [RSS/(n − k)];  F(3, 536) = (1181/3) / (2024/536) = 104.3
The numerator of the F statistic is the explained sum of squares divided by k – 1. In the
Stata output these numbers are given in the Model row.

© Christopher Dougherty 1999–2006 15


F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(k − 1, n − k) = [ESS/(k − 1)] / [RSS/(n − k)];  F(3, 536) = (1181/3) / (2024/536) = 104.3
The denominator is the residual sum of squares divided by the number of degrees of
freedom remaining.

© Christopher Dougherty 1999–2006 16


F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(k − 1, n − k) = [ESS/(k − 1)] / [RSS/(n − k)];  F(3, 536) = (1181/3) / (2024/536) = 104.3
Hence the F statistic is 104.3. All serious regression packages compute it for you as part of
the diagnostics in the regression output.

© Christopher Dougherty 1999–2006 17


F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
Fcrit, 0.1%(3, 500) = 5.51        F(3, 536) = (1181/3) / (2024/536) = 104.3
The critical value for F(3,536) is not given in the F tables, but we know it must be lower than
F(3,500), which is given. At the 0.1% level, this is 5.51. Hence we easily reject H0 at the
0.1% level.
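
If an exact critical value is preferred to the table bound, a statistics package will supply it directly. A minimal sketch using scipy (assumed available):

from scipy.stats import f

print(f.ppf(1 - 0.001, dfn=3, dfd=536))   # exact 0.1% critical value, roughly 5.5
print(f.sf(104.3, dfn=3, dfd=536))        # p-value of the observed F, effectively zero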
© Christopher Dougherty 1999–2006 18
F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
Fcrit, 0.1%(3, 500) = 5.51        F(3, 536) = (1181/3) / (2024/536) = 104.3
This result could have been anticipated because both ASVABC and SF have highly
significant t statistics. So we knew in advance that both β2 and β4 were non-zero.

© Christopher Dougherty 1999–2006 19


F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
Fcrit, 0.1%(3, 500) = 5.51        F(3, 536) = (1181/3) / (2024/536) = 104.3
It is unusual for the F statistic not to be significant if some of the t statistics are significant.
In principle it could happen though. Suppose that you ran a regression with 40 explanatory
variables, none being a true determinant of the dependent variable.
© Christopher Dougherty 1999–2006 20
F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
Fcrit, 0.1%(3, 500) = 5.51        F(3, 536) = (1181/3) / (2024/536) = 104.3
Then the F statistic should be low enough for H0 not to be rejected. However, if you are
performing t tests on the slope coefficients at the 5% level, with a 5% chance of a Type I
error, on average 2 of the 40 variables could be expected to have ‘significant’ coefficients.
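
The point can be illustrated with a small simulation, a sketch only, with an arbitrary seed and sample size chosen for illustration: regress a pure-noise dependent variable on 40 irrelevant regressors and count the 'significant' t statistics.

# Sketch: 40 irrelevant regressors, none a true determinant of y.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, k = 540, 40

X = sm.add_constant(rng.normal(size=(n, k)))   # 40 noise regressors plus an intercept
y = rng.normal(size=n)                         # y is unrelated to every regressor

results = sm.OLS(y, X).fit()

print((results.pvalues[1:] < 0.05).sum())      # 'significant' t tests, about 2 of 40 on average
print(results.f_pvalue)                        # the joint F test is usually not significant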
© Christopher Dougherty 1999–2006 21
F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
Fcrit, 0.1%(3, 500) = 5.51        F(3, 536) = (1181/3) / (2024/536) = 104.3
The opposite can easily happen, though. Suppose you have a multiple regression model
which is correctly specified and the R2 is high. You would expect to have a highly
significant F statistic.
© Christopher Dougherty 1999–2006 22
F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
Fcrit, 0.1%(3, 500) = 5.51        F(3, 536) = (1181/3) / (2024/536) = 104.3
However, if the explanatory variables are highly correlated and the model is subject to
severe multicollinearity, the standard errors of the slope coefficients could all be so large
that none of the t statistics is significant.
© Christopher Dougherty 1999–2006 23
F TESTS OF GOODNESS OF FIT

S = β1 + β2ASVABC + β3SM + β4SF + u
H0: β2 = β3 = β4 = 0
. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
Fcrit, 0.1%(3, 500) = 5.51        F(3, 536) = (1181/3) / (2024/536) = 104.3
In this situation you would know that your model is a good one, but you are not in a
position to pinpoint the contributions made by the explanatory variables individually.
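
This situation is easy to reproduce in a simulation, again only a sketch with assumed parameter values: two regressors that are almost perfectly correlated, and that both genuinely determine Y, give a highly significant F statistic while the individual t statistics are typically insignificant.

# Sketch: severe multicollinearity gives a significant F but insignificant t statistics.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 100

x2 = rng.normal(size=n)
x3 = x2 + 0.01 * rng.normal(size=n)            # x3 is almost identical to x2
y = 1.0 + x2 + x3 + rng.normal(size=n)         # both variables belong in the model

X = sm.add_constant(np.column_stack([x2, x3]))
results = sm.OLS(y, X).fit()

print(results.f_pvalue)                        # tiny: the F test rejects H0 decisively
print(results.pvalues[1:])                     # the individual t tests are typically not significant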

© Christopher Dougherty 1999–2006 24


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

We now come to the other F test of goodness of fit. This is a test of the joint explanatory
power of a group of variables when they are added to a regression model.

© Christopher Dougherty 1999–2006 25


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

For example, in the original specification, Y may be written as a simple function of X2. In
the second, we add X3 and X4.

© Christopher Dougherty 1999–2006 26


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

The null hypothesis for the F test is that neither X3 nor X4 belongs in the model. The
alternative hypothesis is that at least one of them does, perhaps both.

© Christopher Dougherty 1999–2006 27


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

For this F test, and for several others which we will encounter, it is useful to think of the F
statistic as having the structure indicated above.

© Christopher Dougherty 1999–2006 28


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

The ‘improvement’ is the reduction in the residual sum of squares when the change is
made, in this case, when the group of new variables is added.

© Christopher Dougherty 1999–2006 29


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

The ‘cost’ is the reduction in the number of degrees of freedom remaining after making the
change. In the present case it is equal to the number of new variables added, because that
number of new parameters is estimated.
© Christopher Dougherty 1999–2006 30
F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

(Remember that the number of degrees of freedom in a regression equation is the number
of observations, less the number of parameters estimated. In this example, it would fall
from n – 2 to n – 4 when X3 and X4 are added.)
© Christopher Dougherty 1999–2006 31
F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

The ‘remaining unexplained’ is the residual sum of squares after making the change.

© Christopher Dougherty 1999–2006 32


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

The ‘degrees of freedom remaining’ is the number of degrees of freedom remaining after
making the change.
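
Putting the four ingredients together, the test can be written as a small helper function. This is only a sketch; the function name and arguments are illustrative, not from the text, and the example call uses the residual sums of squares that appear on the following slides.

# F test of the joint explanatory power of a group of added variables.
from scipy.stats import f

def added_group_f(rss_restricted, rss_unrestricted, cost, df_remaining):
    improvement = rss_restricted - rss_unrestricted
    F = (improvement / cost) / (rss_unrestricted / df_remaining)
    p_value = f.sf(F, cost, df_remaining)      # upper-tail probability
    return F, p_value

print(added_group_f(2123.0, 2023.6, cost=2, df_remaining=536))   # F is about 13.2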

© Christopher Dougherty 1999–2006 33


F TESTS OF GOODNESS OF FIT

. reg S ASVABC

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 1, 538) = 274.19
Model | 1081.97059 1 1081.97059 Prob > F = 0.0000
Residual | 2123.01275 538 3.94612035 R-squared = 0.3376
-------------+------------------------------ Adj R-squared = 0.3364
Total | 3204.98333 539 5.94616574 Root MSE = 1.9865

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .148084 .0089431 16.56 0.000 .1305165 .1656516
_cons | 6.066225 .4672261 12.98 0.000 5.148413 6.984036
------------------------------------------------------------------------------

We will illustrate the test with an educational attainment example. Here is S regressed on
ASVABC using Data Set 21. We make a note of the residual sum of squares.

© Christopher Dougherty 1999–2006 34


F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------

Now we have added the highest grade completed by each parent. Does parental education
have a significant impact? Well, we can see that a t test would show that SF has a highly
significant coefficient, but we will perform the F test anyway. We make a note of RSS.
© Christopher Dougherty 1999–2006 35
F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(2, 540 − 4) = [(RSS1 − RSS2)/2] / [RSS2/(540 − 4)] = [(2123.0 − 2023.6)/2] / [2023.6/536] = 13.16

The improvement in the fit on adding the parental variables is the reduction in the residual
sum of squares.

© Christopher Dougherty 1999–2006 36


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(2, 540 − 4) = [(RSS1 − RSS2)/2] / [RSS2/(540 − 4)] = [(2123.0 − 2023.6)/2] / [2023.6/536] = 13.16

The cost is 2 degrees of freedom because 2 additional parameters have been estimated.

© Christopher Dougherty 1999–2006 37


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(2, 540 − 4) = [(RSS1 − RSS2)/2] / [RSS2/(540 − 4)] = [(2123.0 − 2023.6)/2] / [2023.6/536] = 13.16

The remaining unexplained is the residual sum of squares after adding SM and SF.

© Christopher Dougherty 1999–2006 38


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(2, 540 − 4) = [(RSS1 − RSS2)/2] / [RSS2/(540 − 4)] = [(2123.0 − 2023.6)/2] / [2023.6/536] = 13.16

The number of degrees of freedom remaining is n – k, that is, 540 – 4 = 536.

© Christopher Dougherty 1999–2006 39


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(2, 540 − 4) = [(RSS1 − RSS2)/2] / [RSS2/(540 − 4)] = [(2123.0 − 2023.6)/2] / [2023.6/536] = 13.16

The F statistic is 13.16.

© Christopher Dougherty 1999–2006 40


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + u                          RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β3 = β4 = 0
H1: β3 ≠ 0 or β4 ≠ 0 or both β3 and β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(2, 540 − 4) = [(RSS1 − RSS2)/2] / [RSS2/(540 − 4)] = [(2123.0 − 2023.6)/2] / [2023.6/536] = 13.16
Fcrit, 0.1%(2, 500) = 7.00
The critical value of F(2,500) at the 0.1% level is 7.00. The critical value of F(2,536) must be
lower, so we reject H0 and conclude that the parental education variables do have
significant joint explanatory power.
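
The exact critical value and p-value can be checked directly; a brief sketch using scipy (assumed available):

from scipy.stats import f

print(f.ppf(1 - 0.001, dfn=2, dfd=536))   # about 7.0, slightly below the F(2,500) table value
print(f.sf(13.16, dfn=2, dfd=536))        # far below 0.001, so H0 is rejected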
© Christopher Dougherty 1999–2006 41
F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + β3X3 + u                   RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

This sequence will conclude by showing that t tests are equivalent to marginal F tests when
the additional group of variables consists of just one variable.

© Christopher Dougherty 1999–2006 42


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + β3X3 + u                   RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

Suppose that in the original model Y is a function of X2 and X3, and that in the revised model
X4 is added.

© Christopher Dougherty 1999–2006 43


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + β3X3 + u                   RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β4 = 0
H1: β4 ≠ 0

The null hypothesis for the F test of the explanatory power of the additional ‘group’ is that
all the new slope coefficients are equal to zero. There is of course only one new slope
coefficient, β4.
© Christopher Dougherty 1999–2006 44
F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + β3X3 + u                   RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β4 = 0
H1: β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

The F test has the usual structure. We will illustrate it with an educational attainment model
where S depends on ASVABC and SM in the original model and on SF as well in the revised
model.
© Christopher Dougherty 1999–2006 45
F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 2, 537) = 147.36
Model | 1135.67473 2 567.837363 Prob > F = 0.0000
Residual | 2069.30861 537 3.85346109 R-squared = 0.3543
-------------+------------------------------ Adj R-squared = 0.3519
Total | 3204.98333 539 5.94616574 Root MSE = 1.963

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1328069 .0097389 13.64 0.000 .1136758 .151938
SM | .1235071 .0330837 3.73 0.000 .0585178 .1884963
_cons | 5.420733 .4930224 10.99 0.000 4.452244 6.389222
------------------------------------------------------------------------------

Here is the regression of S on ASVABC and SM. We make a note of the residual sum of
squares.

© Christopher Dougherty 1999–2006 46


F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------

Now we add SF and again make a note of the residual sum of squares.

© Christopher Dougherty 1999–2006 47


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + β3X3 + u                   RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β4 = 0
H1: β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(1, 540 − 4) = [(RSS1 − RSS2)/1] / [RSS2/(540 − 4)] = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10

The improvement on adding SF is the reduction in the residual sum of squares.

© Christopher Dougherty 1999–2006 48


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + β3X3 + u                   RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β4 = 0
H1: β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(1, 540 − 4) = [(RSS1 − RSS2)/1] / [RSS2/(540 − 4)] = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10

The cost is just the single degree of freedom lost when estimating β4.

© Christopher Dougherty 1999–2006 49


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + β3X3 + u                   RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β4 = 0
H1: β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(1, 540 − 4) = [(RSS1 − RSS2)/1] / [RSS2/(540 − 4)] = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10

The remaining unexplained is the residual sum of squares after adding SF.

© Christopher Dougherty 1999–2006 50


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + β3X3 + u                   RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β4 = 0
H1: β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(1, 540 − 4) = [(RSS1 − RSS2)/1] / [RSS2/(540 − 4)] = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10

The number of degrees of freedom remaining after adding SF is 540 – 4 = 536.

© Christopher Dougherty 1999–2006 51


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + β3X3 + u                   RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β4 = 0
H1: β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(1, 540 − 4) = [(RSS1 − RSS2)/1] / [RSS2/(540 − 4)] = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10

Hence the F statistic is 12.10.

© Christopher Dougherty 1999–2006 52


F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + β3X3 + u                   RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β4 = 0
H1: β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(1, 540 − 4) = [(RSS1 − RSS2)/1] / [RSS2/(540 − 4)] = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10
Fcrit, 0.1%(1, 500) = 10.96
The critical value of F at the 0.1% significance level with 500 degrees of freedom is 10.96.
The critical value with 536 degrees of freedom must be lower, so we reject H0 at the 0.1%
level.
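
The same arithmetic as a self-contained sketch, with the residual sums of squares taken from the two regressions above and scipy (assumed available) supplying the exact critical value:

from scipy.stats import f

RSS1, RSS2, n, k = 2069.3, 2023.6, 540, 4

F = ((RSS1 - RSS2) / 1) / (RSS2 / (n - k))
print(F)                                   # about 12.10
print(f.ppf(1 - 0.001, dfn=1, dfd=n - k))  # about 10.9, so H0 is rejected at the 0.1% level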
© Christopher Dougherty 1999–2006 53
F TESTS OF GOODNESS OF FIT

Y = β1 + β2X2 + β3X3 + u                   RSS1
Y = β1 + β2X2 + β3X3 + β4X4 + u            RSS2

H0: β4 = 0
H1: β4 ≠ 0

F(cost, d.f. remaining) = (improvement / cost) / (remaining unexplained / degrees of freedom remaining)

F(1, 540 − 4) = [(RSS1 − RSS2)/1] / [RSS2/(540 − 4)] = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10
Fcrit, 0.1%(1, 500) = 10.96
The null hypothesis we are testing is exactly the same as for a two-sided t test on the
coefficient of SF.

© Christopher Dougherty 1999–2006 54


F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(1, 536) = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10        Fcrit, 0.1% = 10.96

We will perform the t test. The t statistic is 3.48.

© Christopher Dougherty 1999–2006 55


F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(1, 536) = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10        Fcrit, 0.1% = 10.96
tcrit, 0.1% = 3.31
The critical value of t at the 0.1% level with 500 degrees of freedom is 3.31. The critical
value with 536 degrees of freedom must be lower. So we reject H0 again.

© Christopher Dougherty 1999–2006 56


F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(1, 536) = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10        Fcrit, 0.1% = 10.96
3.48² = 12.11        tcrit, 0.1% = 3.31
It can be shown that the F statistic for the F test of the explanatory power of a ‘group’ of
one variable must be equal to the square of the t statistic for that variable. (The difference
in the last digit is due to rounding error.)
© Christopher Dougherty 1999–2006 57
F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(1, 536) = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10        Fcrit, 0.1% = 10.96
3.48² = 12.11        tcrit, 0.1% = 3.31        3.31² = 10.96
It can also be shown that the critical value of F must be equal to the square of the critical
value of t. (The critical values shown are for 500 degrees of freedom, but this must also be
true for 536 degrees of freedom.)
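
Both equivalences can be verified numerically; a sketch assuming scipy is available:

from scipy.stats import f, t

df = 536                                   # degrees of freedom remaining

t_crit = t.ppf(1 - 0.001 / 2, df)          # two-sided 0.1% critical value of t
F_crit = f.ppf(1 - 0.001, 1, df)           # 0.1% critical value of F(1, 536)
print(t_crit ** 2, F_crit)                 # identical

print(3.48 ** 2)                           # 12.11, reproducing the F statistic up to rounding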
© Christopher Dougherty 1999–2006 58
F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(1, 536) = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10        Fcrit, 0.1% = 10.96
3.48² = 12.11        tcrit, 0.1% = 3.31        3.31² = 10.96
Hence the conclusions of the two tests must coincide.

© Christopher Dougherty 1999–2006 59


F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(1, 536) = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10        Fcrit, 0.1% = 10.96
3.48² = 12.11        tcrit, 0.1% = 3.31        3.31² = 10.96
This result means that the t test of the coefficient of a variable is a test of its marginal
explanatory power, after all the other variables have been included in the equation.

© Christopher Dougherty 1999–2006 60


F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(1, 536) = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10        Fcrit, 0.1% = 10.96
3.48² = 12.11        tcrit, 0.1% = 3.31        3.31² = 10.96
If the variable is correlated with one or more of the other variables, its marginal explanatory
power may be quite low, even if it genuinely belongs in the model.

© Christopher Dougherty 1999–2006 61


F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(1, 536) = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10        Fcrit, 0.1% = 10.96
3.48² = 12.11        tcrit, 0.1% = 3.31        3.31² = 10.96
If all the variables are correlated, it is possible for all of them to have low marginal
explanatory power and for none of the t tests to be significant, even though the F test for
their joint explanatory power is highly significant.
© Christopher Dougherty 1999–2006 62
F TESTS OF GOODNESS OF FIT

. reg S ASVABC SM SF

Source | SS df MS Number of obs = 540


-------------+------------------------------ F( 3, 536) = 104.30
Model | 1181.36981 3 393.789935 Prob > F = 0.0000
Residual | 2023.61353 536 3.77539837 R-squared = 0.3686
-------------+------------------------------ Adj R-squared = 0.3651
Total | 3204.98333 539 5.94616574 Root MSE = 1.943

------------------------------------------------------------------------------
S | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
ASVABC | .1257087 .0098533 12.76 0.000 .1063528 .1450646
SM | .0492424 .0390901 1.26 0.208 -.027546 .1260309
SF | .1076825 .0309522 3.48 0.001 .04688 .1684851
_cons | 5.370631 .4882155 11.00 0.000 4.41158 6.329681
------------------------------------------------------------------------------
F(1, 536) = [(2069.3 − 2023.6)/1] / [2023.6/536] = 12.10        Fcrit, 0.1% = 10.96
3.48² = 12.11        tcrit, 0.1% = 3.31        3.31² = 10.96
If this is the case, the model is said to be suffering from the problem of multicollinearity
discussed in the previous sequence.

© Christopher Dougherty 1999–2006 63


Copyright Christopher Dougherty 1999–2006. This slideshow may be freely copied for
personal use.

© Christopher Dougherty 1999–2006 22.08.06
