Econ 314: Quantitative Economics

Tutorial 8 - Multicollinearity

Please print and attempt prior to the tutorial session on 29 September.

1. What is the difference between perfect and imperfect multicollinearity?

In perfect collinearity there is an exact linear relationship between two or more independent variables. As a result, it is not possible to estimate the regression model: the matrix of regressors does not have full rank, so the OLS estimator is not defined.

In imperfect collinearity the linear relationship between the explanatory variables is approximate rather than exact. It is possible to estimate the regression model, but if the collinearity is high there are consequences: the estimates remain unbiased, but their standard errors are inflated, so individual coefficients tend to appear insignificant.
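
The contrast is easy to see with simulated data. A minimal Python sketch (all variable names are illustrative, not from the tutorial data):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 100
x1 = rng.normal(size=n)
y = 1 + 0.5 * x1 + rng.normal(size=n)

# Perfect collinearity: x2 is an exact linear function of x1, so the
# design matrix [1, x1, x2] has rank 2, not 3 -- OLS has no unique solution.
x2 = 2 * x1 + 3
X = sm.add_constant(np.column_stack([x1, x2]))
print(np.linalg.matrix_rank(X))  # 2

# Imperfect collinearity: add a little noise and the model is estimable,
# but the standard errors on x1 and x2 are hugely inflated.
x2_noisy = x2 + rng.normal(scale=0.01, size=n)
X_noisy = sm.add_constant(np.column_stack([x1, x2_noisy]))
print(sm.OLS(y, X_noisy).fit().summary())
```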

2. Consider the model:

Yᵢ = B₁ + B₂Xᵢ + B₃Xᵢ² + B₄Xᵢ³ + uᵢ

where: Y = the total cost of production
X = output

Since X² and X³ are functions of X, there is perfect collinearity. Do you agree or disagree? Explain.

Disagree. The variables X² and X³ are nonlinear functions of X, so there is no exact linear relationship among the regressors. Hence their inclusion does not violate the classical linear regression model (CLRM) assumption of "no exact linear relationship among explanatory variables".
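
This can be verified numerically. A minimal sketch with simulated output levels (names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)
x = rng.uniform(1, 10, size=50)  # simulated output levels

# Columns: constant, X, X^2, X^3 -- the cubic cost specification above.
X = np.column_stack([np.ones_like(x), x, x**2, x**3])

# Full column rank (4): the powers of X are not linearly dependent,
# so X'X is invertible and OLS is well defined.
print(np.linalg.matrix_rank(X))  # 4
```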

3.

(a) Interpretation of both regressions:

The R² values of both models are high, indicating that the explanatory power of both models is excellent. In both models the coefficients of log K and log H are partial elasticities and should be interpreted as such.

The trend variable (t) in the second model should be interpreted as follows: holding the other variables constant, the index of production has been increasing at an annual rate of about 2.7%.
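
The 2.7% reading follows from the semi-log role of the trend term. A sketch of the derivation, assuming (consistent with the figure above) an estimated trend coefficient of about 0.027 in a model of this form:

```latex
% The trend coefficient in a log-linear model is an instantaneous growth rate:
\ln Y_t = \beta_1 + \beta_2 \ln K_t + \beta_3 \ln H_t + \beta_4 t + u_t
% so that
\frac{\partial \ln Y_t}{\partial t} = \beta_4 \approx 0.027
% i.e. output grows by roughly 2.7% per year, other variables held constant.
```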

(b) The likely explanation is that the trend variable (perhaps representing technology) and log K are collinear.

(c) A high pairwise correlation is a very strong indicator of the existence of collinearity. As previously explained, it is a sufficient (but not necessary) condition for the presence of multicollinearity.

(d) Using the F test (the R² variety) we test the overall significance of the regression. As shown below, the regression is overall significant at an α level beyond 1%, so we reject the null hypothesis that all the slope coefficients are simultaneously equal to zero. A highly significant F statistic alongside weak individual t ratios is a classic symptom, so this is a further indication that Model 2 may be suffering from multicollinearity.

H0: β2 = β3 = β4 = 0
H1: at least one βi ≠ 0

F_calc = [R² / (k − 1)] / [(1 − R²) / (n − k)] = [0.889 / (4 − 1)] / [(1 − 0.889) / (21 − 4)] = 45.3844

The critical value is F(3, 17) = 5.18 at α = 1%. Since F_calc = 45.38 > 5.18, we reject the null hypothesis: the βi are not all simultaneously equal to zero, and the regression is overall significant.
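
Both numbers are easy to reproduce. A minimal Python sketch (scipy's F distribution; the figures are those given above):

```python
from scipy.stats import f

R2, n, k = 0.889, 21, 4

# F statistic of overall significance (the R-squared variety).
F_calc = (R2 / (k - 1)) / ((1 - R2) / (n - k))
print(F_calc)  # 45.3844 (to four decimal places)

# Critical value at alpha = 1% with (k-1, n-k) = (3, 17) degrees of freedom.
print(f.ppf(0.99, k - 1, n - k))  # approximately 5.18, as tabulated
```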

(e) The returns to scale in Model 1: 0.887 + 0.893 = 1.780. Since the sum of the elasticities exceeds 1, the estimated model suggests increasing returns to scale.
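
In a log-log (Cobb-Douglas) specification, returns to scale are the sum of the output elasticities. A sketch, assuming Model 1 takes the form below with estimated elasticities 0.887 and 0.893:

```latex
% Cobb-Douglas in logs: the coefficients on log K and log H are elasticities.
\ln Y_i = \beta_1 + \beta_2 \ln K_i + \beta_3 \ln H_i + u_i
% Scaling both inputs by a factor c scales output by c^{\beta_2 + \beta_3}:
% returns to scale = \beta_2 + \beta_3 = 0.887 + 0.893 = 1.780 > 1 (increasing).
```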

4.
Source | SS df MS Number of obs = 88
-------------+------------------------------ F( 3, 84) = 129.97
Model | 755162.538 3 251720.846 Prob > F = 0.0000
Residual | 162691.968 84 1936.80914 R-squared = 0.9227
-------------+------------------------------ Adj R-squared = 0.9164
Total | 917854.506 87 10550.0518 Root MSE = 44.009

------------------------------------------------------------------------------
price | Coef. Std. Err. t P>|t| [95% Conf. Interval]
-------------+----------------------------------------------------------------
assess | .9155515 .1052728 8.70 0.000 .706205 1.124898
lndsize | .0425625 .0468901 0.91 0.371 -.0528361 .1379612
hossize | -.1677938 .6554645 -0.26 0.799 -1.472723 1.137136
_cons | -15.85658 17.33887 -0.91 0.363 -50.33683 18.62367
------------------------------------------------------------------------------
Looking at the output results, do you suspect multicollinearity? Explain fully.

The following are indicators that multicollinearity is present in the data:

(a) A high R² value (and a statistically significant F statistic) but only one statistically significant explanatory variable; the other two explanatory variables appear to be insignificant.
(b) On theoretical grounds we expect a positive relationship between house size and house prices, but the estimated coefficient has a negative sign.

Further tests are therefore warranted to test for multicollinearity, for example computing variance inflation factors (VIFs), as sketched below.
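
A minimal Python sketch of a VIF check (the regressor names mirror the Stata output above; the data file name is hypothetical):

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.stats.outliers_influence import variance_inflation_factor

# Hypothetical file containing the house-price data regressed above.
df = pd.read_csv("houseprices.csv")

X = sm.add_constant(df[["assess", "lndsize", "hossize"]])

# VIF_j = 1 / (1 - R_j^2), where R_j^2 comes from regressing regressor j on
# the remaining regressors; values above about 10 are usually read as
# a sign of serious collinearity.
for j, name in enumerate(X.columns):
    print(name, variance_inflation_factor(X.values, j))
```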
