Chapter - 03
------------------------
3.7
A) summary statistics
[Stata output: summary statistics table]
B) simple regression
[Stata output: regression table]
t-value: 2.03
p-value: 0.044
R-squared: 0.0301
[Stata output: regression table (salary on years with the company)]
The point estimate implies that each additional year with the company is associated with an increase in salary of approximately £253.52.
t-value: 0.55
p-value: 0.586
Not statistically significant (p-value > 0.05), indicating no meaningful relationship between years with the company and salary.
R-squared: 0.0022
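The Stata commands behind these results are not shown above; a minimal sketch, assuming hypothetical variable names `salary` and `years` (years with the company), would be:

* summary statistics for the variables of interest
. summarize salary years

* simple regression of salary on years with the company
. regress salary years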
____________________________________________________________________________________
_________________________________________________________________
Chapter: 04
-------------
4.1:
(A)
[Stata output: regression table]
(B)
[Stata output: regression table]
(C)
[Stata output: regression table]
(E)
Wald test
( 1) - 2*faminc + cigs = 0
F( 1, 1385) = 42.35
Conclusion: Both family income and cigarette consumption significantly affect birth weight: family income has a positive effect, whereas cigarette consumption has a negative effect. The Wald test rejects the restriction that the cigarette coefficient equals twice the family-income coefficient, so the two effects are not simply proportional to one another.
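The commands for this Wald test are not reproduced above; a minimal sketch, assuming the regression uses `faminc` and `cigs` as regressors and a birth-weight dependent variable named `bwght` (the dependent variable name is an assumption):

. regress bwght faminc cigs
* Wald test of the restriction that the cigs coefficient equals twice the faminc coefficient
* (displayed by Stata as: - 2*faminc + cigs = 0)
. test cigs = 2*faminc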
4.2
(A)
gen lnwage = log(wage)
[Stata output: regression table]
(B)
Wald test
( 1) - educ + exper = 0
F( 1, 896) = 95.74
(C)
Redundant-variable test
. test exper = 0
( 1) exper = 0
F( 1, 896) = 20.10
(D)
[Stata output: regression table]
Conclusion: Education, experience, and tenure all have positive and statistically significant effects on wages. The Wald and redundancy tests above confirm that the education and experience coefficients are statistically distinct and that experience contributes significantly to explaining wage variation.
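A sketch of the full sequence, assuming the regressors are `educ`, `exper`, and `tenure` (the exact specification is inferred from the conclusion above):

. gen lnwage = log(wage)
. regress lnwage educ exper tenure
* Wald test that the exper and educ coefficients are equal (displayed as - educ + exper = 0)
. test exper = educ
* redundancy test: can exper be dropped from the model?
. test exper = 0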
4.3
(A)
[Stata output: regression table]
(B)
. test lnY=0
( 1) lnY = 0
F( 1, 23) = 780.62
(C)
. test lnY=1
( 1) lnY = 1
F( 1, 23) = 26.79
Both restrictions are rejected: the coefficient on lnY is highly significant (lnY = 0 is rejected), and it is also significantly different from one (lnY = 1 is rejected). The interest rate, by contrast, does not have a statistically significant effect in this model.
____________________________________________________________________________________
_____________________________________________________________________
Chapter: 05
------------
5.1
[Stata output: regression table for log_Imports (Coefficient, Std. err., t, P>|t|, 95% conf. interval)]
(obs=75)
[Stata output: correlation matrix of log_Imports and the regressors]
[Stata output: additional regression tables]
. vif
[Stata output: VIF table]
The regressors are statistically significant, with GDP having the larger impact. The high R-squared values in the regressions indicate that these models explain a substantial portion of the variance in imports. The strong correlations among the variables suggest that they are interrelated, which is consistent with the significant regression coefficients. The VIF analysis shows no multicollinearity issues.
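A minimal sketch of the commands behind this exercise; `log_Imports` is taken from the output above, while the regressor names `log_GDP` and `log_CPI` are illustrative assumptions:

. regress log_Imports log_GDP log_CPI
* pairwise correlations among the variables
. correlate log_Imports log_GDP log_CPI
* variance inflation factors after the most recent regression
. vif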
5.2
[Stata output: regression table]
(obs=23)
[Stata output: correlation matrix of log_Imports and the regressors]
[Stata output: additional regression tables]
. vif
[Stata output: VIF table]
5.3
. gen log_M4 = log(M4)
. gen log_Y = log(Y)
[Stata output: regression table (model without log_R2)]
. reg log_M4 log_Y log_R1 log_R2
[Stata output: regression table]
(obs=38)
[Stata output: correlation matrix of log_M4 and the regressors]
[Stata output: additional output tables]
. vif
[Stata output: VIF table]
The regression analysis shows that `log_Y` is a significant positive predictor of `log_M4`,
while `log_R1` has a significant negative relationship. Including `log_R2` in the model
slightly improves the fit, but both `log_R1` and `log_R2` remain significant negative
predictors of `log_M4`. The model explains 97.52% of the variance in `log_M4`, indicating
a strong fit. The correlation matrix shows moderate correlations between `log_R1` and
`log_R2`, but the VIF values indicate no multicollinearity issues.
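A sketch of the full sequence, including the first model without `log_R2`, whose command was not captured above (that first specification is an assumption based on the conclusion; `log_R1` and `log_R2` are assumed to exist already):

. gen log_M4 = log(M4)
. gen log_Y = log(Y)
* model without log_R2 (assumed specification for the first regression)
. reg log_M4 log_Y log_R1
* model with log_R2 added
. reg log_M4 log_Y log_R1 log_R2
. correlate log_M4 log_Y log_R1 log_R2
. vif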
____________________________________________________________________________________
_______________________________________________________________________
Chapter 06
-----------
6.1
Step 1: Run Regression
[Stata output: regression table]
Step 2: Test for heteroskedasticity
H0: Homoskedasticity
chi2(2) = 16.14
[Stata output: test decomposition table (Source, chi2, df, p)]
Since the p-value is significant at the 5% level, the null hypothesis is rejected, which means heteroskedasticity is present.
Step 3: Perform GLS Estimation and Check for Heteroskedasticity
[Stata output: GLS (weighted) regression table]
H0: Homoskedasticity
chi2(2) = 17.34
[Stata output: test decomposition table (Source, chi2, df, p)]
Since the p-value is significant at the 5% level, the null hypothesis is rejected: heteroskedasticity is still present under this weighting scheme (case (a)).
[Stata output: GLS (weighted) regression table, second weighting scheme]
H0: Homoskedasticity
chi2(1) = 2.79
[Stata output: test decomposition table (Source, chi2, df, p)]
Since the p-value is insignificant at the 5% level, the null hypothesis is not rejected, which means heteroskedasticity is not present.
Conclusion:
GLS case (a): the test remains significant, suggesting that this weighting scheme did not effectively address the heteroskedasticity. The second weighting scheme, by contrast, yields an insignificant test, indicating that it does remove the heteroskedasticity.
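A minimal sketch of this workflow in Stata, using a generic model of y on x1 and x2 and a weighting variable proportional to x1 (all names are illustrative, not the exercise's actual variables):

* Step 1: baseline OLS regression
. reg y x1 x2
* Step 2: heteroskedasticity tests after the regression
* Breusch-Pagan / Cook-Weisberg test
. estat hettest
* White's test (prints the Source / chi2 / df / p decomposition table)
. estat imtest, white
* Step 3: a simple weighted (GLS-style) re-estimation, assuming the error variance rises with x1
. reg y x1 x2 [aweight = 1/x1]

The residuals of the re-estimated model can then be tested again in the same way to see whether the chosen weights removed the heteroskedasticity.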
6.2
Result:
[Stata output: regression table]
[Scatter plot: net profit/sales (vertical axis, approx. -0.21 to 0.28) against number of employees (horizontal axis, 3 to 140)]
Result:
White's test
H0: Homoskedasticity
chi2(2) = 0.05
Since the p-value is insignificant at the 5% level, the null hypothesis is not rejected, which means heteroskedasticity is not present.
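A sketch of the corresponding commands, assuming hypothetical variable names `netprofit_sales` (net profit over sales) and `nempl` (number of employees):

* visual check: plot the ratio against firm size
. scatter netprofit_sales nempl
* formal check: regress and apply White's test
. reg netprofit_sales nempl
. estat imtest, white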
Exercise 6.3:
reg Y X
Result:
[Stata output: regression table]
Command: hettest
Result:
Since the p-value is insignificant at the 5% level, the null hypothesis is not rejected, which means heteroskedasticity is not present.
White Test
H0: Homoskedasticity
chi2(2) = 3.81
Since the p-value is insignificant at the 5% level, the null hypothesis is not rejected, which means heteroskedasticity is not present.
6.4
Result:
[Stata output: regression table; ANOVA header (Source, SS, df, MS), Number of obs = 706]
Step 2:
Since the p-value is insignificant at the 5% level, the null hypothesis is not rejected, which means heteroskedasticity is not present.
Command:
Result:
[Stata output: regression table, including the row:]
male |  -30234.55   27093.98   -1.12   0.265   -83429.22   22960.13
6.5
Result:
[Stata output: regression table]
Command: hettest
chi2(1) = 20.55
Since the p-value is significant at the 5% level, the null hypothesis is rejected, which means heteroskedasticity is present.
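Since heteroskedasticity is detected here, a common follow-up is to re-estimate with heteroskedasticity-robust standard errors. A generic sketch (variable names are illustrative, not the exercise's actual data):

. reg y x
. estat hettest
* re-estimate with robust (White) standard errors when the test rejects homoskedasticity
. reg y x, vce(robust)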
____________________________________________________________________________________
_____________________________________________________________________
Chapter: 07
7.1
Solution:
[Stata output: regression table]
gen time=_n
tsset time
dwstat
Durbin–Watson d-statistic( 3, 30) = .852153
Step: Correct for the autocorrelation
[Stata output: AR(1)-corrected regression table, including the rows:]
R | -.2957754 .078671 -3.76 0.001 -.4574859 -.1340649
-------------+----------------------------------------------------------------
rho | .6146382
------------------------------------------------------------------------------
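A sketch of how this correction is typically carried out in Stata; the dependent variable name `Y` and second regressor `X` are placeholders, and the Cochrane-Orcutt command is an assumption consistent with the reported rho:

* declare the data as a time series using a generated index
. gen time = _n
. tsset time
* original OLS regression and Durbin-Watson statistic
. reg Y X R
. dwstat
* re-estimate with an AR(1) error correction (Cochrane-Orcutt iteration)
. prais Y X R, corc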
7.2
reg Q P F R
[Stata output: regression table, including the row:]
P |  .3162285   .0876154   3.61   0.001   .1361326   .4963245
gen time=_n
tsset time
dwstat
The Durbin-Watson statistic is close to 2, so we check formally for remaining autocorrelation with the Breusch-Godfrey test.
bgodfrey, lags(1)
    lags(p)  |          chi2               df                 Prob > chi2
-------------+-------------------------------------------------------------
       1     |          0.044               1                   0.8331
---------------------------------------------------------------------------
                        H0: no serial correlation
The p-value (0.8331) is insignificant, so the null hypothesis of no serial correlation is not rejected.
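For reference, the same sequence in current Stata post-estimation syntax (same variables as above):

. gen time = _n
. tsset time
. reg Q P F R
. estat dwatson
. estat bgodfrey, lags(1)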
____________________________________________________________________________________
_______________________________________________________________
Chapter: 08
------------
8.1
Step 1
. summarize
[Stata output: summary statistics table]
Step 2
. reg wage iq
[Stata output: regression table]
step 3
. display iq*10
740
Step 4
. gen lniq = log(iq)
Step 5
. reg wage lniq
[Stata output: regression table]
. display lniq*10
43.040652
Comments
- Summary statistics: The summary shows the mean, standard deviation, minimum, and maximum values of the variables "iq" and "wage".
- Transformation: The variable "iq" is transformed into its natural logarithm, denoted as "lniq".
- Regression with Transformed Variable: The regression of "wage" on the natural logarithm
of "iq" also shows a significant relationship (p < 0.05). For every 10% increase in IQ, there's
a corresponding increase of approximately 154.53 in wage, holding other variables
constant.
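A sketch of the full sequence with the coefficient-based calculations written out explicitly; the `_b[...]` expressions are an assumption about what the `display` steps above were intended to compute:

. summarize wage iq
. reg wage iq
* effect of a 10-point increase in IQ, using the stored slope coefficient
. display _b[iq]*10
. gen lniq = log(iq)
. reg wage lniq
* approximate effect of a 10% increase in IQ in the log specification
. display _b[lniq]*0.10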
____________________________________________________________________________________
_________________________________________________________________
Chapter: 21
Panel Data
Question by myself
. xtset id time
Delta: 1 unit
. xtreg Y X E, fe
[Stata output: fixed-effects regression table]
sigma_u | .52193716
sigma_e | 2.6826443
------------------------------------------------------------------------------
F test that all u_i=0: F(7, 310) = 1.23 Prob > F = 0.2843
. xtreg Y X E, re
[Stata output: random-effects regression table]
sigma_u | 0
sigma_e | 2.6826443
------------------------------------------------------------------------------
. est store fe
. est store re
. hausman fe re
Note: the rank of the differenced variance matrix (0) does not equal the number of coefficients being tested (2); be sure this is what you expect, or there may be problems computing the test. Examine the output of your estimators for anything unexpected and possibly consider scaling your variables so that the coefficients are on a similar scale.
[Stata output: Hausman comparison table (columns: (b) fe, (B) re, (b-B) difference, S.E.), including the rows:]
X |  .4966464   .4966464   0   0
E |  1.940393   1.940393   0   0
chi2(0) = (b-B)'[(V_b-V_B)^(-1)](b-B)
        = 0.00
"X" and "E" on "Y", but the Hausman test doesn't provide conclusive evidence due to the
nature of the differenced variance matrix.
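One likely reason the two stored estimate sets are identical (and the differenced variance matrix has rank 0) is that both `est store` commands were issued after the random-effects regression, so the same results were stored under both names. A sketch of the usual sequence, with each store placed immediately after its own estimation (a correction sketch, not a record of what was actually run):

. xtset id time
* fixed-effects model, stored immediately
. xtreg Y X E, fe
. est store fe
* random-effects model, stored immediately
. xtreg Y X E, re
. est store re
* Hausman specification test comparing the two sets of estimates
. hausman fe re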