Chapter 8: Multiple Regression Analysis
Example: Gujarati (5e), Table 6-4

Dependent Variable: CM
Method: Least Squares
Sample: 1 64
Included observations: 64

Variable              Coefficient   Std. Error   t-Statistic   Prob.
C                     263.6416      11.59318     22.74109      0.0000
PGNP                  -0.005647     0.002003     -2.818703     0.0065
FLR                   -2.231586     0.209947     -10.62927     0.0000

R-squared             0.707665      Mean dependent var      141.5000
Adjusted R-squared    0.698081      S.D. dependent var      75.97807
S.E. of regression    41.74780      Akaike info criterion   10.34691
Sum squared resid     106315.6      Schwarz criterion       10.44811
Log likelihood        -328.1012     F-statistic             73.83254
Durbin-Watson stat    2.186159      Prob(F-statistic)       0.000000
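The same regression can be reproduced with statsmodels. This is a minimal sketch, assuming the 64-country data are available as a CSV file; the file name table6_4.csv is an assumption about the data layout, while the column names CM, PGNP and FLR come from the output above.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical file: a CSV with columns CM (child mortality), PGNP
# (per-capita GNP) and FLR (female literacy rate) for the 64 countries.
data = pd.read_csv("table6_4.csv")

# OLS of CM on PGNP and FLR with an intercept, as in the output above
res = smf.ols("CM ~ PGNP + FLR", data=data).fit()
print(res.summary())  # coefficients, std. errors, t statistics, R-squared, ...
```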
Hypothesis testing about an individual partial regression coefficient

If we assume that $u_i \sim N(0, \sigma^2)$, we can use the t test to test a hypothesis about any individual partial regression coefficient:

$$ t = \frac{\hat{\beta}_i - \beta_i}{\operatorname{se}(\hat{\beta}_i)}, $$

which follows the t distribution with $n - k$ degrees of freedom.
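For example, using the PGNP estimates from the output above, the t statistic under $H_0: \beta_{PGNP} = 0$ is

$$ t = \frac{-0.005647 - 0}{0.002003} \approx -2.82, $$

matching the t-Statistic column. With $n - k = 64 - 3 = 61$ degrees of freedom, the 5% two-tailed critical value is roughly 2.0, so the PGNP coefficient is statistically significant.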
F test for the incremental contribution of new regressors (do the added regressors significantly raise $R^2$?):

$$ F = \frac{\left(R^2_{\text{new}} - R^2_{\text{old}}\right) / \text{number of new regressors}}{\left(1 - R^2_{\text{new}}\right) / \left(n - \text{number of parameters in the new model}\right)} $$
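A minimal sketch of this computation in Python, assuming the two $R^2$ values are already known; the numbers in the example call are purely illustrative, not taken from the regression above.

```python
from scipy import stats

def incremental_f(r2_new, r2_old, n, k_new, n_new_regressors):
    """F test for the incremental contribution of newly added regressors."""
    f = ((r2_new - r2_old) / n_new_regressors) / ((1 - r2_new) / (n - k_new))
    p = stats.f.sf(f, n_new_regressors, n - k_new)  # upper-tail p-value
    return f, p

# Illustrative values only: old model with 2 parameters, one regressor added
print(incremental_f(r2_new=0.71, r2_old=0.17, n=64, k_new=3, n_new_regressors=1))
```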
Note: such $R^2$ comparisons are valid only for competing models involving the same dependent variable but different explanatory variables.
Testing the equality of two regression coefficients, $H_0: \beta_3 = \beta_4$:

$$ t = \frac{(\hat{\beta}_3 - \hat{\beta}_4) - (\beta_3 - \beta_4)}{\operatorname{se}(\hat{\beta}_3 - \hat{\beta}_4)} = \frac{\hat{\beta}_3 - \hat{\beta}_4}{\sqrt{\operatorname{var}(\hat{\beta}_3) + \operatorname{var}(\hat{\beta}_4) - 2\operatorname{cov}(\hat{\beta}_3, \hat{\beta}_4)}} $$

since $\beta_3 - \beta_4 = 0$ under $H_0$.
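A minimal sketch of this test, assuming a hypothetical model with regressors x2, x3 and x4; all data and names below are illustrative, not from the example above.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical data: y regressed on a constant and x2, x3, x4
rng = np.random.default_rng(0)
n = 64
X = sm.add_constant(rng.normal(size=(n, 3)))   # columns: const, x2, x3, x4
y = X @ np.array([1.0, 0.5, 2.0, 2.0]) + rng.normal(size=n)

res = sm.OLS(y, X).fit()
b = res.params          # [const, b2, b3, b4]
V = res.cov_params()    # variance-covariance matrix of the estimates

# t statistic for H0: b3 = b4, built from the variances and the covariance
se_diff = np.sqrt(V[2, 2] + V[3, 3] - 2 * V[2, 3])
t = (b[2] - b[3]) / se_diff
print(t)
print(res.t_test([0, 0, 1, -1]))  # the same test done by statsmodels
```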
The same approach tests a linear restriction such as $H_0: \beta_2 + \beta_3 = 1$:

$$ t = \frac{(\hat{\beta}_2 + \hat{\beta}_3) - (\beta_2 + \beta_3)}{\operatorname{se}(\hat{\beta}_2 + \hat{\beta}_3)} = \frac{(\hat{\beta}_2 + \hat{\beta}_3) - 1}{\sqrt{\operatorname{var}(\hat{\beta}_2) + \operatorname{var}(\hat{\beta}_3) + 2\operatorname{cov}(\hat{\beta}_2, \hat{\beta}_3)}} $$

If $|t|$ exceeds the critical t value, reject $H_0$.
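The restriction can also be handed to statsmodels directly. A minimal, self-contained sketch with illustrative data (variable names and values are assumptions):

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical model with two slopes b2 and b3 (illustrative data only)
rng = np.random.default_rng(1)
n = 64
X = sm.add_constant(rng.normal(size=(n, 2)))   # columns: const, x2, x3
y = X @ np.array([1.0, 0.6, 0.4]) + rng.normal(scale=0.1, size=n)

res = sm.OLS(y, X).fit()

# Test the linear restriction b2 + b3 = 1; statsmodels builds the same
# t statistic from var(b2), var(b3) and cov(b2, b3) as in the formula above.
print(res.t_test(([0, 1, 1], [1])))
```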
Restricted least squares: F test of linear restrictions (restricted vs. unrestricted model):

$$ F = \frac{(\text{RSS}_R - \text{RSS}_{UR}) / m}{\text{RSS}_{UR} / (n - k)} = \frac{\left(\sum \hat{u}_R^2 - \sum \hat{u}_{UR}^2\right) / m}{\sum \hat{u}_{UR}^2 / (n - k)}, $$

which follows the F distribution with $(m, n - k)$ degrees of freedom.
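A minimal sketch with illustrative data, where the restricted model drops two regressors (so m = 2); statsmodels' compare_f_test gives the same statistic.

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

# Hypothetical data; the restricted model drops x3 and x4 (m = 2 restrictions)
rng = np.random.default_rng(2)
n = 64
x2, x3, x4 = rng.normal(size=(3, n))
y = 1.0 + 0.8 * x2 + 0.5 * x3 + 0.3 * x4 + rng.normal(size=n)

X_ur = sm.add_constant(np.column_stack([x2, x3, x4]))  # unrestricted model
X_r = sm.add_constant(x2)                              # restricted model
res_ur = sm.OLS(y, X_ur).fit()
res_r = sm.OLS(y, X_r).fit()

m, k = 2, X_ur.shape[1]
F = ((res_r.ssr - res_ur.ssr) / m) / (res_ur.ssr / (n - k))   # RSS form
p = stats.f.sf(F, m, n - k)                                   # upper-tail p-value
print(F, p)
print(res_ur.compare_f_test(res_r))  # (F, p-value, df_diff) from statsmodels
```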
Equivalently, in terms of $R^2$:

$$ F = \frac{\left(R^2_{UR} - R^2_R\right) / m}{\left(1 - R^2_{UR}\right) / (n - k)} $$

where
m = number of linear restrictions
k = number of parameters in the unrestricted regression
n = number of observations
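The two forms agree because $R^2 = 1 - \text{RSS}/\text{TSS}$, so $\text{RSS} = (1 - R^2)\,\text{TSS}$, and the total sum of squares is the same in both models when the dependent variable is the same:

$$ F = \frac{(\text{RSS}_R - \text{RSS}_{UR})/m}{\text{RSS}_{UR}/(n-k)} = \frac{\left[(1 - R_R^2) - (1 - R_{UR}^2)\right]\text{TSS}/m}{(1 - R_{UR}^2)\,\text{TSS}/(n-k)} = \frac{(R_{UR}^2 - R_R^2)/m}{(1 - R_{UR}^2)/(n-k)} $$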