
Chapter 8
Multiple regression analysis: the problem of inference
Example: Gujarati 5e, Table 6-4

Dependent Variable: CM
Method: Least Squares
Sample: 1 64
Included observations: 64

Variable    Coefficient   Std. Error   t-Statistic   Prob.
C            263.6416     11.59318      22.74109     0.0000
PGNP        -0.005647     0.002003     -2.818703     0.0065
FLR         -2.231586     0.209947    -10.62927      0.0000

R-squared            0.707665    Mean dependent var      141.5000
Adjusted R-squared   0.698081    S.D. dependent var      75.97807
S.E. of regression   41.74780    Akaike info criterion   10.34691
Sum squared resid    106315.6    Schwarz criterion       10.44811
Log likelihood      -328.1012    F-statistic             73.83254
Durbin-Watson stat   2.186159    Prob(F-statistic)       0.000000
Hypothesis testing about individual partial regression coefficients

• If we assume that $u_i \sim N(0, \sigma^2)$, we can use the t test to test a hypothesis about any individual partial regression coefficient.

• $H_0: \beta_i = 0$ versus $H_1: \beta_i \neq 0$

• t-statistic: $t = \dfrac{\hat\beta_i - \beta_i}{se(\hat\beta_i)}$, which under $H_0$ reduces to $\hat\beta_i / se(\hat\beta_i)$

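A minimal sketch of this test in Python with statsmodels, on synthetic data (the variable layout merely mimics a two-regressor setup like Table 6-4; it is not Gujarati's dataset):

```python
import numpy as np
import statsmodels.api as sm

# Synthetic two-regressor data standing in for an example like Table 6-4.
rng = np.random.default_rng(0)
n = 64
X = rng.normal(size=(n, 2))
y = 250.0 - 0.5 * X[:, 0] - 2.2 * X[:, 1] + rng.normal(scale=40, size=n)

X = sm.add_constant(X)              # add the intercept column
fit = sm.OLS(y, X).fit()

# t = (beta_hat_i - 0) / se(beta_hat_i) under H0: beta_i = 0
i = 1                               # test the first slope coefficient
t_stat = fit.params[i] / fit.bse[i]
print(t_stat, fit.tvalues[i])       # the two values coincide
```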


Testing the overall significance of the sample regression

Given the k-variable regression model:
$Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + \dots + \beta_k X_{ki} + u_i$
To test the hypothesis
$H_0: \beta_2 = \beta_3 = \dots = \beta_k = 0$
$H_1$: not all slope coefficients are simultaneously zero
F statistic:
$F = \dfrac{ESS/df_{ESS}}{RSS/df_{RSS}} = \dfrac{ESS/(k-1)}{RSS/(n-k)}$
Decision rule: if $F > F_{\alpha}(k-1,\, n-k)$, reject $H_0$

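A sketch of the overall F computation in statsmodels (synthetic data; ESS and RSS are read directly off the fitted model):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n, k = 64, 3                        # k = number of parameters (incl. intercept)
X = sm.add_constant(rng.normal(size=(n, k - 1)))
y = X @ np.array([100.0, -0.5, 2.0]) + rng.normal(size=n)

fit = sm.OLS(y, X).fit()
F = (fit.ess / (k - 1)) / (fit.ssr / (n - k))   # ESS and RSS from the fit
print(F, fit.fvalue)                # matches statsmodels' reported F statistic
```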


An important relationship between R² and F

Since $ESS = R^2 \cdot TSS$ and $RSS = (1 - R^2) \cdot TSS$, the overall F statistic can be written in terms of $R^2$ alone:
$F = \dfrac{R^2/(k-1)}{(1-R^2)/(n-k)}$

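A quick numeric check of this identity, using the figures from the Table 6-4 output:

```python
# Overall F recovered from R^2 alone (R^2, n, k from Gujarati's Table 6-4).
R2, n, k = 0.707665, 64, 3
F = (R2 / (k - 1)) / ((1 - R2) / (n - k))
print(F)   # ~73.83, the F-statistic reported in the output
```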


The “incremental” contribution of an explanatory variable

• When to add a new variable (a code sketch follows below):
$F = \dfrac{(ESS_{new} - ESS_{old})/\text{number of new regressors}}{RSS_{new}/(n - \text{number of parameters in the new model})}$
or
$F = \dfrac{(R^2_{new} - R^2_{old})/\text{number of new regressors}}{(1 - R^2_{new})/(n - \text{number of parameters in the new model})}$
• The same logic applies to competing models involving the same dependent variable but different explanatory variables.

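A minimal sketch of the incremental F test (synthetic data; `old` has one regressor, `new` adds a second):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 64
x2, x3 = rng.normal(size=n), rng.normal(size=n)
y = 5.0 + 1.5 * x2 + 0.8 * x3 + rng.normal(size=n)

old = sm.OLS(y, sm.add_constant(x2)).fit()                         # without x3
new = sm.OLS(y, sm.add_constant(np.column_stack([x2, x3]))).fit()  # with x3

m, k_new = 1, 3                   # 1 new regressor; 3 parameters in new model
F = ((new.ess - old.ess) / m) / (new.ssr / (n - k_new))
# Equivalent R^2 form (valid because both models share the same Y):
F_r2 = ((new.rsquared - old.rsquared) / m) / ((1 - new.rsquared) / (n - k_new))
print(F, F_r2)
```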


Testing the equality of two regression coefficients

Model: $Y_i = \beta_1 + \beta_2 X_{2i} + \beta_3 X_{3i} + \beta_4 X_{4i} + u_i$
To test the hypothesis:
$H_0: \beta_3 = \beta_4$, i.e. $\beta_3 - \beta_4 = 0$
$H_1: \beta_3 \neq \beta_4$, i.e. $\beta_3 - \beta_4 \neq 0$
Test statistic:
$t = \dfrac{(\hat\beta_3 - \hat\beta_4) - (\beta_3 - \beta_4)}{se(\hat\beta_3 - \hat\beta_4)} = \dfrac{\hat\beta_3 - \hat\beta_4}{\sqrt{var(\hat\beta_3) + var(\hat\beta_4) - 2\,cov(\hat\beta_3, \hat\beta_4)}}$
Decision rule: if $|t| >$ the critical t value, reject $H_0$

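A sketch of this test using the estimated coefficient covariance matrix (synthetic data; with array input statsmodels names the regressors x1, x2, x3, so the slide's beta3 and beta4 correspond to x2 and x3 here):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 100
X = sm.add_constant(rng.normal(size=(n, 3)))         # const, x1, x2, x3
y = X @ np.array([1.0, 0.5, 2.0, 2.0]) + rng.normal(size=n)

fit = sm.OLS(y, X).fit()
b, V = fit.params, fit.cov_params()                  # V = var-cov matrix

se_diff = np.sqrt(V[2, 2] + V[3, 3] - 2 * V[2, 3])   # se(b3_hat - b4_hat)
t = (b[2] - b[3]) / se_diff
print(t)
print(fit.t_test("x2 = x3"))                         # same test in one call
```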


Restricted Least Squares: testing linear equality restrictions

• For instance, consider the Cobb-Douglas production function:
$Y_i = \beta_1 X_{2i}^{\beta_2} X_{3i}^{\beta_3} e^{u_i}$
• Written in log form:
$\ln Y_i = \beta_0 + \beta_2 \ln X_{2i} + \beta_3 \ln X_{3i} + u_i$, where $\beta_0 = \ln \beta_1$
• If there are constant returns to scale, economic theory suggests that $\beta_2 + \beta_3 = 1$



Restricted Least Squares: testing linear equality restrictions

• The t test approach
$H_0: \beta_2 + \beta_3 = 1$
$H_1: \beta_2 + \beta_3 \neq 1$

$t = \dfrac{(\hat\beta_2 + \hat\beta_3) - (\beta_2 + \beta_3)}{se(\hat\beta_2 + \hat\beta_3)} = \dfrac{(\hat\beta_2 + \hat\beta_3) - 1}{\sqrt{var(\hat\beta_2) + var(\hat\beta_3) + 2\,cov(\hat\beta_2, \hat\beta_3)}}$

Decision rule: if $|t| >$ the critical t value, reject $H_0$

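A sketch of the t test approach on synthetic log-form data (the one-line t_test call is statsmodels' built-in equivalent of the manual computation):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
n = 60
lnX2, lnX3 = rng.normal(size=n), rng.normal(size=n)
lnY = 0.3 + 0.6 * lnX2 + 0.4 * lnX3 + 0.05 * rng.normal(size=n)

fit = sm.OLS(lnY, sm.add_constant(np.column_stack([lnX2, lnX3]))).fit()
b, V = fit.params, fit.cov_params()

se_sum = np.sqrt(V[1, 1] + V[2, 2] + 2 * V[1, 2])    # se(b2_hat + b3_hat)
t = (b[1] + b[2] - 1.0) / se_sum
print(t)
print(fit.t_test("x1 + x2 = 1"))                     # same test in one call
```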


Restricted Least Squares: testing linear equality restrictions

• The F test approach

$F = \dfrac{(RSS_R - RSS_{UR})/m}{RSS_{UR}/(n-k)} = \dfrac{\left(\sum \hat u_R^2 - \sum \hat u_{UR}^2\right)/m}{\sum \hat u_{UR}^2/(n-k)}$

or, when both regressions share the same dependent variable,

$F = \dfrac{(R^2_{UR} - R^2_R)/m}{(1 - R^2_{UR})/(n-k)}$

m = number of linear restrictions
k = number of parameters in the unrestricted regression
n = number of observations

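A sketch of the F test approach: the restriction beta2 + beta3 = 1 is imposed by substitution, and the RSS form of the F statistic is used (the R² form would not apply here, since the restricted regression has a different dependent variable):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 60
lnX2, lnX3 = rng.normal(size=n), rng.normal(size=n)
lnY = 0.3 + 0.6 * lnX2 + 0.4 * lnX3 + 0.05 * rng.normal(size=n)

# Unrestricted: ln Y on ln X2 and ln X3.
ur = sm.OLS(lnY, sm.add_constant(np.column_stack([lnX2, lnX3]))).fit()
# Restricted: substitute b3 = 1 - b2, i.e. regress (lnY - lnX3) on (lnX2 - lnX3).
r = sm.OLS(lnY - lnX3, sm.add_constant(lnX2 - lnX3)).fit()

m, k = 1, 3                         # one restriction; 3 unrestricted parameters
F = ((r.ssr - ur.ssr) / m) / (ur.ssr / (n - k))
print(F)
print(ur.f_test("x1 + x2 = 1"))     # statsmodels' equivalent F test
```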


Comparing two regressions: testing for structural stability of regression models

• Gregory Chow (1960): the Chow test
• Assumptions:
1. The two error terms are normally distributed with the same variance:
$u_{1t} \sim N(0, \sigma^2)$ and $u_{2t} \sim N(0, \sigma^2)$
2. $u_{1t}$ and $u_{2t}$ are independently distributed.
• Regression models (e.g., for two subperiods):
$(1)\; Y_t = \lambda_1 + \lambda_2 X_t + u_{1t}, \quad t = 1, 2, \dots, n_1$
$(2)\; Y_t = \gamma_1 + \gamma_2 X_t + u_{2t}, \quad t = 1, 2, \dots, n_2$



Comparing two regressions: testing for structural stability of regression models

• The Chow test proceeds as follows:
1. Combining all n1 + n2 observations, estimate the pooled regression and obtain its RSS, S1, with df = (n1 + n2 − k).
2. Estimate the two models individually and obtain their RSS, S2 and S3, with df = (n1 − k) and (n2 − k).
3. Add these two RSS: S4 = S2 + S3, with df = (n1 + n2 − 2k).
4. Obtain S5 = S1 − S4.



Comparing two regressions: testing for structural stability of regression models

5. Given the assumptions of the Chow test, it can be shown that
$F = \dfrac{S_5/k}{S_4/(n_1 + n_2 - 2k)}$
follows the F distribution with df = (k, n1 + n2 − 2k).
If F > the critical F value, reject H0 that the two regressions are the same; that is, reject the hypothesis of structural stability.

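A sketch of these steps on synthetic two-regime data (n1 = n2 = 30, k = 2 parameters per regression):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(6)
n1, n2, k = 30, 30, 2
x1, x2 = rng.normal(size=n1), rng.normal(size=n2)
y1 = 1.0 + 0.5 * x1 + rng.normal(scale=0.5, size=n1)    # regime 1
y2 = 2.0 + 1.5 * x2 + rng.normal(scale=0.5, size=n2)    # regime 2

x, y = np.concatenate([x1, x2]), np.concatenate([y1, y2])

S1 = sm.OLS(y, sm.add_constant(x)).fit().ssr             # step 1: pooled RSS
S2 = sm.OLS(y1, sm.add_constant(x1)).fit().ssr           # step 2: regime-1 RSS
S3 = sm.OLS(y2, sm.add_constant(x2)).fit().ssr           # step 2: regime-2 RSS
S4 = S2 + S3                                             # step 3
S5 = S1 - S4                                             # step 4

F = (S5 / k) / (S4 / (n1 + n2 - 2 * k))                  # step 5
print(F)   # compare with the critical F(k, n1 + n2 - 2k) value
```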


Testing the functional form of regression

• MacKinnon, White, and Davidson (1983): the MWD test, for choosing between linear and log-linear regression models

• H0: linear model: $Y_t = \beta_1 + \beta_2 X_{2t} + \beta_3 X_{3t} + u_t$
• H1: log-linear model: $\ln Y_t = \beta_1 + \beta_2 \ln X_{2t} + \beta_3 \ln X_{3t} + u_t$



Testing the functional form of regression

The steps of the MWD test (a code sketch follows the list):
1. Estimate the linear model and obtain the fitted values of Y; call them Yf.
2. Estimate the log-linear model and obtain the fitted values of ln Y; call them ln f.
3. Obtain Z1 = ln(Yf) − ln f.
4. Regress Y on the X's and Z1. Reject H0 if the coefficient of Z1 is statistically significant.
5. Obtain Z2 = antilog(ln f) − Yf.
6. Regress ln Y on the ln X's and Z2. Reject H1 if the coefficient of Z2 is statistically significant.

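A sketch of the six MWD steps on synthetic positive-valued data (Y and the X's must be positive for the logs; note that np.log(Yf) assumes the linear model's fitted values stay positive, which holds here):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 80
X2, X3 = rng.uniform(1, 10, n), rng.uniform(1, 10, n)
Y = np.exp(0.5 + 0.4 * np.log(X2) + 0.3 * np.log(X3) + 0.1 * rng.normal(size=n))

lin_X = sm.add_constant(np.column_stack([X2, X3]))
log_X = sm.add_constant(np.column_stack([np.log(X2), np.log(X3)]))

Yf = sm.OLS(Y, lin_X).fit().fittedvalues            # step 1: fitted Y
lnf = sm.OLS(np.log(Y), log_X).fit().fittedvalues   # step 2: fitted ln Y

Z1 = np.log(Yf) - lnf                               # step 3
step4 = sm.OLS(Y, np.column_stack([lin_X, Z1])).fit()
print(step4.tvalues[-1])    # significant t => reject H0 (linear model)

Z2 = np.exp(lnf) - Yf                               # step 5: antilog(lnf) - Yf
step6 = sm.OLS(np.log(Y), np.column_stack([log_X, Z2])).fit()
print(step6.tvalues[-1])    # significant t => reject H1 (log-linear model)
```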
