
Econometrics

Chapter 3

Multiple Regression

Statistics for Business and Economics, 6e © 2007 Pearson Education, Inc.

Chapter Goals
After completing this chapter, you should be able to:
• Apply multiple regression analysis to business decision-making situations
• Analyze and interpret the computer output for a multiple regression model
• Perform a hypothesis test for all regression coefficients or for a subset of coefficients
• Fit and interpret nonlinear regression models
• Incorporate qualitative variables into the regression model by using dummy variables
• Discuss model specification and analyze residuals


The Multiple Regression Model

Idea: examine the linear relationship between one dependent variable (Y) and two or more independent variables (Xj).

Multiple regression model with k independent variables:

  Y = β0 + β1X1 + β2X2 + … + βkXk + ε

where β0 is the Y-intercept, β1, …, βk are the population slopes, and ε is the random error.


Multiple Regression Equation

The coefficients of the multiple regression model are estimated using sample data.

Multiple regression equation with k independent variables:

  ŷi = b0 + b1x1i + b2x2i + … + bkxki

where b0 is the estimated intercept, the bj are the estimated slope coefficients, and ŷi is the estimated (or predicted) value of y.

In this chapter we will always use a computer to obtain the regression slope coefficients and other regression summary measures.

Multiple Regression Equation (continued)

Two-variable model:

  ŷ = b0 + b1x1 + b2x2

[Figure: with two independent variables, the fitted equation defines a plane over the (x1, x2) plane.]

Multiple Regression

• Motivation for multiple regression:
  • Incorporate more explanatory factors into the model
  • Explicitly hold fixed other factors that would otherwise end up in the error term
  • Allow for more flexible functional forms

• Example: wage equation

  wage = β0 + β1(educ) + β2(exper) + ε

  where wage is the hourly wage, educ is years of education, exper is years of labor market experience, and ε collects all other factors. β1 now measures the effect of education explicitly holding experience fixed.

Multiple Regression

• Example: family income and family consumption

  consumption = β0 + β1(income) + β2(income²) + ε

  where ε collects other factors.

• The model has two explanatory variables: income and income squared
• Consumption is explained as a quadratic function of income
• One has to be very careful when interpreting the coefficients: by how much does consumption increase if income is increased by one unit? The answer depends on how much income is already there.

Standard Multiple Regression Assumptions

• As. MLR.1: The model is linear in the parameters (not necessarily in the variables).
• As. MLR.2: The data are a random sample drawn from the population.
• As. MLR.3: The values xi and the error terms εi are independent: cov(εi, xi) = 0.
• As. MLR.4: The error terms are random variables with mean 0 and a constant variance, σ²:

  E[εi] = 0 and E[εi²] = σ² for i = 1, …, n

  (The constant-variance property is called homoscedasticity.)

Standard Multiple Regression Assumptions (continued)

• As. MLR.5: The random error terms, εi, are not correlated with one another, so that

  cov(εi, εj) = E[εiεj] = 0 for all i ≠ j

• As. MLR.6: It is not possible to find a set of numbers c0, c1, …, ck, not all zero, such that

  c0 + c1x1i + c2x2i + … + ckxki = 0

  (This is the property of no perfect multicollinearity among the Xj's.)

Standard Multiple Regression Assumptions (continued)

• As. MLR.7: The regression model is correctly specified; that is, there is no specification bias or error in the model:
  • No omitted relevant variables
  • No irrelevant variables included
  • No specification error
• As. MLR.8: The number of observations n must be greater than the number of explanatory variables.
• As. MLR.9: There is variability in the X values.

Properties of the OLS Estimator

• The Gauss-Markov theorem: given the assumptions of the classical linear regression model, the least-squares estimators, in the class of unbiased linear estimators, have minimum variance; that is, they are BLUE.

• BLUE (best linear unbiased estimator):
  • The OLS estimator is a linear function of a random variable, namely the dependent variable Y.
  • The OLS estimator is unbiased: its average or expected value, E(bk), is equal to the true value, βk.
  • It has minimum variance.

Example: 2 Independent Variables

• A distributor of frozen dessert pies wants to evaluate factors thought to influence demand

• Dependent variable: pie sales (units per week)
• Independent variables: price (in $) and advertising (in $100s)

• Data are collected for 15 weeks

Pie Sales Example

  Week   Pie Sales   Price ($)   Advertising ($100s)
    1       350        5.50            3.3
    2       460        7.50            3.3
    3       350        8.00            3.0
    4       430        8.00            4.5
    5       350        6.80            3.0
    6       380        7.50            4.0
    7       430        4.50            3.0
    8       470        6.40            3.7
    9       450        7.00            3.5
   10       490        5.00            4.0
   11       340        7.20            3.5
   12       300        7.90            3.2
   13       440        5.90            4.0
   14       450        5.00            3.5
   15       300        7.00            2.7

Multiple regression equation: Sales = b0 + b1(Price) + b2(Advertising)

Estimating a Multiple Linear Regression Equation

• Excel will be used to generate the coefficients and measures of goodness of fit for multiple regression

• Excel: Tools / Data Analysis... / Regression
• PHStat: PHStat / Regression / Multiple Regression…

Multiple Regression Output

Regression Statistics
  Multiple R           0.72213
  R Square             0.52148
  Adjusted R Square    0.44172
  Standard Error      47.46341
  Observations        15

  Sales = 306.526 - 24.975(Price) + 74.131(Advertising)

ANOVA          df         SS          MS         F     Significance F
  Regression    2   29460.027   14730.013   6.53861          0.01201
  Residual     12   27033.306    2252.776
  Total        14   56493.333

               Coefficients   Standard Error    t Stat    P-value   Lower 95%   Upper 95%
  Intercept      306.52619        114.25389    2.68285    0.01993    57.58835   555.46404
  Price          -24.97509         10.83213   -2.30565    0.03979   -48.57626    -1.37392
  Advertising     74.13096         25.96732    2.85478    0.01449    17.55303   130.70888
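
As an illustration of how this output can be reproduced in code, here is a minimal Python sketch (assuming the pandas and statsmodels libraries are available; Excel and PHStat are what the chapter itself uses):

```python
# Sketch: fit the pie sales model by OLS and print an Excel-style summary.
import pandas as pd
import statsmodels.api as sm

data = pd.DataFrame({
    "sales":       [350, 460, 350, 430, 350, 380, 430, 470,
                    450, 490, 340, 300, 440, 450, 300],
    "price":       [5.50, 7.50, 8.00, 8.00, 6.80, 7.50, 4.50, 6.40,
                    7.00, 5.00, 7.20, 7.90, 5.90, 5.00, 7.00],
    "advertising": [3.3, 3.3, 3.0, 4.5, 3.0, 4.0, 3.0, 3.7,
                    3.5, 4.0, 3.5, 3.2, 4.0, 3.5, 2.7],
})

X = sm.add_constant(data[["price", "advertising"]])  # adds the intercept column
model = sm.OLS(data["sales"], X).fit()
print(model.summary())  # coefficients, standard errors, t stats, R², ANOVA F
```

Later sketches in this chapter reuse `data` and `model` from this block.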

The Multiple Regression Equation

  Sales = 306.526 - 24.975(Price) + 74.131(Advertising)

where
  Sales is in number of pies per week
  Price is in $
  Advertising is in $100s

b1 = -24.975: sales will decrease, on average, by 24.975 pies per week for each $1 increase in selling price, net of the effects of changes due to advertising.

b2 = 74.131: sales will increase, on average, by 74.131 pies per week for each $100 increase in advertising, net of the effects of changes due to price.

Coefficient of Determination, R²

• Reports the proportion of total variation in y explained by all x variables taken together:

  R² = SSR / SST = regression sum of squares / total sum of squares

• This is the ratio of the explained variability to total sample variability
• General remark on R²: even if R² is small, the regression may still provide good estimates of ceteris paribus effects

Coefficient of Determination, R² (continued)

From the regression output:

  R² = SSR / SST = 29460.0 / 56493.3 = 0.52148

52.1% of the variation in pie sales is explained by the variation in price and advertising.

Estimation of Error Variance

• Consider the population regression model

  Yi = β0 + β1x1i + β2x2i + … + βKxKi + εi

• The unbiased estimate of the variance of the errors is

  s_e² = Σ ei² / (n - K - 1) = SSE / (n - K - 1)

  where ei = yi - ŷi and the sum runs over i = 1, …, n

• The square root of the variance, s_e, is called the standard error of the estimate
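
Continuing the statsmodels sketch above, s_e can be computed directly from the residuals (a short sketch, assuming numpy is also available):

```python
import numpy as np

# Continuing from `model` and `data` above: s_e from first principles.
residuals = model.resid                      # e_i = y_i - yhat_i
n, K = len(data), 2                          # 15 observations, 2 regressors
s_e = np.sqrt((residuals ** 2).sum() / (n - K - 1))
print(s_e)                                   # approx. 47.463, matching the output
```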

Standard Error, s_e

From the regression output: s_e = 47.463

The magnitude of this value can be compared to the average y value.

Adjusted Coefficient of Determination

• R² never decreases when a new X variable is added to the model, even if the new variable is not an important predictor variable
  • This can be a disadvantage when comparing models
• What is the net effect of adding a new variable?
  • We lose a degree of freedom when a new X variable is added
  • Did the new X variable add enough explanatory power to offset the loss of one degree of freedom?

Adjusted Coefficient of Determination (continued)

• Used to correct for the fact that adding non-relevant independent variables will still reduce the error sum of squares:

  adjusted R² = 1 - [SSE / (n - K - 1)] / [SST / (n - 1)]

  (where n = sample size, K = number of independent variables)

• Adjusted R² provides a better comparison between multiple regression models with different numbers of independent variables
  • Penalizes excessive use of unimportant independent variables
  • Smaller than R²
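
A short sketch of this formula applied to the pie sales model (reusing `model`, `data`, `n`, and `K` from the sketches above):

```python
# Adjusted R-squared from its definition; statsmodels agrees.
sse = (model.resid ** 2).sum()
sst = ((data["sales"] - data["sales"].mean()) ** 2).sum()
r2_adj = 1 - (sse / (n - K - 1)) / (sst / (n - 1))
print(r2_adj, model.rsquared_adj)   # both approx. 0.44172
```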

Adjusted Coefficient of Determination (continued)

From the regression output: adjusted R² = 0.44172

44.2% of the variation in pie sales is explained by the variation in price and advertising, taking into account the sample size and number of independent variables.

Coefficient of Multiple Correlation

• The coefficient of multiple correlation is the correlation between the predicted value and the observed value of the dependent variable:

  R = r(ŷ, y) = √R²

• It is the square root of the coefficient of multiple determination
• Used as another measure of the strength of the linear relationship between the dependent variable and the independent variables
• Comparable to the correlation between Y and X in simple regression

Evaluating Individual Regression Coefficients

• Use t tests for individual coefficients
  • A t test shows whether a specific independent variable is conditionally important
• Hypotheses:
  • H0: βj = 0 (no linear relationship)
  • H1: βj ≠ 0 (a linear relationship does exist between xj and y)

Evaluating Individual Regression Coefficients (continued)

H0: βj = 0 (no linear relationship)
H1: βj ≠ 0 (a linear relationship does exist between xj and y)

Test statistic:

  t = (bj - 0) / s_bj   (df = n - K - 1)
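
A sketch of this statistic for the Price coefficient, continuing from the fitted `model` above:

```python
# t statistic for H0: beta_price = 0.
b_price = model.params["price"]
se_price = model.bse["price"]        # standard error of the slope, s_b
t_price = (b_price - 0) / se_price
print(t_price)                       # approx. -2.306 with df = 15 - 2 - 1 = 12
```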

Evaluating Individual Regression Coefficients (continued)

From the regression output:

  t value for Price: t = -2.306, with p-value .0398
  t value for Advertising: t = 2.855, with p-value .0145

Example: Evaluating Individual Regression Coefficients

H0: βj = 0
H1: βj ≠ 0

From the Excel output:

               Coefficients   Standard Error    t Stat    P-value
  Price          -24.97509         10.83213   -2.30565    0.03979
  Advertising     74.13096         25.96732    2.85478    0.01449

d.f. = 15 - 2 - 1 = 12; with α = .05, t12,.025 = 2.1788

The test statistic for each variable falls in the rejection region (|t| > 2.1788; both p-values < .05).

Decision: reject H0 for each variable.
Conclusion: there is evidence that both Price and Advertising affect pie sales at α = .05.

Confidence Interval Estimate for the Slope

Confidence interval limits for the population slope βj:

  bj ± t(n-K-1, α/2) · s_bj   where t has (n - K - 1) d.f.

               Coefficients   Standard Error
  Intercept      306.52619        114.25389
  Price          -24.97509         10.83213
  Advertising     74.13096         25.96732

Here, t has (15 - 2 - 1) = 12 d.f.

Example: form a 95% confidence interval for the effect of changes in price (x1) on pie sales:

  -24.975 ± (2.1788)(10.832)

So the interval is -48.576 < β1 < -1.374.
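
A sketch of this interval computed directly, reusing `b_price` and `se_price` from the t-test sketch (assuming scipy is available):

```python
from scipy import stats

# 95% CI for the Price slope, matching the hand computation above.
t_crit = stats.t.ppf(0.975, df=12)      # 2.1788
print(b_price - t_crit * se_price,      # approx. -48.576
      b_price + t_crit * se_price)      # approx. -1.374
# statsmodels reports the same endpoints via model.conf_int(alpha=0.05)
```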

Confidence Interval Estimate for the Slope (continued)

The Excel output also reports these interval endpoints:

               Coefficients   Standard Error   …   Lower 95%   Upper 95%
  Intercept      306.52619        114.25389    …    57.58835   555.46404
  Price          -24.97509         10.83213    …   -48.57626    -1.37392
  Advertising     74.13096         25.96732    …    17.55303   130.70888

Weekly sales are estimated to be reduced by between 1.37 and 48.58 pies for each $1 increase in the selling price.

Test on All Coefficients

• F-test for overall significance of the model
  • Shows whether there is a linear relationship between all of the X variables considered together and Y
  • Use the F test statistic
• Hypotheses:
  H0: β1 = β2 = … = βk = 0 (no linear relationship)
  H1: at least one βi ≠ 0 (at least one independent variable affects Y)

F-Test for Overall Significance

• Test statistic:

  F = MSR / s_e² = [SSR / K] / [SSE / (n - K - 1)]

  where F has K (numerator) and (n - K - 1) (denominator) degrees of freedom

• The decision rule is: reject H0 if F > F(K, n-K-1, α)
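
A sketch of this statistic from its definition, reusing `sse`, `sst`, `n`, `K`, and `stats` from the earlier sketches:

```python
# Overall F statistic and its p-value for the pie sales model.
ssr = sst - sse                         # regression sum of squares
F = (ssr / K) / (sse / (n - K - 1))     # MSR / MSE
p_value = stats.f.sf(F, K, n - K - 1)   # upper-tail probability
print(F, p_value)                       # approx. 6.5386 and 0.01201
```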

F-Test for Overall Significance (continued)

From the ANOVA table:

  F = MSR / MSE = 14730.0 / 2252.8 = 6.5386, with 2 and 12 degrees of freedom

The p-value for the F test is Significance F = 0.01201.

F-Test for Overall Significance (continued)

H0: β1 = β2 = 0
H1: β1 and β2 not both zero
α = .05; df1 = 2, df2 = 12

Critical value: F.05 = 3.885
Test statistic: F = MSR / MSE = 6.5386

Decision: since the F test statistic falls in the rejection region (6.5386 > 3.885; p-value < .05), reject H0.

Conclusion: there is evidence that at least one independent variable affects Y.

Tests on a Subset of Regression Coefficients

• Consider a multiple regression model involving variables xj and zj, and the null hypothesis that the z variable coefficients are all zero:

  yi = β0 + β1x1i + … + βKxKi + α1z1i + … + αrzri + εi

  H0: α1 = α2 = … = αr = 0
  H1: at least one αj ≠ 0 (j = 1, …, r)

Tests on a Subset of Regression Coefficients (continued)

• Goal: compare the error sum of squares for the complete model with the error sum of squares for the restricted model
  • First run a regression for the complete model and obtain SSE
  • Next run a restricted regression that excludes the z variables (the number of variables excluded is r) and obtain the restricted error sum of squares SSE(r)
  • Compute the F statistic and apply the decision rule for a significance level α:

  Reject H0 if F = [ (SSE(r) - SSE) / r ] / s_e² > F(r, n-K-r-1, α)
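
A sketch of this comparison in statsmodels, using a hypothetical extra regressor `z1` (the z variable below is generated purely for illustration and is not part of the pie sales example):

```python
# Complete model (with z1) vs. restricted model (price and advertising only).
data["z1"] = np.random.default_rng(0).normal(size=len(data))  # stand-in z variable
X_full = sm.add_constant(data[["price", "advertising", "z1"]])
full = sm.OLS(data["sales"], X_full).fit()
F_stat, p_val, df_diff = full.compare_f_test(model)   # model = restricted fit
print(F_stat, p_val)   # reject H0 (all z coefficients zero) if p_val < alpha
```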

Prediction

• Given a population regression model

  yi = β0 + β1x1i + β2x2i + … + βKxKi + εi   (i = 1, 2, …, n)

• then, given a new observation of a data point (x1,n+1, x2,n+1, …, xK,n+1), the best linear unbiased forecast of yn+1 is

  ŷn+1 = b0 + b1x1,n+1 + b2x2,n+1 + … + bKxK,n+1

• It is risky to forecast for new X values outside the range of the data used to estimate the model coefficients, because we have no data to support that the linear model extends beyond the observed range.

Using the Equation to Make Predictions

Predict sales for a week in which the selling price is $5.50 and advertising is $350:

  Sales = 306.526 - 24.975(Price) + 74.131(Advertising)
        = 306.526 - 24.975(5.50) + 74.131(3.5)
        = 428.62

Note that Advertising is in $100s, so $350 means that X2 = 3.5.

Predicted sales is 428.62 pies.
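
The same prediction via the fitted `model` from the earlier sketch:

```python
# Predict sales at price = $5.50 and advertising = $350 (i.e., 3.5 hundreds).
new_x = pd.DataFrame({"const": [1.0], "price": [5.50], "advertising": [3.5]})
print(model.predict(new_x))                              # approx. 428.62 pies
# Confidence and prediction intervals for this forecast:
print(model.get_prediction(new_x).summary_frame(alpha=0.05))
```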

Predictions in PHStat

• PHStat / Regression / Multiple Regression…

• Check the "Confidence and prediction interval estimates" box

Predictions in PHStat (continued)

The output sheet shows the input values, the predicted y value, a confidence interval for the mean y value given these x's, and a prediction interval for an individual y value given these x's.

Residuals in Multiple Regression

Two-variable model:

  ŷ = b0 + b1x1 + b2x2

For a sample observation (x1i, x2i, yi), the residual is the vertical distance between the observed value and the fitted plane:

  ei = yi - ŷi

[Figure: the observed point yi plotted above the regression plane, with the residual measured at (x1i, x2i).]

Nonlinear Regression Models

• The relationship between the dependent variable and an independent variable may not be linear
• Review the scatter diagram to check for nonlinear relationships
• Example: quadratic model

  Y = β0 + β1X1 + β2X1² + ε

  The second independent variable is the square of the first variable

Quadratic Regression Model

Model form:

  Yi = β0 + β1X1i + β2X1i² + εi

where:
  β0 = Y intercept
  β1 = regression coefficient for the linear effect of X on Y
  β2 = regression coefficient for the quadratic effect on Y
  εi = random error in Y for observation i

Linear vs. Nonlinear Fit

[Figure: two pairs of plots of Y vs. X and residuals vs. X. A linear fit to curved data does not give random residuals; a nonlinear fit gives random residuals.]

Quadratic Regression Model (continued)

  Yi = β0 + β1X1i + β2X1i² + εi

Quadratic models may be considered when the scatter diagram takes on one of four curved shapes, corresponding to the sign combinations of the coefficients:

  (β1 < 0, β2 > 0)   (β1 > 0, β2 > 0)   (β1 < 0, β2 < 0)   (β1 > 0, β2 < 0)

β1 is the coefficient of the linear term; β2 is the coefficient of the squared term.

Testing for Significance: Quadratic Effect

• Testing the quadratic effect
  • Compare the linear regression estimate

    ŷ = b0 + b1x1

  • with the quadratic regression estimate

    ŷ = b0 + b1x1 + b2x1²

• Hypotheses:
  • H0: β2 = 0 (the quadratic term does not improve the model)
  • H1: β2 ≠ 0 (the quadratic term improves the model)

Testing for Significance: Quadratic Effect (continued)

• Hypotheses:
  • H0: β2 = 0 (the quadratic term does not improve the model)
  • H1: β2 ≠ 0 (the quadratic term improves the model)

• The test statistic is

  t = (b2 - β2) / s_b2,   d.f. = n - 3

  where:
    b2 = squared-term slope coefficient
    β2 = hypothesized slope (zero)
    s_b2 = standard error of the slope

Testing for Significance: Quadratic Effect (continued)

• Testing the quadratic effect
  • Compare R² from the simple regression to R² from the quadratic model
  • If R² from the quadratic model is larger than R² from the simple model, then the quadratic model is a better model

Example: Quadratic Model

Purity increases as filter time increases:

  Filter Time   Purity
       1           3
       2           7
       3           8
       5          15
       7          22
       8          33
      10          40
      12          54
      13          67
      14          70
      15          78
      15          85
      16          87
      17          99

[Figure: scatter plot of Purity vs. Time, rising in a curved pattern from about 0 to 100 over times 0 to 20.]

Example: Quadratic Model (continued)

• Simple regression results: ŷ = -11.283 + 5.985(Time)

               Coefficients   Standard Error    t Stat     P-value
  Intercept     -11.28267         3.46805      -3.25332    0.00691
  Time            5.98520         0.30966      19.32819    2.078E-10

  Regression Statistics: R Square 0.96888, Adjusted R Square 0.96628, Standard Error 6.15997; F = 373.57904, Significance F = 2.0778E-10

The t statistic, F statistic, and R² are all high, but the residuals are not random:

[Figure: Time residual plot showing a systematic, curved pattern.]

Example: Quadratic Model (continued)

• Quadratic regression results: ŷ = 1.539 + 1.565(Time) + 0.245(Time)²

                 Coefficients   Standard Error    t Stat    P-value
  Intercept        1.53870          2.24465       0.68550   0.50722
  Time             1.56496          0.60179       2.60052   0.02467
  Time-squared     0.24516          0.03258       7.52406   1.165E-05

  Regression Statistics: R Square 0.99494, Adjusted R Square 0.99402, Standard Error 2.59513; F = 1080.7330, Significance F = 2.368E-13

The quadratic term is significant and improves the model: R² is higher and s_e is lower, and the residuals are now random.

[Figure: Time and Time-squared residual plots showing no systematic pattern.]
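
A sketch reproducing this quadratic fit (self-contained, assuming numpy, pandas, and statsmodels):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

time = np.array([1, 2, 3, 5, 7, 8, 10, 12, 13, 14, 15, 15, 16, 17])
purity = np.array([3, 7, 8, 15, 22, 33, 40, 54, 67, 70, 78, 85, 87, 99])

Xq = pd.DataFrame({"time": time, "time_sq": time ** 2})
quad = sm.OLS(purity, sm.add_constant(Xq)).fit()
print(quad.params)                  # approx. 1.539, 1.565, 0.245
print(quad.tvalues["time_sq"])      # t for the quadratic term, approx. 7.52
```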


The Log Transformation

The multiplicative model:

• Original multiplicative model:

  Y = β0 · X1^β1 · X2^β2 · ε

• Transformed multiplicative model:

  log(Y) = log(β0) + β1·log(X1) + β2·log(X2) + log(ε)

Interpretation of Coefficients

For the multiplicative model:

  log(Yi) = log(β0) + β1·log(X1i) + log(εi)

• When both the dependent and independent variables are logged:
  • The coefficient of the independent variable Xk can be interpreted as: a 1 percent change in Xk leads to an estimated bk percent change in the average value of Y
  • bk is the elasticity of Y with respect to a change in Xk
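
A sketch of a log-log fit, reusing the pie sales `data` purely to illustrate the elasticity interpretation (this particular model is not part of the chapter's example):

```python
# Regress log(sales) on log(price) and log(advertising).
log_y = np.log(data["sales"])
log_X = sm.add_constant(np.log(data[["price", "advertising"]]))
loglog = sm.OLS(log_y, log_X).fit()
print(loglog.params["price"])   # estimated price elasticity of sales
```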

Dummy Variables

• A dummy variable is a categorical independent variable with two levels:
  • yes or no, on or off, male or female
  • recorded as 0 or 1
• Regression intercepts are different if the variable is significant
• Assumes equal slopes for the other variables
• If there are more than two levels, the number of dummy variables needed is (number of levels - 1)

Dummy Variable Example

  ŷ = b0 + b1x1 + b2x2

Let:
  y = pie sales
  x1 = price
  x2 = holiday (x2 = 1 if a holiday occurred during the week; x2 = 0 if there was no holiday that week)

Dummy Variable Example (continued)

  Holiday:      ŷ = b0 + b1x1 + b2(1) = (b0 + b2) + b1x1
  No holiday:   ŷ = b0 + b1x1 + b2(0) = b0 + b1x1

The two lines have different intercepts (b0 + b2 vs. b0) but the same slope, b1.

[Figure: sales vs. price, two parallel lines with intercepts b0 + b2 and b0.]

If H0: β2 = 0 is rejected, then "Holiday" has a significant effect on pie sales.

Interpreting the Dummy Variable Coefficient

Example:  Sales = 300 - 30(Price) + 15(Holiday)

where:
  Sales = number of pies sold per week
  Price = pie price in $
  Holiday = 1 if a holiday occurred during the week, 0 if no holiday occurred

b2 = 15: on average, sales were 15 pies greater in weeks with a holiday than in weeks without a holiday, given the same price.
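
A sketch of such a regression, assuming a hypothetical 0/1 `holiday` column for the 15 sample weeks (the values below are illustrative, not from the chapter):

```python
# Dummy-variable regression: sales on price and a holiday indicator.
data["holiday"] = [0, 1, 0, 1, 0, 0, 0, 1, 0, 1, 0, 0, 1, 0, 0]  # illustrative
X_dummy = sm.add_constant(data[["price", "holiday"]])
dummy_model = sm.OLS(data["sales"], X_dummy).fit()
print(dummy_model.params["holiday"])  # estimated holiday lift, price held fixed
```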

Interaction Between Explanatory Variables

• Hypothesizes interaction between pairs of x variables
  • The response to one x variable may vary at different levels of another x variable
• Contains two-way cross-product terms:

  ŷ = b0 + b1x1 + b2x2 + b3x3
    = b0 + b1x1 + b2x2 + b3(x1x2)

Effect of Interaction

• Given:

  Y = β0 + β2X2 + (β1 + β3X2)X1
    = β0 + β1X1 + β2X2 + β3X1X2

• Without the interaction term, the effect of X1 on Y is measured by β1
• With the interaction term, the effect of X1 on Y is measured by β1 + β3X2
  • The effect changes as X2 changes

Interaction Example

Suppose x2 is a dummy variable and the estimated regression equation is

  ŷ = 1 + 2x1 + 3x2 + 4x1x2

  x2 = 1:  ŷ = 1 + 2x1 + 3(1) + 4x1(1) = 4 + 6x1
  x2 = 0:  ŷ = 1 + 2x1 + 3(0) + 4x1(0) = 1 + 2x1

[Figure: the two lines plotted for 0 ≤ x1 ≤ 1.5, diverging as x1 increases.]

Slopes are different if the effect of x1 on y depends on the value of x2.

Significance of Interaction Term

• The coefficient b3 is an estimate of the difference in the coefficient of x1 when x2 = 1 compared to when x2 = 0
• The t statistic for b3 can be used to test the hypothesis

  H0: β3 = 0 | β1 ≠ 0, β2 ≠ 0
  H1: β3 ≠ 0 | β1 ≠ 0, β2 ≠ 0

• If we reject the null hypothesis, we conclude that there is a difference in the slope coefficient for the two subgroups
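
A sketch of this test, extending the hypothetical holiday-dummy model above with a price × holiday cross-product term:

```python
# Interaction model: the Price slope is allowed to differ by holiday status.
data["price_x_holiday"] = data["price"] * data["holiday"]
X_int = sm.add_constant(data[["price", "holiday", "price_x_holiday"]])
inter_model = sm.OLS(data["sales"], X_int).fit()
print(inter_model.tvalues["price_x_holiday"])  # t statistic for H0: beta3 = 0
```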

Multiple Regression Assumptions

Errors (residuals) from the regression model:

  ei = yi - ŷi

Assumptions:
• The errors are normally distributed
• The errors have a constant variance
• The model errors are independent

Analysis of Residuals in Multiple Regression

• These residual plots are used in multiple regression:
  • Residuals vs. ŷi
  • Residuals vs. x1i
  • Residuals vs. x2i
  • Residuals vs. time (if time-series data)

Use the residual plots to check for violations of the regression assumptions.
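
A sketch of these plots for the pie sales model, reusing `model` and `data` (assuming matplotlib is available):

```python
import matplotlib.pyplot as plt

fig, axes = plt.subplots(1, 3, figsize=(12, 3))
for ax, (x, label) in zip(axes, [(model.fittedvalues, "fitted y"),
                                 (data["price"], "price"),
                                 (data["advertising"], "advertising")]):
    ax.scatter(x, model.resid)        # residuals vs. each quantity
    ax.axhline(0, linewidth=0.8)      # reference line at zero
    ax.set_xlabel(label)
    ax.set_ylabel("residual")
plt.tight_layout()
plt.show()
```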


Chapter Summary

• Developed the multiple regression model
• Tested the significance of the multiple regression model
• Discussed the adjusted coefficient of determination
• Tested individual regression coefficients
• Tested portions of the regression model
• Used quadratic terms and log transformations in regression models
• Used dummy variables
• Evaluated interaction effects
• Discussed using residual plots to check model assumptions
