LECTURE 12
Simple Linear Regression
Learning Objectives
In this lecture, you learn:
How to calculate and test a correlation coefficient
How to use regression analysis to predict the value
of a dependent variable
To compute and interpret regression coefficients
To compute and interpret the coefficient of
determination
To make inferences about the slope
To estimate mean values and predict individual values
Correlation coefficient
The formula used to compute the sample linear correlation
coefficient:
r = Σ(xi − x̄)(yi − ȳ) / √[ Σ(xi − x̄)² · Σ(yi − ȳ)² ]
Correlation coefficient
To simplify the notation, we define three terms called
sums of squares:
SSxx = Σ(xi − x̄)²    SSyy = Σ(yi − ȳ)²    SSxy = Σ(xi − x̄)(yi − ȳ)
The formula for the sample correlation coefficient can
then be re-written:
r = SSxy / √(SSxx · SSyy)
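As a concrete check, here is a minimal Python sketch of these sums of squares, using the house-price sample (square feet vs. price in $1000s) that appears later in this lecture:

```python
import math

# House-price sample used later in this lecture:
# x = square feet, y = price in $1000s
x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# The three sums of squares
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
ss_yy = sum((yi - y_bar) ** 2 for yi in y)
ss_xy = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y))

# Sample correlation coefficient: r = SSxy / sqrt(SSxx * SSyy)
r = ss_xy / math.sqrt(ss_xx * ss_yy)
print(round(r, 5))  # 0.76211, the "Multiple R" in the Excel output later
```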
Correlation coefficient and
scatter plots
Test for significant correlation
Step 1: State the Hypotheses
H0: ρ = 0 (population correlation coefficient equals zero)
H1: ρ ≠ 0
Step 2: Specify the Decision Rule
For degrees of freedom df = n − 2, look up the critical value tα/2.
Step 3: Calculate the Test Statistic
t = r √(n − 2) / √(1 − r²)
Step 4: Make the Decision: compare tcalc vs. tcritical
Reject H0 if tcalc > tα/2 or if tcalc < −tα/2
Equivalently, reject H0 if the p-value < α.
Critical value for
correlation coefficient
We can calculate the critical value for the correlation coefficient:
rcritical = ± tα/2 / √(tα/2² + n − 2)
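These steps can be sketched in Python. This is an illustration only: r and n come from the house-price example later in the lecture, and the critical value t8,.025 = 2.306 is taken from a t table rather than computed:

```python
import math

r = 0.76211     # sample correlation from the house-price data
n = 10
t_crit = 2.306  # two-tail critical value for df = n - 2 = 8, alpha = .05

# Step 3: test statistic  t = r * sqrt(n - 2) / sqrt(1 - r^2)
t_calc = r * math.sqrt(n - 2) / math.sqrt(1 - r ** 2)

# The same cutoff expressed on the r scale:
# r_critical = t_crit / sqrt(t_crit^2 + n - 2)
r_crit = t_crit / math.sqrt(t_crit ** 2 + n - 2)

# Step 4: decision
reject_h0 = abs(t_calc) > t_crit
print(round(t_calc, 3), round(r_crit, 3), reject_h0)  # 3.329 0.632 True
```

Since |r| = 0.762 exceeds the critical value 0.632 (equivalently, tcalc = 3.329 exceeds 2.306), the correlation is significant at α = .05.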
Significance vs. Importance
In large samples, small correlations may be
statistically significant, even if the scatter plot shows little
evidence of linearity. Thus, a significant correlation
(true, but not strong) may lack
practical importance.
Introduction to
Regression Analysis
Regression analysis is used to:
Predict the value of a dependent variable based on the value of independent
variable(s)
Explain the impact of changes in an independent variable on the dependent
variable
Dependent variable (y): the variable we wish to explain
(response variable)
Independent variable (x): the variable used to explain (predictor variable) the
dependent variable
We want to predict revenue based on marketing budget
Dependent: revenue
Independent: marketing budget
Simple Linear Regression Model
The relationship between x and y is described by a
linear function
Changes in y (revenue) are assumed to be caused by
changes in x (marketing budget)
Linear regression population equation model:
yi = β0 + β1xi + εi
where β0 and β1 are the population model
coefficients and εi is a random error term (other factors
affecting revenue).
Population Regression Model
yi = β0 + β1xi + εi
where yi = dependent variable, β0 = population y-intercept,
β1 = population slope coefficient, xi = independent variable,
and εi = random error term.
Linear component: β0 + β1xi    Random error component: εi
Population Regression Model
yi = β0 + β1xi + εi
[Figure: the population regression line plotted in the (x, y) plane,
showing the observed value yi at xi, the predicted value β0 + β1xi,
the random error εi for that xi, the slope β1, and the intercept β0]
Sample Regression Model
The sample regression equation provides an
estimate of the population regression line
ŷi = b0 + b1xi
where ŷi = estimated (predicted) y value for observation i,
b0 = estimate of the regression intercept, b1 = estimate of the
regression slope, and xi = value of x for observation i.
The individual random error terms ei have a mean of zero:
ei = (yi − ŷi) = yi − (b0 + b1xi)
Least Squares Estimators
b0 and b1 are obtained by finding the values
of b0 and b1 that minimize the sum of the
squared differences between y and ŷ:
min SSE = min Σei² = min Σ(yi − ŷi)² = min Σ[yi − (b0 + b1xi)]²
Differential calculus is used to obtain the
coefficient estimators b0 and b1 that minimize SSE
Least Squares Estimators
The slope coefficient estimator is
b1 = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)² = rxy · (sY / sX)
And the constant or y-intercept is
b0 = ȳ − b1x̄
The regression line always goes through the mean (x̄, ȳ)
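A minimal Python sketch of these estimators, applied to the house-price sample that appears later in the lecture:

```python
# House-price sample: x = square feet, y = price in $1000s
x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

n = len(x)
x_bar = sum(x) / n
y_bar = sum(y) / n

# Slope: b1 = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)

# Intercept: b0 = y_bar - b1 * x_bar
b0 = y_bar - b1 * x_bar

print(round(b0, 5), round(b1, 5))  # 98.24833 0.10977, matching Excel

# The fitted line always passes through (x_bar, y_bar)
assert abs((b0 + b1 * x_bar) - y_bar) < 1e-9
```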
Finding the Least Squares
Equation
The coefficients b0 and b1, and other
regression results, will be found using
computer software.
Hand calculations are tedious
Statistical routines are built into Excel
Other statistical analysis software can be used
Interpretation of the
Slope and the Intercept
b0 is the estimated average value of y
when the value of x is zero (if x = 0 is in
the range of observed x values)
b1 is the estimated change in the
average value of y as a result of a one-
unit change in x
Interpretation of the
Slope and the Intercept
DrugCost = 410 + 550 Dependents
Each extra dependent raises the mean annual prescription drug
cost by $550. An employee with zero dependents averages $410 in
prescription drug cost.
Rent = 150 + 1.05 SqFt
Each extra square foot adds $1.05 to monthly apartment rent. The
intercept is not meaningful because no apartment can have SqFt = 0.
Cause and effect
The relationship between the explanatory variable
and response variable is not necessarily causal: one cannot
conclude that the explanatory variable causes a
change in the response variable.
Consider the following regression equation:
Crime Rate = 0.125 + 0.031 Unemployment Rate
For each unit increase in the unemployment rate, we expect
an increase of .031 in the crime rate, but this does not
mean being out of work causes crime to increase.
Simple Linear Regression
Example
A real estate agent wishes to examine the
relationship between the selling price of a home
and its size (measured in square feet)
A random sample of 10 houses is selected
Dependent variable (Y) = house price in $1000s
Independent variable (X) = square feet
Sample Data for House Price
Model
House Price in $1000s (Y)    Square Feet (X)
245    1400
312    1600
279    1700
308    1875
199    1100
219    1550
405    2350
324    2450
319    1425
255    1700
Graphical Presentation
House price model: scatter plot
[Scatter plot: House Price ($1000s) on the vertical axis (0–450)
versus Square Feet on the horizontal axis (0–3000)]
Regression Using Excel
Tools / Data Analysis / Regression
Excel Output
Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

The regression equation is:
house price = 98.24833 + 0.10977 (square feet)

ANOVA
             df    SS          MS          F        Significance F
Regression   1     18934.9348  18934.9348  11.0848  0.01039
Residual     8     13665.5652  1708.1957
Total        9     32600.5000

             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580
Graphical Presentation
House price model: scatter plot and regression line
[Scatter plot of House Price ($1000s) versus Square Feet with the
fitted regression line: slope = 0.10977, intercept = 98.248]
house price = 98.24833 + 0.10977 (square feet)
Interpretation of the
Intercept, b0
house price = 98.24833 + 0.10977 (square feet)
b0 is the estimated average value of Y when the
value of X is zero (if X = 0 is in the range of
observed X values)
Here, no houses had 0 square feet, so b0 = 98.24833
just indicates that, for houses within the range of
sizes observed, $98,248.33 is the portion of the
house price not explained by square feet
Interpretation of the
Slope Coefficient, b1
house price = 98.24833 + 0.10977 (square feet)
b1 measures the estimated change in the
average value of Y as a result of a one-
unit change in X
Here, b1 = .10977 tells us that the average value of a
house increases by .10977($1000) = $109.77
for each additional square foot of size
Measures of Variation
Total variation is made up of two parts:
SST = SSR + SSE
Total Sum of Squares = Regression Sum of Squares + Error Sum of Squares
SST = Σ(yi − ȳ)²    SSR = Σ(ŷi − ȳ)²    SSE = Σ(yi − ŷi)²
where:
ȳ = Average value of the dependent variable
yi = Observed values of the dependent variable
ŷi = Predicted value of y for the given xi value
Measures of Variation
SST = total sum of squares
Measures the variation of the yi values around their
mean, y
SSR = regression sum of squares
Explained variation attributable to the linear
relationship between x and y
SSE = error sum of squares
Variation attributable to factors other than the linear
relationship between x and y
Measures of Variation
[Figure: decomposition of variation at a point (xi, yi): the error
deviation (yi − ŷi) from the observed point to the regression line,
the regression deviation (ŷi − ȳ) from the line to ȳ, and the total
deviation (yi − ȳ), whose squared sums give SSE, SSR, and SST]
Coefficient of Determination, R2
The coefficient of determination is the portion
of the total variation in the dependent variable
that is explained by variation in the
independent variable
The coefficient of determination is also called
R-squared and is denoted as R2
R² = SSR / SST = 1 − SSE / SST
note: 0 ≤ R² ≤ 1
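A minimal Python sketch of this decomposition on the house-price data, verifying both SST = SSR + SSE and the R² reported by Excel:

```python
# House-price sample: x = square feet, y = price in $1000s
x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n

# Least-squares fit
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * xi for xi in x]

# Sums of squares
sst = sum((yi - y_bar) ** 2 for yi in y)
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))

r2 = ssr / sst
print(round(sst, 1), round(r2, 5))  # 32600.5 0.58082

assert abs(sst - (ssr + sse)) < 1e-6  # SST = SSR + SSE
```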
Examples of Approximate r² Values
r² = 1
[Scatter plots in which every point lies exactly on the line]
Perfect linear relationship between X and Y:
100% of the variation in Y is explained by variation in X
Examples of Approximate r² Values
0 < r² < 1
[Scatter plots with points loosely clustered around the line]
Weaker linear relationships between X and Y:
Some but not all of the variation in Y is explained by variation in X
Examples of Approximate r² Values
r² = 0
[Scatter plot with no linear pattern]
No linear relationship between X and Y:
The value of Y does not depend on X. (None of the variation in Y is
explained by variation in X)
Excel Output
Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

R² = SSR / SST = 18934.9348 / 32600.5000 = 0.58082
58.08% of the variation in house prices is explained by
variation in square feet
Estimation of Model
Error Variance
An estimator for the variance of the population model
error is
σ̂² = s²e = Σe²i / (n − 2) = SSE / (n − 2)
Division by n − 2 is because the simple regression model uses two
estimated parameters, b0 and b1, instead of one
se = √s²e is called the standard error of the estimate
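A minimal Python sketch of this estimator on the house-price data:

```python
import math

# House-price sample: x = square feet, y = price in $1000s
x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar

# SSE from the residuals, then s_e = sqrt(SSE / (n - 2))
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
se = math.sqrt(sse / (n - 2))
print(round(se, 5))  # 41.33032, the "Standard Error" in the Excel output
```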
Excel Output
Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

se = 41.33032 (the "Standard Error" in the Regression Statistics)
Comparing Standard Errors
se is a measure of the variation of observed y
values from the regression line
[Two scatter plots: one with points tight around the line (small se),
one with points widely scattered around the line (large se)]
The magnitude of se should always be judged relative to the size
of the y values in the sample data
e.g., se = $41.33K is moderately small relative to house prices in
the $200K–$400K range
Inferences About the
Regression Model
The variance of the regression slope coefficient
(b1) is estimated by
s²b1 = s²e / Σ(xi − x̄)² = s²e / [(n − 1)s²x]
where:
sb1 = Estimate of the standard error of the least squares slope
se = √[SSE / (n − 2)] = Standard error of the estimate
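A minimal Python sketch of this standard error on the house-price data:

```python
import math

# House-price sample: x = square feet, y = price in $1000s
x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / ss_xx
b0 = y_bar - b1 * x_bar
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
se = math.sqrt(sse / (n - 2))

# Standard error of the slope: s_b1 = s_e / sqrt(sum((xi - x_bar)^2))
s_b1 = se / math.sqrt(ss_xx)
print(round(s_b1, 5))  # 0.03297, matching the Excel "Standard Error" for Square Feet
```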
Excel Output
Regression Statistics
Multiple R 0.76211
R Square 0.58082
Adjusted R Square 0.52842
Standard Error
Observations
41.33032
10
sb1 0.03297
ANOVA
df SS MS F Significance F
Regression 1 18934.9348 18934.9348 11.0848 0.01039
Residual 8 13665.5652 1708.1957
Total 9 32600.5000
Coefficients Standard Error t Stat P-value Lower 95% Upper 95%
Intercept 98.24833 58.03348 1.69296 0.12892 -35.57720 232.07386
Square Feet 0.10977 0.03297 3.32938 0.01039 0.03374 0.18580
Comparing Standard Errors of
the Slope
Sb1 is a measure of the variation in the slope of regression
lines from different possible samples
[Two illustrations: fitted lines from different samples varying
little (small sb1) versus varying greatly (large sb1)]
Inference about the Slope:
t Test
t test for a population slope:
Is there a linear relationship between X and Y?
Null and alternative hypotheses
H0: β1 = 0 (no linear relationship)
H1: β1 ≠ 0 (linear relationship does exist)
Test statistic
t = (b1 − β1) / sb1,    d.f. = n − 2
where:
b1 = regression slope coefficient
β1 = hypothesized slope
sb1 = standard error of the slope
Inference about the Slope:
t Test
Estimated Regression Equation:
house price = 98.25 + 0.1098 (sq.ft.)

House Price in $1000s (y)    Square Feet (x)
245    1400
312    1600
279    1700
308    1875
199    1100
219    1550
405    2350
324    2450
319    1425
255    1700

The slope of this model is 0.1098
Does square footage of the house affect its sales price?
Inferences about the Slope:
t Test Example
H0: β1 = 0
H1: β1 ≠ 0
From Excel output:
             Coefficients  Standard Error  t Stat   P-value
Intercept    98.24833      58.03348        1.69296  0.12892
Square Feet  0.10977       0.03297         3.32938  0.01039
t = (b1 − β1) / sb1 = (0.10977 − 0) / 0.03297 = 3.32938
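The same calculation in Python, built from the raw data rather than the Excel output (a sketch, reusing the estimators shown earlier):

```python
import math

# House-price sample: x = square feet, y = price in $1000s
x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / ss_xx
b0 = y_bar - b1 * x_bar
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
se = math.sqrt(sse / (n - 2))
s_b1 = se / math.sqrt(ss_xx)

# Test statistic with hypothesized slope beta1 = 0
t_calc = (b1 - 0) / s_b1
print(round(t_calc, 3))  # 3.329, matching the Excel t Stat of 3.32938
```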
Inferences about the Slope:
t Test Example
Test Statistic: t = 3.329
H0: β1 = 0
H1: β1 ≠ 0
d.f. = 10 − 2 = 8
t8,.025 = 2.3060
Two-tail test with α/2 = .025 in each tail:
Reject H0 if t > 2.3060 or t < −2.3060
Since tcalc = 3.329 > 2.3060:
Decision: Reject H0
Conclusion: There is sufficient evidence
that square footage affects house price
Inferences about the Slope:
t Test Example
P-value = 0.01039
H0: β1 = 0
H1: β1 ≠ 0
From Excel output:
             Coefficients  Standard Error  t Stat   P-value
Intercept    98.24833      58.03348        1.69296  0.12892
Square Feet  0.10977       0.03297         3.32938  0.01039
This is a two-tail test, so the p-value is
P(t > 3.329) + P(t < −3.329) = 0.01039 (for 8 d.f.)
Decision: P-value < α, so Reject H0
Conclusion: There is sufficient evidence
that square footage affects house price
Confidence Interval Estimate
for the Slope
Confidence Interval Estimate of the Slope:
b1 − tn−2,α/2 sb1 ≤ β1 ≤ b1 + tn−2,α/2 sb1,    d.f. = n − 2
Excel Printout for House Prices:
Coefficients Standard Error t Stat P-value Lower 95% Upper 95%
Intercept 98.24833 58.03348 1.69296 0.12892 -35.57720 232.07386
Square Feet 0.10977 0.03297 3.32938 0.01039 0.03374 0.18580
At 95% level of confidence, the confidence interval for
the slope is (0.0337, 0.1858)
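A minimal Python sketch of this interval, with t8,.025 = 2.306 taken from the t table:

```python
import math

# House-price sample: x = square feet, y = price in $1000s
x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / ss_xx
b0 = y_bar - b1 * x_bar
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
se = math.sqrt(sse / (n - 2))
s_b1 = se / math.sqrt(ss_xx)

# 95% confidence interval: b1 +/- t * s_b1
t_crit = 2.306  # df = 8, alpha/2 = .025 (from the t table)
lower = b1 - t_crit * s_b1
upper = b1 + t_crit * s_b1
print(round(lower, 4), round(upper, 4))  # 0.0337 0.1858
```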
Confidence Interval Estimate
for the Slope
Coefficients Standard Error t Stat P-value Lower 95% Upper 95%
Intercept 98.24833 58.03348 1.69296 0.12892 -35.57720 232.07386
Square Feet 0.10977 0.03297 3.32938 0.01039 0.03374 0.18580
Since the units of the house price variable are
$1000s, we are 95% confident that the average
impact on sales price is between $33.74 and
$185.80 per square foot of house size
This 95% confidence interval does not include 0.
Conclusion: There is a significant relationship between
house price and square feet at the .05 level of significance
Confidence intervals and hypothesis tests for
the slope and intercept
NOTE: The test for zero slope is the same as the test for zero
correlation. That is, the t test for zero slope will always yield exactly
the same tcalc as the t test for zero correlation.
F-Test for Significance
F Test statistic:
F = MSR / MSE
where
MSR = SSR / k
MSE = SSE / (n − k − 1)
F follows an F distribution with k numerator and (n − k − 1)
denominator degrees of freedom
(k = the number of independent variables in the regression model)
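A minimal Python sketch of the F statistic for the house-price example (simple regression, so k = 1):

```python
# House-price sample: x = square feet, y = price in $1000s
x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * xi for xi in x]
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, y_hat))

k = 1                    # one independent variable
msr = ssr / k            # mean square regression
mse = sse / (n - k - 1)  # mean square error
f = msr / mse
print(round(f, 4))  # 11.0848, matching the ANOVA table
```

In simple regression, F equals the square of the slope's t statistic (3.32938² ≈ 11.0848), so the F test and the t test for zero slope agree.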
ANOVA Table
Excel Output
Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

F = MSR / MSE = 18934.9348 / 1708.1957 = 11.0848
with 1 and 8 degrees of freedom; Significance F = 0.01039 is
the p-value for the F test

ANOVA
             df    SS          MS          F        Significance F
Regression   1     18934.9348  18934.9348  11.0848  0.01039
Residual     8     13665.5652  1708.1957
Total        9     32600.5000
F-Test for Significance
H0: β1 = 0 (equivalently H0: R² = 0)
H1: β1 ≠ 0 (equivalently H1: R² > 0)
α = .05, df1 = 1, df2 = 8
Critical Value: F.05 = FINV(0.05, 1, 8) = 5.32
Test Statistic: F = MSR / MSE = 11.08
Since Fcalc = 11.08 > 5.32:
Decision: Reject H0 at α = 0.05
Conclusion: There is sufficient evidence that
house size affects selling price
Prediction
The regression equation can be used to
predict a value for y, given a particular x
For a specified value xn+1, the predicted value is
ŷn+1 = b0 + b1xn+1
Predictions Using
Regression Analysis
Predict the price for a house
with 2000 square feet:
house price = 98.25 + 0.1098 (sq.ft.)
            = 98.25 + 0.1098 (2000)
            = 317.85
The predicted price for a house with 2000
square feet is 317.85 ($1,000s) = $317,850
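A minimal Python sketch of the same prediction. Note that with full-precision coefficients the prediction is about 317.78; the slide's 317.85 comes from using the rounded coefficients 98.25 and 0.1098:

```python
# House-price sample: x = square feet, y = price in $1000s
x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) \
     / sum((xi - x_bar) ** 2 for xi in x)
b0 = y_bar - b1 * x_bar

# Predicted value at x = 2000: y_hat = b0 + b1 * x_new
y_hat_2000 = b0 + b1 * 2000
print(round(y_hat_2000, 2))  # 317.78 (full precision; rounded
                             # coefficients give the slide's 317.85)
```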
Relevant Data Range
When using a regression model for prediction,
only predict within the relevant range of data
[Scatter plot of House Price ($1000s) versus Square Feet with the
relevant data range (roughly 1100–2450 square feet) marked]
Risky to try to extrapolate far beyond the range of observed X's
Confidence Interval for
the Average Y, Given X
Confidence interval estimate for the
expected value of y given a particular xn+1
Confidence interval for E(Yn+1 | Xn+1):
ŷn+1 ± tn−2,α/2 se √[ 1/n + (xn+1 − x̄)² / Σ(xi − x̄)² ]
Notice that the formula involves the term (xn+1 − x̄)²,
so the size of the interval varies according to the distance
xn+1 is from the mean x̄
Prediction Interval for
an Individual Y, Given X
Prediction interval estimate for an actual
observed value of y given a particular xn+1
Prediction interval for yn+1:
ŷn+1 ± tn−2,α/2 se √[ 1 + 1/n + (xn+1 − x̄)² / Σ(xi − x̄)² ]
The extra term (the 1 under the square root) adds to the interval
width to reflect the added uncertainty for an individual case
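Both interval widths can be sketched in Python for the house-price example at x = 2000 (with t8,.025 = 2.306 taken from the t table):

```python
import math

# House-price sample: x = square feet, y = price in $1000s
x = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
y = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]
n = len(x)
x_bar, y_bar = sum(x) / n, sum(y) / n
ss_xx = sum((xi - x_bar) ** 2 for xi in x)
b1 = sum((xi - x_bar) * (yi - y_bar) for xi, yi in zip(x, y)) / ss_xx
b0 = y_bar - b1 * x_bar
sse = sum((yi - (b0 + b1 * xi)) ** 2 for xi, yi in zip(x, y))
se = math.sqrt(sse / (n - 2))

x_new = 2000
t_crit = 2.306  # df = 8, alpha/2 = .025 (from the t table)

# Shared "leverage" term: 1/n + (x_new - x_bar)^2 / sum((xi - x_bar)^2)
leverage = 1 / n + (x_new - x_bar) ** 2 / ss_xx

margin_mean = t_crit * se * math.sqrt(leverage)      # CI for the mean of y
margin_ind = t_crit * se * math.sqrt(1 + leverage)   # PI for an individual y
print(round(margin_mean, 2), round(margin_ind, 2))   # 37.12 102.28
```

These margins match the ±37.12 and ±102.28 used in the worked examples that follow; the prediction interval is always the wider of the two.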
Estimating Mean Values and
Predicting Individual Values
Goal: Form intervals around ŷ to express
uncertainty about the value of y for a given xi
[Figure: the fitted line ŷ = b0 + b1xi, with the narrower confidence
interval for the expected value of y given xi, and the wider
prediction interval for a single observed y given xi]
Estimation of Mean Values:
Example
Confidence Interval Estimate for E(Yn+1 | Xn+1)
Find the 95% confidence interval for the mean price
of 2,000 square-foot houses
Predicted Price ŷn+1 = 317.85 ($1,000s)
ŷn+1 ± tn−2,α/2 se √[ 1/n + (xn+1 − x̄)² / Σ(xi − x̄)² ] = 317.85 ± 37.12
The confidence interval endpoints are 280.66 and 354.90,
or from $280,660 to $354,900
Estimation of Individual Values:
Example
Prediction Interval Estimate for yn+1
Find the 95% prediction interval for an individual
house with 2,000 square feet
Predicted Price ŷn+1 = 317.85 ($1,000s)
ŷn+1 ± tn−2,α/2 se √[ 1 + 1/n + (xn+1 − x̄)² / Σ(xi − x̄)² ] = 317.85 ± 102.28
The prediction interval endpoints are 215.50 and
420.07, or from $215,500 to $420,070
Finding Confidence and
Prediction Intervals in Excel
[Excel screenshot: input the x value, then obtain the confidence
interval estimate for E(Yn+1 | Xn+1) and the prediction interval
estimate for an individual yn+1]
Extrapolation outside the range of X
Predictions from a regression model are more reliable within
the range of sample x values.
The relationship seen in the scatter plot may not hold
for values far outside the observed x range.
Extrapolation outside the observed range of x is
tempting but should be approached with caution.
Summary
Introduced the linear regression model
Reviewed correlation and the assumptions of
linear regression
Discussed estimating the simple linear
regression coefficients
Described measures of variation
Described inference about the slope
Addressed estimation of mean values and
prediction of individual values