Simple Linear Regression (NBC)

Simple linear regression analyzes the relationship between a dependent variable (y) and a single independent variable (x). It finds the best-fitting straight line through the data points to represent the relationship. The line is defined by the slope (b1), which represents the change in y for a one-unit change in x, and the y-intercept (b0). Regression was used to analyze the relationship between house prices (y) and square footage (x). It found a linear relationship defined by the equation: price = 98.25 + 0.1098 * sqft. This model can be used to predict house prices within the range of square footages in the original data.


Simple Linear Regression

Slide-1
Correlation vs. Regression
▪ A scatter diagram can be used to show the relationship between two variables
▪ Correlation analysis is used to measure the strength of the association (linear relationship) between two variables
▪ Correlation is concerned only with the strength of the relationship
▪ No causal effect is implied by correlation

Slide-2
What Is Regression?

• Regression searches for relationships among variables.
• For example, you can observe several employees of some company and try to understand how their salaries depend on their features, such as experience, education level, role, city of employment, and so on.
• This is a regression problem, where the data related to each employee represent one observation.
• The presumption is that experience, education, role, and city are the independent features, while salary depends on them.
• Similarly, you can try to establish the mathematical dependence of housing prices on area, number of bedrooms, distance to the city center, and so on.

Slide-3
Introduction to Regression Analysis
▪ Regression analysis is used to:
▪ Predict the value of a dependent variable based on the value of at least one independent variable
▪ Explain the impact of changes in an independent variable on the dependent variable

Dependent variable: the variable we wish to predict or explain
Independent variable: the variable used to explain the dependent variable

Slide-4
Simple Linear Regression Model
▪ Only one independent variable, X
▪ The relationship between X and Y is described by a linear function
▪ Changes in Y are assumed to be caused by changes in X

Slide-5
Types of Relationships
[Scatter plots: linear relationships (points following a straight line, positive or negative slope) vs. curvilinear relationships (points following a curve)]
Slide-6
Types of Relationships
[Scatter plots: strong relationships (points tightly clustered around a line) vs. weak relationships (points widely scattered)]
Slide-7
Types of Relationships
[Scatter plot: no relationship between X and Y]
Slide-8
Slide-9
Simple Linear Regression Model
[Population model: Yi = β0 + β1 Xi + εi, where β0 is the population intercept, β1 is the population slope, and εi is the random error term]
Slide-10
Simple Linear Regression Model
(continued)
[The observed value Yi is the sum of the linear component β0 + β1 Xi and the random error component εi]
Slide-11
Simple Linear Regression Equation (Prediction Line)
[Estimated regression line: Ŷi = b0 + b1 Xi, where b0 estimates β0 and b1 estimates β1]
Slide-12
Least Squares Method
[b0 and b1 are the values that minimize the sum of the squared residuals: min Σ(Yi − Ŷi)² = min Σ(Yi − (b0 + b1 Xi))²]

Slide-13
Finding the Least Squares Equation
▪ The coefficients b0 and b1, and other regression results in this section, will be found using Excel or SPSS
▪ Formulas are shown in the text for those who are interested

Slide-14
Interpretation of the Slope and the Intercept
▪ b0 is the estimated average value of Y when the value of X is zero
▪ b1 is the estimated change in the average value of Y as a result of a one-unit change in X

Slide-15
Slide-1
The least squares line has two components: the slope m and the y-intercept b. We will solve for m first, and then solve for b. The equations for m and b are:

m = Σ(xi − x̄)(yi − ȳ) / Σ(xi − x̄)²
b = ȳ − m x̄
Simple Linear Regression Example
▪ A real estate agent wishes to examine the relationship between the selling price of a home and its size (measured in square feet)
▪ A random sample of 10 houses is selected
▪ Dependent variable (Y) = house price in $1000s
▪ Independent variable (X) = square feet

Slide-20
Sample Data for House Price Model

House Price in $1000s (Y)    Square Feet (X)
245                          1400
312                          1600
279                          1700
308                          1875
199                          1100
219                          1550
405                          2350
324                          2450
319                          1425
255                          1700
Slide-21
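The least squares formulas can be applied to this sample directly. The following is a minimal Python sketch (standard library only; variable names are illustrative, not from the slides) that computes the slope and intercept for the 10 houses:

```python
# Least squares fit for the house-price sample (price in $1000s, size in sq. ft.)
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

n = len(sqft)
x_bar = sum(sqft) / n        # mean square footage
y_bar = sum(price) / n       # mean price

# m = sum((xi - x_bar)(yi - y_bar)) / sum((xi - x_bar)^2)
s_xy = sum((x - x_bar) * (y - y_bar) for x, y in zip(sqft, price))
s_xx = sum((x - x_bar) ** 2 for x in sqft)

b1 = s_xy / s_xx             # slope
b0 = y_bar - b1 * x_bar      # intercept

print(round(b1, 5), round(b0, 5))  # 0.10977 98.24833
```

These values match the Excel output shown on the following slides.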
Graphical Presentation

▪ House price model: scatter plot
[Scatter plot: House Price ($1000s) on the y-axis (0–450) vs. Square Feet on the x-axis (0–3000)]
Slide-22
Regression Using Excel
▪ Tools / Data Analysis / Regression

Slide-23
Slide-24
Excel Output

Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

The regression equation is:
house price = 98.24833 + 0.10977 (square feet)

ANOVA
            df   SS          MS          F        Significance F
Regression  1    18934.9348  18934.9348  11.0848  0.01039
Residual    8    13665.5652  1708.1957
Total       9    32600.5000

             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580

Slide-25
Graphical Presentation

▪ House price model: scatter plot and regression line
[Scatter plot of House Price ($1000s) vs. Square Feet with the fitted line; intercept = 98.248, slope = 0.10977]

house price = 98.24833 + 0.10977 (square feet)
Slide-26
Interpretation of the Intercept, b0

house price = 98.24833 + 0.10977 (square feet)

▪ b0 is the estimated average value of Y when the value of X is zero (if X = 0 is in the range of observed X values)
▪ Here, no houses had 0 square feet, so b0 = 98.24833 just indicates that, for houses within the range of sizes observed, $98,248.33 is the portion of the house price not explained by square feet
Slide-27
Interpretation of the Slope Coefficient, b1

house price = 98.24833 + 0.10977 (square feet)

▪ b1 measures the estimated change in the average value of Y as a result of a one-unit change in X
▪ Here, b1 = 0.10977 tells us that the average value of a house increases by 0.10977($1000) = $109.77, on average, for each additional square foot of size
Slide-28
Predictions Using Regression Analysis
Predict the price for a house with 2000 square feet:

house price = 98.25 + 0.1098 (sq. ft.)
            = 98.25 + 0.1098(2000)
            = 317.85

The predicted price for a house with 2000 square feet is 317.85($1,000s) = $317,850
Slide-29
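The prediction above can be reproduced with the rounded coefficients from the slide. A minimal Python sketch (the function name is illustrative, not from the slides):

```python
def predict_price(sqft):
    """Predicted house price in $1000s, using the slide's rounded coefficients."""
    b0, b1 = 98.25, 0.1098   # intercept and slope of the fitted line
    return b0 + b1 * sqft

print(round(predict_price(2000), 2))  # 317.85, i.e. $317,850
```

Note that with the unrounded coefficients (98.24833 and 0.10977) the prediction is about 317.79, a small rounding difference.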
Interpolation vs. Extrapolation
▪ When using a regression model for prediction, only predict within the relevant range of data
[Scatter plot: the relevant range for interpolation covers the observed square footages; do not try to extrapolate beyond the range of observed X's]
Department of Statistics, ITS
Slide-30
Measures of Variation

▪ Total variation is made up of two parts:

SST = SSR + SSE
(Total Sum of Squares = Regression Sum of Squares + Error Sum of Squares)

SST = Σ(Yi − Ȳ)²
SSR = Σ(Ŷi − Ȳ)²
SSE = Σ(Yi − Ŷi)²

where:
Ȳ = average value of the dependent variable
Yi = observed values of the dependent variable
Ŷi = predicted value of Y for the given Xi value
Slide-31
Measures of Variation
(continued)

▪ SST = total sum of squares
▪ Measures the variation of the Yi values around their mean Ȳ
▪ SSR = regression sum of squares
▪ Explained variation attributable to the relationship between X and Y
▪ SSE = error sum of squares
▪ Variation attributable to factors other than the relationship between X and Y
Slide-32
Measures of Variation
(continued)
[Diagram: for a point (Xi, Yi), the distances SSE = Σ(Yi − Ŷi)², SST = Σ(Yi − Ȳ)², and SSR = Σ(Ŷi − Ȳ)² are shown as vertical gaps between the observed point, the regression line, and the mean Ȳ]
Slide-33
Coefficient of Determination, r²
▪ The coefficient of determination is the portion of the total variation in the dependent variable that is explained by variation in the independent variable
▪ The coefficient of determination is also called r-squared and is denoted as r²

r² = SSR / SST = regression sum of squares / total sum of squares

note: 0 ≤ r² ≤ 1
Slide-34
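The decomposition SST = SSR + SSE and the resulting r² can be verified numerically for the house-price sample. A minimal Python sketch (standard library only; variable names are illustrative):

```python
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

n = len(sqft)
x_bar, y_bar = sum(sqft) / n, sum(price) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(sqft, price)) \
     / sum((x - x_bar) ** 2 for x in sqft)
b0 = y_bar - b1 * x_bar
y_hat = [b0 + b1 * x for x in sqft]                      # predicted values

sst = sum((y - y_bar) ** 2 for y in price)               # total variation
ssr = sum((yh - y_bar) ** 2 for yh in y_hat)             # explained variation
sse = sum((y - yh) ** 2 for y, yh in zip(price, y_hat))  # unexplained variation

print(round(sst, 1), round(ssr / sst, 5))  # 32600.5 0.58082
```

The printed values match the SST and R Square entries in the Excel output on the next slide.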
Examples of Approximate r² Values

r² = 1
[Scatter plots: all points lie exactly on the regression line]
Perfect linear relationship between X and Y: 100% of the variation in Y is explained by variation in X

Slide-35
Examples of Approximate r² Values

0 < r² < 1
[Scatter plots: points loosely clustered around the regression line]
Weaker linear relationships between X and Y: some but not all of the variation in Y is explained by variation in X

Slide-36
Examples of Approximate r² Values

r² = 0
[Scatter plot: horizontal regression line; Y does not change with X]
No linear relationship between X and Y: the value of Y does not depend on X (none of the variation in Y is explained by variation in X)

Slide-37
Excel Output

Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

r² = SSR / SST = 18934.9348 / 32600.5000 = 0.58082

58.08% of the variation in house prices is explained by variation in square feet

ANOVA
            df   SS          MS          F        Significance F
Regression  1    18934.9348  18934.9348  11.0848  0.01039
Residual    8    13665.5652  1708.1957
Total       9    32600.5000

             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580

Slide-38
Extra

Slide-39
Standard Error of Estimate
▪ The standard deviation of the variation of observations around the regression line is estimated by

S_YX = √(SSE / (n − 2)) = √( Σᵢ₌₁ⁿ (Yi − Ŷi)² / (n − 2) )

where:
SSE = error sum of squares
n = sample size
Slide-40
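The standard error of the estimate follows directly from SSE. A minimal Python sketch, refitting the house sample with the standard library only:

```python
import math

sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

n = len(sqft)
x_bar, y_bar = sum(sqft) / n, sum(price) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(sqft, price)) \
     / sum((x - x_bar) ** 2 for x in sqft)
b0 = y_bar - b1 * x_bar

sse = sum((y - (b0 + b1 * x)) ** 2 for x, y in zip(sqft, price))
s_yx = math.sqrt(sse / (n - 2))  # n - 2: two parameters (b0, b1) were estimated

print(round(s_yx, 5))  # 41.33032
```

This matches the Standard Error entry in the Excel output below.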
Excel Output

Regression Statistics
Multiple R          0.76211
R Square            0.58082
Adjusted R Square   0.52842
Standard Error      41.33032
Observations        10

S_YX = 41.33032

ANOVA
            df   SS          MS          F        Significance F
Regression  1    18934.9348  18934.9348  11.0848  0.01039
Residual    8    13665.5652  1708.1957
Total       9    32600.5000

             Coefficients  Standard Error  t Stat   P-value  Lower 95%  Upper 95%
Intercept    98.24833      58.03348        1.69296  0.12892  -35.57720  232.07386
Square Feet  0.10977       0.03297         3.32938  0.01039  0.03374    0.18580
Slide-41
Comparing Standard Errors
S_YX is a measure of the variation of observed Y values from the regression line

[Scatter plots: small S_YX (points tight around the line) vs. large S_YX (points widely scattered)]

▪ The magnitude of S_YX should always be judged relative to the size of the Y values in the sample data
▪ e.g., S_YX = $41.33K is moderately small relative to house prices in the $200K–$300K range
Slide-42
Assumptions of Regression
Use the acronym LINE:
▪ Linearity
▪ The underlying relationship between X and Y is linear
▪ Independence of Errors
▪ Error values are statistically independent
▪ Normality of Error
▪ Error values (ε) are normally distributed for any given value of X
▪ Equal Variance (Homoscedasticity)
▪ The probability distribution of the errors has constant variance

Slide-43
Residual Analysis

ei = Yi − Ŷi

▪ The residual for observation i, ei, is the difference between its observed and predicted value
▪ Check the assumptions of regression by examining the residuals
▪ Examine for linearity assumption
▪ Evaluate independence assumption
▪ Evaluate normal distribution assumption
▪ Examine for constant variance for all levels of X (homoscedasticity)
▪ Graphical Analysis of Residuals

Slide-44
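Computing the residuals takes one line on top of the fitted model. A minimal Python sketch (standard library only); the printed value matches the first row of the Excel residual output shown later:

```python
sqft  = [1400, 1600, 1700, 1875, 1100, 1550, 2350, 2450, 1425, 1700]
price = [245, 312, 279, 308, 199, 219, 405, 324, 319, 255]

n = len(sqft)
x_bar, y_bar = sum(sqft) / n, sum(price) / n
b1 = sum((x - x_bar) * (y - y_bar) for x, y in zip(sqft, price)) \
     / sum((x - x_bar) ** 2 for x in sqft)
b0 = y_bar - b1 * x_bar

# e_i = Y_i - Yhat_i : plot these against X to check the LINE assumptions
residuals = [y - (b0 + b1 * x) for x, y in zip(sqft, price)]

print(round(residuals[0], 5))  # -6.92316 for the 1400 sq. ft. house
```

With an intercept in the model, the residuals always sum to zero, which is a useful sanity check.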


Residual Analysis for Linearity
[Residual plots: a curved pattern in the residuals indicates a non-linear relationship; residuals scattered randomly around zero indicate linearity ✔]
Slide-45
Residual Analysis for Independence
[Residual plots: a systematic pattern across observations indicates dependent errors; no pattern indicates independent errors ✔]
Slide-46
Residual Analysis for Normality

▪ A normal probability plot of the residuals can be used to check for normality:
[Normal probability plot: percent (0–100) vs. residual (−3 to 3); an approximately straight line indicates normally distributed errors]
Slide-47
Residual Analysis for Equal Variance
[Residual plots: a fan or funnel shape indicates non-constant variance; a uniform band around zero indicates constant variance ✔]

Slide-48
Excel Residual Output

RESIDUAL OUTPUT
Observation  Predicted House Price  Residuals
1            251.92316              -6.92316
2            273.87671              38.12329
3            284.85348              -5.85348
4            304.06284               3.93716
5            218.99284             -19.99284
6            268.38832             -49.38832
7            356.20251              48.79749
8            367.17929             -43.17929
9            254.66736              64.33264
10           284.85348             -29.85348

[House Price Model residual plot: residuals (about −60 to +80) vs. square feet (0–3000), with no apparent pattern]

Does not appear to violate any regression assumptions
Slide-49
Measuring Autocorrelation: The Durbin-Watson Statistic

▪ Used when data are collected over time to detect if autocorrelation is present
▪ Autocorrelation exists if residuals in one time period are related to residuals in another period
Slide-50
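The Durbin-Watson statistic is D = Σ(e_t − e_{t−1})² / Σ e_t², where values near 2 suggest no autocorrelation. The house data are cross-sectional, so the statistic is not meaningful for them, but as an illustration it can be computed over any residual sequence. A minimal Python sketch (standard library only; the function name is illustrative):

```python
def durbin_watson(residuals):
    """D = sum of squared successive differences / sum of squared residuals."""
    num = sum((e1 - e0) ** 2 for e0, e1 in zip(residuals, residuals[1:]))
    den = sum(e ** 2 for e in residuals)
    return num / den

# Residuals from the Excel residual output, in observation order
e = [-6.92316, 38.12329, -5.85348, 3.93716, -19.99284,
     -49.38832, 48.79749, -43.17929, 64.33264, -29.85348]

# D always falls in [0, 4]: values well below 2 suggest positive
# autocorrelation, values well above 2 suggest negative autocorrelation
print(round(durbin_watson(e), 2))  # 3.22
```

Since these observations have no time ordering, the value carries no inferential meaning here; the function is only a sketch of the computation.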
