
LINEAR REGRESSION AND CORRELATION

Correlation
A correlation is a relationship between two variables. The
data can be represented by the ordered pairs (x, y) where
x is the independent (or explanatory) variable, and y is
the dependent (or response) variable.
A scatter plot can be used to determine whether a linear (straight-line)
correlation exists between two variables.

Example:

x:  1   2   3   4   5
y: -4  -2  -1   0   2

[Scatter plot of the example data; the points rise roughly along a line.]
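As an illustration, the sketch below draws this scatter plot. It assumes matplotlib is available; the variable names are only illustrative.

```python
# Minimal sketch: plot the example data to look for a linear pattern
# (assumes matplotlib is installed).
import matplotlib.pyplot as plt

x = [1, 2, 3, 4, 5]
y = [-4, -2, -1, 0, 2]

plt.scatter(x, y)
plt.xlabel("x (independent variable)")
plt.ylabel("y (dependent variable)")
plt.show()
```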
Linear Correlation

[Four scatter plots: negative linear correlation (as x increases, y tends to
decrease), positive linear correlation (as x increases, y tends to increase),
no correlation, and nonlinear correlation.]
Correlation Coefficient
The correlation coefficient is a measure of the strength
and the direction of a linear relationship between two
variables. The symbol r represents the sample correlation
coefficient. The formula for r is
    r = \frac{n\sum xy - (\sum x)(\sum y)}{\sqrt{n\sum x^2 - (\sum x)^2}\,\sqrt{n\sum y^2 - (\sum y)^2}}
The range of the correlation coefficient is -1 to 1. If x and y have a strong
positive linear correlation, r is close to 1. If x and y have a strong negative
linear correlation, r is close to -1. If there is no linear correlation or only
a weak linear correlation, r is close to 0.
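The formula translates directly into code. The sketch below computes r from the five raw sums; the function name sample_correlation is illustrative, not part of the slides.

```python
import math

def sample_correlation(x, y):
    """Sample correlation coefficient r computed from the raw-sum formula above."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi ** 2 for xi in x)
    sum_y2 = sum(yi ** 2 for yi in y)
    numerator = n * sum_xy - sum_x * sum_y
    denominator = (math.sqrt(n * sum_x2 - sum_x ** 2)
                   * math.sqrt(n * sum_y2 - sum_y ** 2))
    return numerator / denominator
```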
Linear Correlation

[Four scatter plots illustrating the strength of r: strong negative correlation
(r = -0.91), strong positive correlation (r = 0.88), weak positive correlation
(r = 0.42), and a nonlinear relationship (r = 0.07).]
The correlation between X and Y may be:

- Perfect positive (r = 1)
- Positive (0 < r < 1)
- No linear correlation (r = 0)
- Negative (-1 < r < 0)
- Perfect negative (r = -1)
Calculating a Correlation Coefficient

Steps to compute the correlation coefficient:

1. Find the sum of the x-values: \sum x
2. Find the sum of the y-values: \sum y
3. Multiply each x-value by its corresponding y-value and find the sum: \sum xy
4. Square each x-value and find the sum: \sum x^2
5. Square each y-value and find the sum: \sum y^2
6. Use these five sums to calculate the correlation coefficient:

    r = \frac{n\sum xy - (\sum x)(\sum y)}{\sqrt{n\sum x^2 - (\sum x)^2}\,\sqrt{n\sum y^2 - (\sum y)^2}}
Correlation Coefficient

Example:
Calculate the correlation coefficient r for the following data.

 x    y    xy   x^2   y^2
 1   -3   -3     1     9
 2   -1   -2     4     1
 3    0    0     9     0
 4    1    4    16     1
 5    2   10    25     4

\sum x = 15, \sum y = -1, \sum xy = 9, \sum x^2 = 55, \sum y^2 = 15

    r = \frac{n\sum xy - (\sum x)(\sum y)}{\sqrt{n\sum x^2 - (\sum x)^2}\,\sqrt{n\sum y^2 - (\sum y)^2}}
      = \frac{5(9) - (15)(-1)}{\sqrt{5(55) - 15^2}\,\sqrt{5(15) - (-1)^2}}
      = \frac{60}{\sqrt{50}\,\sqrt{74}} \approx 0.986

There is a strong positive linear correlation between x and y.
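As a cross-check of this hand calculation, the standard library in Python 3.10+ provides statistics.correlation, which computes the same Pearson r; a quick sketch:

```python
# Cross-check of the worked example (requires Python 3.10+ for statistics.correlation).
import statistics

x = [1, 2, 3, 4, 5]
y = [-3, -1, 0, 1, 2]

print(round(statistics.correlation(x, y), 3))  # 0.986, matching the hand calculation
```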
Correlation Coefficient

Example:
The following data represent the number of hours 12 different students watched
television during the weekend and the score of each student on a test the
following Monday.

a.) Display the scatter plot.
b.) Calculate the correlation coefficient r.

Hours, x       0   1   2   3   3   5   5   5   6   7   7   10
Test score, y 96  85  82  74  95  68  76  84  58  65  75   50
Correlation Coefficient

Example continued:

[Scatter plot of the data: hours watching TV (x-axis, 0 to 10) against test
score (y-axis, 0 to 100); the points trend downward from left to right.]
Correlation Coefficient

Example continued:

Hours, x          0     1     2     3     3     5     5     5     6     7     7    10
Test score, y    96    85    82    74    95    68    76    84    58    65    75    50
xy                0    85   164   222   285   340   380   420   348   455   525   500
x^2               0     1     4     9     9    25    25    25    36    49    49   100
y^2            9216  7225  6724  5476  9025  4624  5776  7056  3364  4225  5625  2500

\sum x = 54, \sum y = 908, \sum xy = 3724, \sum x^2 = 332, \sum y^2 = 70836

    r = \frac{n\sum xy - (\sum x)(\sum y)}{\sqrt{n\sum x^2 - (\sum x)^2}\,\sqrt{n\sum y^2 - (\sum y)^2}}
      = \frac{12(3724) - (54)(908)}{\sqrt{12(332) - 54^2}\,\sqrt{12(70836) - 908^2}}
      \approx -0.831

There is a strong negative linear correlation. As the number of hours spent
watching TV increases, the test scores tend to decrease.
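The same cross-check works for the TV data; this is a sketch assuming Python 3.10+:

```python
# Cross-check of the hand calculation for the TV/test-score data (Python 3.10+).
import statistics

hours = [0, 1, 2, 3, 3, 5, 5, 5, 6, 7, 7, 10]
scores = [96, 85, 82, 74, 95, 68, 76, 84, 58, 65, 75, 50]

print(round(statistics.correlation(hours, scores), 3))  # -0.831
```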
Correlation and Causation
The fact that two variables are strongly correlated does not in
itself imply a cause-and-effect relationship between the
variables.
If there is a significant correlation between two
variables, you should consider the following possibilities.
1. Is there a direct cause-and-effect relationship between the variables?
Does x cause y?
2. Is there a reverse cause-and-effect relationship between the variables?
Does y cause x?
3. Is it possible that the relationship between the variables can be
caused by a third variable or by a combination of several other
variables?
4. Is it possible that the relationship between two variables may be a
coincidence?
Linear Regression
Residuals
After verifying that the linear correlation between two variables is
significant, the next step is to determine the equation of the line that can be
used to predict the value of y for a given value of x.
For a given x-value,

    d = (observed y-value) - (predicted y-value)

[Figure: a fitted line with observed data points; the vertical distances d1, d2,
d3 from each observed y-value to the corresponding predicted y-value on the
line are marked.]

For each data point, di is the difference between the observed y-value and the
predicted y-value for the given x-value on the line. These differences are
called residuals.
Regression Line
A regression line, also called a line of best fit, is the line
for which the sum of the squares of the residuals is a
minimum.
The Equation of a Regression Line
The equation of a regression line for an independent variable
x and a dependent variable y is
ŷ = mx + b
where ŷ is the predicted y-value for a given x-value. The slope
m and y-intercept b are given by
    m = \frac{n\sum xy - (\sum x)(\sum y)}{n\sum x^2 - (\sum x)^2}
    \quad\text{and}\quad
    b = \bar{y} - m\bar{x} = \frac{\sum y}{n} - m\,\frac{\sum x}{n}

where \bar{y} is the mean of the y-values and \bar{x} is the mean of the
x-values. The regression line always passes through the point (\bar{x}, \bar{y}).
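These two formulas can be coded directly. The sketch below (the function name regression_line is illustrative) returns the slope m and intercept b from the raw sums:

```python
def regression_line(x, y):
    """Least-squares slope m and intercept b from the raw-sum formulas above."""
    n = len(x)
    sum_x, sum_y = sum(x), sum(y)
    sum_xy = sum(xi * yi for xi, yi in zip(x, y))
    sum_x2 = sum(xi ** 2 for xi in x)
    m = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
    b = sum_y / n - m * (sum_x / n)   # b = y-bar - m * x-bar
    return m, b
```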
Regression Line

Example:
Find the equation of the regression line.

 x    y    xy   x^2   y^2
 1   -3   -3     1     9
 2   -1   -2     4     1
 3    0    0     9     0
 4    1    4    16     1
 5    2   10    25     4

\sum x = 15, \sum y = -1, \sum xy = 9, \sum x^2 = 55, \sum y^2 = 15

    m = \frac{n\sum xy - (\sum x)(\sum y)}{n\sum x^2 - (\sum x)^2}
      = \frac{5(9) - (15)(-1)}{5(55) - 15^2} = \frac{60}{50} = 1.2
Regression Line

Example continued:

    b = \bar{y} - m\bar{x} = \frac{-1}{5} - (1.2)\frac{15}{5} = -3.8

The equation of the regression line is ŷ = 1.2x - 3.8.

[Figure: the data points and the regression line, which passes through
(\bar{x}, \bar{y}) = (3, -1/5).]
Regression Line
Example:
The following data represent the number of hours 12 different students watched
television during the weekend and the score of each student on a test the
following Monday.
a.) Find the equation of the regression line.
b.) Use the equation to find the expected test score
for a student who watches 9 hours of TV.
Hours, x          0     1     2     3     3     5     5     5     6     7     7    10
Test score, y    96    85    82    74    95    68    76    84    58    65    75    50
xy                0    85   164   222   285   340   380   420   348   455   525   500
x^2               0     1     4     9     9    25    25    25    36    49    49   100
y^2            9216  7225  6724  5476  9025  4624  5776  7056  3364  4225  5625  2500

\sum x = 54, \sum y = 908, \sum xy = 3724, \sum x^2 = 332, \sum y^2 = 70836
Regression Line

Example continued:

    m = \frac{n\sum xy - (\sum x)(\sum y)}{n\sum x^2 - (\sum x)^2}
      = \frac{12(3724) - (54)(908)}{12(332) - 54^2} \approx -4.067

    b = \bar{y} - m\bar{x} = \frac{908}{12} - (-4.067)\frac{54}{12} \approx 93.97

The equation of the regression line is ŷ = -4.07x + 93.97.

[Figure: scatter plot of hours watching TV against test score with the
regression line, which passes through (\bar{x}, \bar{y}) = (54/12, 908/12)
≈ (4.5, 75.7).]
Regression Line

Example continued:

Using the equation ŷ = -4.07x + 93.97, we can predict the test score of a
student who watches 9 hours of TV.

    ŷ = -4.07(9) + 93.97 = 57.34

A student who watches 9 hours of TV over the weekend can expect to receive
about a 57.34 on Monday's test.
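The fit and the prediction can be cross-checked in code; a sketch assuming Python 3.10+:

```python
# Cross-check of the fitted line and the prediction at x = 9 (Python 3.10+).
import statistics

hours = [0, 1, 2, 3, 3, 5, 5, 5, 6, 7, 7, 10]
scores = [96, 85, 82, 74, 95, 68, 76, 84, 58, 65, 75, 50]

fit = statistics.linear_regression(hours, scores)
print(round(fit.slope, 2), round(fit.intercept, 2))  # -4.07 93.97
print(round(fit.slope * 9 + fit.intercept, 2))       # ~57.36 (57.34 with the rounded coefficients)
```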
Measures of Regression

Variation About a Regression Line

To find the total variation, you must first calculate the total deviation, the
explained deviation, and the unexplained deviation.

    Total deviation       = y_i - \bar{y}
    Explained deviation   = \hat{y}_i - \bar{y}
    Unexplained deviation = y_i - \hat{y}_i

[Figure: for a data point (x_i, y_i), the total deviation y_i - \bar{y} splits
into the explained deviation \hat{y}_i - \bar{y} (from the mean up to the
regression line) and the unexplained deviation y_i - \hat{y}_i (from the
regression line up to the observed point).]
Variation About a Regression Line

The total variation about a regression line is the sum of the squares of the
differences between the y-value of each ordered pair and the mean of y.

    Total variation = \sum (y_i - \bar{y})^2

The explained variation is the sum of the squares of the differences between
each predicted y-value and the mean of y.

    Explained variation = \sum (\hat{y}_i - \bar{y})^2

The unexplained variation is the sum of the squares of the differences between
the y-value of each ordered pair and each corresponding predicted y-value.

    Unexplained variation = \sum (y_i - \hat{y}_i)^2

    Total variation = Explained variation + Unexplained variation

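As a rough numerical check of this identity, the sketch below (variable names such as y_hat are illustrative) computes the three variations for the TV/test-score data and compares the totals:

```python
# Decompose the total variation of the TV/test-score data (Python 3.10+).
import statistics

hours = [0, 1, 2, 3, 3, 5, 5, 5, 6, 7, 7, 10]
scores = [96, 85, 82, 74, 95, 68, 76, 84, 58, 65, 75, 50]

fit = statistics.linear_regression(hours, scores)
y_bar = statistics.mean(scores)
y_hat = [fit.slope * xi + fit.intercept for xi in hours]

total = sum((yi - y_bar) ** 2 for yi in scores)
explained = sum((yh - y_bar) ** 2 for yh in y_hat)
unexplained = sum((yi - yh) ** 2 for yi, yh in zip(scores, y_hat))

print(round(total, 1), round(explained + unexplained, 1))  # the two totals agree
```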
Coefficient of Determination

The coefficient of determination r^2 is the ratio of the explained variation to
the total variation. That is,

    r^2 = \frac{\text{Explained variation}}{\text{Total variation}}

Example:
The correlation coefficient for the data relating the number of hours students
watched television to their test scores is r ≈ -0.831. Find the coefficient of
determination.

    r^2 = (-0.831)^2 ≈ 0.691

About 69.1% of the variation in the test scores can be explained by the
variation in the hours of TV watched. About 30.9% of the variation is
unexplained.
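The same figure can be obtained in code; a sketch assuming Python 3.10+:

```python
# r^2 as the share of explained variation for the TV/test-score data (Python 3.10+).
import statistics

hours = [0, 1, 2, 3, 3, 5, 5, 5, 6, 7, 7, 10]
scores = [96, 85, 82, 74, 95, 68, 76, 84, 58, 65, 75, 50]

r = statistics.correlation(hours, scores)
print(round(r ** 2, 3))  # ~0.691, i.e. about 69.1% of the variation is explained
```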
