12.1 Correlation and Simple Linear Regression

The document discusses correlation as a statistical technique to determine the relationship between two quantitative variables, emphasizing that correlation does not imply causation. It explains the use of scatter diagrams, correlation coefficients, and provides examples of calculating correlation, including Pearson's and Spearman's methods. Additionally, it introduces regression analysis as an extension of correlation to measure relationships and predict outcomes between variables.


Correlation

Finding the relationship between two quantitative variables without being able to infer causal relationships.

Correlation is a statistical technique used to determine the degree to which two variables are related.

12 March 2025
Scatter diagram
• Rectangular coordinate
• Two quantitative variables
• One variable is called independent (X) and the second is called dependent (Y)
• Points are not joined
• No frequency table

[Figure: points scattered on X–Y axes]
Example

Wt (kg):    67  69  85  83  74  81  97  92  114  85
SBP (mmHg): 120 125 140 160 130 180 150 140 200  130
[Figure: scatter diagram of weight and systolic blood pressure — SBP (mmHg) against weight (kg)]
Scatter plots

The pattern of data is indicative of the type of relationship between your two variables:
 positive relationship
 negative relationship
 no relationship
Positive relationship

[Figure: scatter of height in cm against age in weeks, showing an upward trend]


Negative relationship

[Figure: scatter of reliability against age of car, showing a downward trend]
No relation

[Figure: scatter with no apparent pattern]
Correlation Coefficient

A statistic showing the degree of relation between two variables.
Simple correlation coefficient (r)

 It is also called Pearson's correlation or product moment correlation coefficient.
 It measures the nature and strength of the relationship between two quantitative variables.
The sign of r denotes the nature of association, while the value of r denotes the strength of association.
 If the sign is +ve, the relation is direct (an increase in one variable is associated with an increase in the other variable, and a decrease in one variable is associated with a decrease in the other variable).

 If the sign is -ve, the relation is inverse or indirect (an increase in one variable is associated with a decrease in the other).
How to compute the simple correlation coefficient (r)

r = [∑xy − (∑x ∑y)/n] / √{[∑x² − (∑x)²/n] × [∑y² − (∑y)²/n]}
Example:

A sample of 6 children was selected; data about their age in years and weight in kilograms were recorded as shown in the following table. It is required to find the correlation between age and weight.

Serial No   Age (years)   Weight (kg)
1           7             12
2           6             8
3           8             12
4           5             10
5           6             11
6           9             13
These two variables are of the quantitative type: one variable (age) is called the independent variable and denoted (X), and the other (weight) is called the dependent variable and denoted (Y). To find the relation between age and weight, compute the simple correlation coefficient using the following formula:

r = [∑xy − (∑x ∑y)/n] / √{[∑x² − (∑x)²/n] × [∑y² − (∑y)²/n]}
Serial No   Age (x)   Weight (y)   xy    x²    y²
1           7         12           84    49    144
2           6         8            48    36    64
3           8         12           96    64    144
4           5         10           50    25    100
5           6         11           66    36    121
6           9         13           117   81    169
Total       ∑x = 41   ∑y = 66      ∑xy = 461   ∑x² = 291   ∑y² = 742
41 66
461 
r 6
 (41) 2   (66) 2 
 291   . 742  
 6  6 

r = 0.759
strong direct correlation

12 March 2025 19
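As a cross-check on the hand computation above, here is a minimal Python sketch of the same formula applied to the age/weight data (the function and variable names are my own, not from the slides):

```python
# Pearson r via the computational formula from the slide:
# r = [Sxy - Sx*Sy/n] / sqrt{[Sx2 - Sx^2/n] * [Sy2 - Sy^2/n]}
def pearson_r(x, y):
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))   # sum of products xy
    sx2 = sum(xi * xi for xi in x)               # sum of x squared
    sy2 = sum(yi * yi for yi in y)               # sum of y squared
    num = sxy - sx * sy / n
    den = ((sx2 - sx ** 2 / n) * (sy2 - sy ** 2 / n)) ** 0.5
    return num / den

age = [7, 6, 8, 5, 6, 9]          # x
weight = [12, 8, 12, 10, 11, 13]  # y
print(round(pearson_r(age, weight), 3))  # 0.76
```

Statistical libraries (for example, `scipy.stats.pearsonr`) compute the same quantity.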
EXAMPLE: Relationship between Anxiety and Test Scores

Anxiety (X)   Test score (Y)   X²    Y²    XY
10            2                100   4     20
8             3                64    9     24
2             9                4     81    18
1             7                1     49    7
5             6                25    36    30
6             5                36    25    30
∑X = 32       ∑Y = 32          ∑X² = 230   ∑Y² = 204   ∑XY = 129
Calculating the correlation coefficient

r = [(6)(129) − (32)(32)] / √{[6(230) − (32)²] × [6(204) − (32)²]}
  = (774 − 1024) / √[(356)(200)]

r = −0.94

Indirect strong correlation.
Spearman Rank Correlation Coefficient (rs)

It is a non-parametric measure of correlation. This procedure makes use of the two sets of ranks that may be assigned to the sample values of X and Y.

The Spearman rank correlation coefficient can be computed in the following cases:
 Both variables are quantitative.
 Both variables are qualitative ordinal.
 One variable is quantitative and the other is qualitative ordinal.
Procedure:

1. Rank the values of X from 1 to n, where n is the number of pairs of values of X and Y in the sample.
2. Rank the values of Y from 1 to n.
3. Compute the value of di for each pair of observations by subtracting the rank of Yi from the rank of Xi.
4. Square each di and compute ∑di², the sum of the squared values.
5. Apply the following formula:

   rs = 1 − [6 ∑(di)²] / [n(n² − 1)]

The value of rs denotes the magnitude and nature of association, giving the same interpretation as the simple r.
Example

In a study of the relationship between level of injury and income, the following data were obtained. Find the relationship between them and comment.

Sample   Level of injury (X)   Income (Y)
A        Moderate              25
B        Mild                  10
C        Fatal                 8
D        Severe                10
E        Severe                15
F        Normal                50
G        Fatal                 60
Answer:

Sample   X          Y    Rank (X)   Rank (Y)   di     di²
A        Moderate   25   5          3          2      4
B        Mild       10   6          5.5        0.5    0.25
C        Fatal      8    1.5        7          −5.5   30.25
D        Severe     10   3.5        5.5        −2     4
E        Severe     15   3.5        4          −0.5   0.25
F        Normal     50   7          2          5      25
G        Fatal      60   1.5        1          0.5    0.25

∑di² = 64
6 64
rs 1   0.1
7(48)

Comment:
There is an indirect weak correlation
between level of injury and income.

12 March 2025 27
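The ranking procedure and formula can be sketched in Python. Two assumptions of mine here: the injury levels are coded numerically (normal = 1 up to fatal = 5) so an ordinal variable can be ranked, and both variables are ranked in ascending order — rs is unchanged by that choice, since reversing both rankings leaves each di² the same, so the result matches the slides' descending ranks:

```python
# Average ranks with ties (step 1-2 of the procedure), then
# rs = 1 - 6*sum(di^2) / (n*(n^2 - 1))  (steps 3-5).
def average_ranks(values):
    """Rank values from 1..n, giving tied values the mean of their ranks."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    ranks = [0.0] * n
    i = 0
    while i < n:
        j = i
        # Extend j over the run of values tied with position i.
        while j + 1 < n and values[order[j + 1]] == values[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1  # mean of ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman_rs(rank_x, rank_y):
    n = len(rank_x)
    d2 = sum((rx - ry) ** 2 for rx, ry in zip(rank_x, rank_y))
    return 1 - 6 * d2 / (n * (n * n - 1))

severity = [3, 2, 5, 4, 4, 1, 5]      # samples A..G, assumed coding normal=1..fatal=5
income = [25, 10, 8, 10, 15, 50, 60]  # samples A..G
rs = spearman_rs(average_ranks(severity), average_ranks(income))
print(round(rs, 3))  # -0.143
```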
Exercise
What is regression analysis?
• An extension of correlation
• A way of measuring the relationship between two or more variables.
• Used to calculate the extent to which one variable (the DV) changes when the other variable(s) (the IV(s)) change.
• Used to help understand possible causal effects of one variable on another.
What is linear regression (LR)?
• Involves:
  – one predictor (IV) and
  – one outcome (DV)
• Explains a relationship using a straight line fit to the data.
Least squares criterion

Least-Squares Regression

The most common method for fitting a regression line is the method of least squares.
 This method calculates the best-fitting line for the observed data by minimizing the sum of the squares of the vertical deviations from each data point to the line (if a point lies on the fitted line exactly, then its vertical deviation is 0).
 Because the deviations are first squared, then summed, there are no cancellations between positive and negative values.
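The criterion can be demonstrated directly: the least-squares slope and intercept give a sum of squared vertical deviations no larger than any nearby line's. This is an illustrative sketch only; the toy data are assumed for demonstration and are not from the slides:

```python
# Sum of squared vertical deviations from the line y = a + b*x.
def sum_sq_errors(a, b, xs, ys):
    return sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys))

xs = [1.0, 2.0, 3.0, 4.0, 5.0]  # toy data (assumed)
ys = [2.1, 3.9, 6.2, 8.1, 9.8]

# Closed-form least-squares fit.
n = len(xs)
xbar, ybar = sum(xs) / n, sum(ys) / n
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
a = ybar - b * xbar

best = sum_sq_errors(a, b, xs, ys)
# Perturbing the slope or intercept never reduces the sum of squares.
assert all(sum_sq_errors(a + da, b + db, xs, ys) >= best
           for da in (-0.5, 0.0, 0.5) for db in (-0.5, 0.0, 0.5))
```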
Linear Regression - Model

[Figure: fitted line Ŷ = b0 + b1X, with an observed point Yi (the actual value) and its residual ei, the vertical distance from the point to the line]


Linear Regression - Model

Population:  Yi = α + βXi + εi

Sample:      Yi = b0 + b1Xi + ei

Prediction:  Ŷi = b0 + b1Xi

(b0 and b1 are the sample regression coefficients estimating α and β.)
Simple Linear Regression Model

• The population simple linear regression model:

  y = α + βx + ε,   or   μy|x = α + βx

  (α + βx is the nonrandom or systematic component; ε is the random component.)

• Where:
• y is the dependent (response) variable, the variable we wish to explain or predict;
• x is the independent (explanatory) variable, also called the predictor variable; and
• ε is the error term, the only random component in the model, and thus the only source of randomness in y.
Cont…
• μy|x is the mean of y when x is specified, also called the conditional mean of Y.
• α is the intercept of the systematic component of the regression relationship.
• β is the slope of the systematic component.
. reg father son

      Source |       SS         df       MS        Number of obs =        12
-------------+------------------------------       F(1, 10)      =      9.75
       Model |  41.8015703       1  41.8015703     Prob > F      =    0.0108
    Residual |  42.8650964      10  4.28650964     R-squared     =    0.4937
-------------+------------------------------       Adj R-squared =    0.4431
       Total |  84.6666667      11   7.6969697     Root MSE      =    2.0704

      father |     Coef.   Std. Err.     t    P>|t|    [95% Conf. Interval]
-------------+--------------------------------------------------------------
         son |  1.036403   .3318823    3.12   0.011    .2969227    1.775882
       _cons | -3.376874   22.43767   -0.15   0.883   -53.37113    46.61738
Picturing the Simple Linear Regression Model

[Figure: regression plot — the line μy|x = α + βx, with slope β, intercept α, and an observed point y lying a random error ε away from the line]

• The simple linear regression model posits an exact linear relationship between the expected (average) value of the dependent variable Y and the independent or predictor variable X:

  μy|x = α + βx

• Actual observed values of Y differ from the expected value μy|x by an unexplained or random error ε:

  y = μy|x + ε = α + βx + ε
Assumptions of the Simple Linear Regression Model

The LINE assumptions of the simple linear regression model:
• The relationship between X and Y is a straight-Line (linear) relationship.
• The values of the independent variable X are assumed fixed (not random); the only randomness in the values of Y comes from the error term ε.
• The errors ε are uncorrelated (i.e. Independent) in successive observations.
• The errors ε are Normally distributed with mean 0 and variance σ² (Equal variance). That is: ε ~ N(0, σ²).

[Figure: the regression line μy|x = α + βx with identical normal distributions of errors N(μy|x, σ²y|x), all centered on the line]
Fitting a Regression Line

[Figure: four panels — the data; three errors from a fitted line; three errors from the least squares regression line; errors from the least squares regression line are minimized]
Errors in Regression

[Figure: the fitted regression line ŷ = a + bx, the predicted value ŷi at xi, and the error ei = yi − ŷi, the vertical distance from the observed point to the line]
Sums of Squares, Cross Products, and Least Squares Estimators

Sums of squares and cross products:

lxx = ∑(x − x̄)² = ∑x² − (∑x)²/n
lyy = ∑(y − ȳ)² = ∑y² − (∑y)²/n
lxy = ∑(x − x̄)(y − ȳ) = ∑xy − (∑x)(∑y)/n

Least-squares regression estimators:

b = lxy / lxx
a = ȳ − b x̄
ŷ = a + bx
Example

Patient   x      y       x²        y²         x × y
1         22.4   134.0   501.76    17956.0    3001.60
4         25.1   80.2    630.01    6432.0     2013.02
8         32.4   97.2    1049.76   9447.8     3149.28
2         51.6   167.0   2662.56   27889.0    8617.20
3         58.1   132.3   3375.61   17503.3    7686.63
5         65.9   100.0   4342.81   10000.0    6590.00
7         75.3   187.2   5670.09   35043.8    14096.16
6         79.7   139.1   6352.09   19348.8    11086.27
10        85.7   199.4   7344.49   39760.4    17088.58
9         96.4   192.3   9292.96   36979.3    18537.72
Total     592.6  1428.7  41222.14  220360.5   91866.46

lxx = ∑x² − (∑x)²/n = 41222.14 − (592.6)²/10 = 6104.66
lyy = ∑y² − (∑y)²/n = 220360.47 − (1428.70)²/10 = 16242.10
lxy = ∑xy − (∑x)(∑y)/n = 91866.46 − (592.6 × 1428.7)/10 = 7201.70

b = lxy / lxx = 7201.70 / 6104.66 = 1.18
a = ȳ − b x̄ = 1428.7/10 − 1.18 × (592.6/10) = 72.96

Regression equation: ŷ = 72.96 + 1.18x
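The slide's arithmetic can be re-run in a few lines of Python (a sketch using the slides' l-notation as variable names):

```python
# The 10-patient data set from the example.
x = [22.4, 25.1, 32.4, 51.6, 58.1, 65.9, 75.3, 79.7, 85.7, 96.4]
y = [134.0, 80.2, 97.2, 167.0, 132.3, 100.0, 187.2, 139.1, 199.4, 192.3]

n = len(x)
# Computational forms of the sums of squares and cross products.
lxx = sum(xi * xi for xi in x) - sum(x) ** 2 / n
lxy = sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y) / n

# Least-squares estimators: b = lxy/lxx, a = ybar - b*xbar.
b = lxy / lxx
a = sum(y) / n - b * sum(x) / n

print(round(lxx, 2), round(lxy, 2))  # 6104.66 7201.7
print(round(b, 2), round(a, 2))      # 1.18 72.96
```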
Linear Regression - Variation

SST = SSR + SSE

SSR: variation due to regression (explained).
SSE: random/unexplained variation.
Linear Regression - Variation

SST = ∑(Yi − Ȳ)²    (total)
SSR = ∑(Ŷi − Ȳ)²    (regression)
SSE = ∑(Yi − Ŷi)²   (error)

[Figure: at Xi, the total deviation Yi − Ȳ splits into the regression part Ŷi − Ȳ and the error part Yi − Ŷi]
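The decomposition SST = SSR + SSE holds exactly for a least-squares fit. A short sketch verifying it on the patient data from the earlier example (the fit is recomputed here rather than using the slides' rounded 72.96 + 1.18x, so the identity holds to machine precision):

```python
# Patient data from the worked regression example.
x = [22.4, 25.1, 32.4, 51.6, 58.1, 65.9, 75.3, 79.7, 85.7, 96.4]
y = [134.0, 80.2, 97.2, 167.0, 132.3, 100.0, 187.2, 139.1, 199.4, 192.3]

n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n

# Least-squares fit: b = lxy/lxx, a = ybar - b*xbar.
b = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y)) / \
    sum((xi - xbar) ** 2 for xi in x)
a = ybar - b * xbar
yhat = [a + b * xi for xi in x]

sst = sum((yi - ybar) ** 2 for yi in y)                 # total
ssr = sum((yh - ybar) ** 2 for yh in yhat)              # regression
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))    # error

print(round(sst, 1), round(ssr + sse, 1))  # 16242.1 16242.1
```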
