12.1 Correlation and Simple Linear Regression

Correlation is a statistical technique used to determine the degree to which two variables are related.
12 March 2025 1
Scatter diagram
• Rectangular coordinate
• Two quantitative variables
• One variable is called independent (X)
and the second is called dependent (Y)
• Points are not joined
• No frequency table

[Scatter diagram: individual points plotted on X–Y axes, not joined]
Example

Wt (kg):    67  69  85  83  74  81  97  92  114  85
SBP (mmHg): 120 125 140 160 130 180 150 140 200  130
[Scatter diagram: SBP (mmHg) on the Y axis against Wt (kg) on the X axis, plotted from the data above]
Positive relationship
[Scatter diagram: height in cm rises with age, a positive relationship]

Negative relationship

[Scatter diagram: reliability falls as age of car increases, a negative relationship]
No relation
Correlation Coefficient
Simple Correlation coefficient (r)
The sign of r denotes the nature of the association.
If the sign is +ve, the relation is direct: an increase in one variable is associated with an increase in the other variable, and a decrease in one variable is associated with a decrease in the other variable. If the sign is -ve, the relation is inverse (indirect): an increase in one variable is associated with a decrease in the other.
How to compute the simple correlation coefficient (r)

r = [Σxy − (Σx)(Σy)/n] / √{[Σx² − (Σx)²/n] · [Σy² − (Σy)²/n]}
Example:

Serial No   Age (years)   Weight (Kg)
1           7             12
2           6             8
3           8             12
4           5             10
5           6             11
6           9             13
These two variables are of the quantitative type: one variable (age) is the independent variable, denoted (X), and the other (weight) is the dependent variable, denoted (Y). To find the relation between age and weight, compute the simple correlation coefficient using the following formula:

r = [Σxy − (Σx)(Σy)/n] / √{[Σx² − (Σx)²/n] · [Σy² − (Σy)²/n]}
Serial No   Age (X)   Weight (Y)   XY    X²    Y²
1           7         12           84    49    144
2           6         8            48    36    64
3           8         12           96    64    144
4           5         10           50    25    100
5           6         11           66    36    121
6           9         13           117   81    169
Total       Σx = 41   Σy = 66      Σxy = 461   Σx² = 291   Σy² = 742
r = [461 − (41)(66)/6] / √{[291 − (41)²/6] · [742 − (66)²/6]}

r = 0.759

There is a strong direct correlation between age and weight.
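The computation above can be sketched directly from the formula; the function name is my own, used only for this check of the age/weight example:

```python
# Sketch of the computational formula for the simple correlation
# coefficient r, applied to the age/weight example above.
from math import sqrt

def pearson_r(x, y):
    # r = [Σxy − (Σx)(Σy)/n] / √{[Σx² − (Σx)²/n][Σy² − (Σy)²/n]}
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    sxx = sum(xi * xi for xi in x)
    syy = sum(yi * yi for yi in y)
    return (sxy - sx * sy / n) / sqrt((sxx - sx**2 / n) * (syy - sy**2 / n))

age = [7, 6, 8, 5, 6, 9]          # X
weight = [12, 8, 12, 10, 11, 13]  # Y
r = pearson_r(age, weight)        # ≈ 0.759, a strong direct correlation
```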
EXAMPLE: Relationship between Anxiety and Test Scores

Anxiety (X)   Test score (Y)   X²    Y²    XY
10            2                100   4     20
8             3                64    9     24
2             9                4     81    18
1             7                1     49    7
5             6                25    36    30
6             5                36    25    30
ΣX = 32       ΣY = 32          ΣX² = 230   ΣY² = 204   ΣXY = 129
Calculating Correlation Coefficient

r = [129 − (32)(32)/6] / √{[230 − (32)²/6] · [204 − (32)²/6]}

r = − 0.94

There is a strong indirect (inverse) correlation between anxiety and test score.
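The hand computation can be cross-checked with NumPy (a check I've added, not part of the original example):

```python
# Cross-check of the anxiety/test-score example with NumPy.
import numpy as np

anxiety = np.array([10, 8, 2, 1, 5, 6])  # X
score = np.array([2, 3, 9, 7, 6, 5])     # Y
r = np.corrcoef(anxiety, score)[0, 1]    # ≈ -0.94
```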
Spearman Rank Correlation Coefficient (rs)
Procedure:

1. Rank the values of X from 1 to n (average the ranks of ties).
2. Rank the values of Y from 1 to n (average the ranks of ties).
3. Compute the difference between each pair of ranks: di.
4. Square each difference: di².
5. Apply the following formula:

rs = 1 − [6 Σ(di)²] / [n(n² − 1)]
Example

In a study of the relationship between level of injury and income, the following data were obtained. Find the relationship between them and comment.

Sample   Level of injury (X)   Income (Y)
A        Moderate              25
B        Mild                  10
C        Fatal                 8
D        Severe                10
E        Severe                15
F        Normal                50
G        Fatal                 60
Answer:

Sample   X          Y    Rank (X)   Rank (Y)   di     di²
A        Moderate   25   5          3          2      4
B        Mild       10   6          5.5        0.5    0.25
C        Fatal      8    1.5        7          −5.5   30.25
D        Severe     10   3.5        5.5        −2     4
E        Severe     15   3.5        4          −0.5   0.25
F        Normal     50   7          2          5      25
G        Fatal      60   1.5        1          0.5    0.25
                                               Σdi² = 64
rs = 1 − (6 × 64) / [7(7² − 1)] = 1 − 384/336 = −0.14

Comment:
There is an indirect weak correlation between level of injury and income.
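The ranking-and-formula procedure can be sketched in Python; the numeric severity scores below are my own assumption, used only to order the injury levels as the worked table does (most severe ranked first, highest income ranked first):

```python
# Sketch of the Spearman rank procedure for the injury/income example,
# averaging the ranks of ties as in the worked table.
def avg_ranks_desc(values):
    # Rank in descending order of magnitude; tied values share the
    # average of the ranks they occupy.
    order = sorted(values, reverse=True)
    return [(2 * order.index(v) + order.count(v) + 1) / 2 for v in values]

# Assumed numeric ordering of injury levels (not from the slides).
severity = {"Normal": 1, "Mild": 2, "Moderate": 3, "Severe": 4, "Fatal": 5}
injury = ["Moderate", "Mild", "Fatal", "Severe", "Severe", "Normal", "Fatal"]
income = [25, 10, 8, 10, 15, 50, 60]

rx = avg_ranks_desc([severity[i] for i in injury])
ry = avg_ranks_desc(income)
d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))  # Σdi² = 64
n = len(income)
rs = 1 - 6 * d2 / (n * (n**2 - 1))              # ≈ -0.14
```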
Exercise
What is regression analysis?
• An extension of correlation
• A way of measuring the relationship
between two or more variables.
• Used to calculate the extent to which one variable (the DV) changes when other variable(s) (the IV(s)) change.
• Used to help understand possible causal
effects of one variable on another.
What is linear regression (LR)?
• Involves:
– one predictor (IV) and
– one outcome (DV)
• Explains a relationship using a straight line
fit to the data.
Least squares criterion
Least-Squares Regression
The most common method for fitting a regression line is the method of least squares.
This method calculates the best-fitting line for
the observed data by minimizing the sum of the
squares of the vertical deviations from each
data point to the line (if a point lies on the fitted
line exactly, then its vertical deviation is 0).
Because the deviations are first squared, then
summed, there are no cancellations between
positive and negative values.
Linear Regression - Model

Population:  Yi = β0 + β1Xi + εi
Sample:      Yi = b0 + b1Xi + ei,  with fitted line Ŷ = b0 + b1Xi

[Diagram: the actual value Yi lies a vertical distance ei from the fitted line Ŷ = b0 + b1X]
Simple Linear Regression Model

• The population simple linear regression model:

y = α + βx + ε     or     μy|x = α + βx

(α + βx is the nonrandom or systematic component; ε is the random component)

• Where
• y is the dependent (response) variable, the variable we wish to explain or predict;
• x is the independent (explanatory) variable, also called the predictor variable; and
• ε is the error term, the only random component in the model, and thus the only source of randomness in y.
Cont…

• μy|x is the mean of y when x is specified, also called the conditional mean of Y.
[Stata example: . reg father son]

The model separates the dependent variable Y into two parts:
• the expected or average value of Y given the independent or predictor variable X: μy|x = α + βx, where α is the intercept and β is the slope; and
• an unexplained or random error (ε).

That is: y = μy|x + ε = α + βx + ε
Assumptions of the Simple Linear Regression Model

• The relationship between X and Y is a straight-line (linear) relationship.
• The values of the independent variable X are assumed fixed (not random); the only randomness in the values of Y comes from the error term ε.
• The errors are uncorrelated (i.e. independent) in successive observations.
• The errors are normally distributed with mean 0 and variance σ² (equal variance). That is: ε ~ N(0, σ²).

These are the LINE assumptions of the Simple Linear Regression Model. Graphically: identical normal distributions of errors, N(μy|x, σy|x²), all centered on the regression line μy|x = α + βx.
Fitting a Regression Line

[Diagram: data points scattered about the least-squares regression line, with the vertical errors marked]

ŷ = a + bx is the fitted regression line;
ŷi is the predicted value of Y for xi;
the error is ei = yi − ŷi.
Sums of Squares, Cross Products, and Least Squares Estimators

Sums of squares and cross products:

lxx = Σ(x − x̄)² = Σx² − (Σx)²/n
lyy = Σ(y − ȳ)² = Σy² − (Σy)²/n
lxy = Σ(x − x̄)(y − ȳ) = Σxy − (Σx)(Σy)/n

Least-squares regression estimators:

b = lxy / lxx
a = ȳ − b x̄
ŷ = a + bx
Example

Patient   x      y       x²         y²          x·y
1         22.4   134.0   501.76     17956.0     3001.60
4         25.1   80.2    630.01     6432.0      2013.02
8         32.4   97.2    1049.76    9447.8      3149.28
2         51.6   167.0   2662.56    27889.0     8617.20
3         58.1   132.3   3375.61    17503.3     7686.63
5         65.9   100.0   4342.81    10000.0     6590.00
7         75.3   187.2   5670.09    35043.8     14096.16
6         79.7   139.1   6352.09    19348.8     11086.27
10        85.7   199.4   7344.49    39760.4     17088.58
9         96.4   192.3   9292.96    36979.3     18537.72
Total     592.6  1428.7  41222.14   220360.5    91866.46

lxx = Σx² − (Σx)²/n = 41222.14 − (592.6)²/10 = 6104.66
lyy = Σy² − (Σy)²/n = 220360.47 − (1428.70)²/10 = 16242.10
lxy = Σxy − (Σx)(Σy)/n = 91866.46 − (592.6)(1428.70)/10 = 7201.70

b = lxy / lxx = 7201.70 / 6104.66 = 1.18
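The same least-squares computations can be sketched from the raw patient data:

```python
# Sketch of the least-squares sums and estimators for the patient example.
x = [22.4, 25.1, 32.4, 51.6, 58.1, 65.9, 75.3, 79.7, 85.7, 96.4]
y = [134.0, 80.2, 97.2, 167.0, 132.3, 100.0, 187.2, 139.1, 199.4, 192.3]
n = len(x)

lxx = sum(xi * xi for xi in x) - sum(x) ** 2 / n               # ≈ 6104.66
lyy = sum(yi * yi for yi in y) - sum(y) ** 2 / n               # ≈ 16242.10
lxy = sum(xi * yi for xi, yi in zip(x, y)) - sum(x) * sum(y) / n  # ≈ 7201.70

b = lxy / lxx                     # slope ≈ 1.18
a = sum(y) / n - b * sum(x) / n   # intercept: a = ȳ − b·x̄
```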
Linear Regression - Variation

The total variation (SST) splits into:
• SSR - due to regression (explained).
• SSE - random/unexplained.
SST = Σ(Yi − Ȳ)²   (total variation)
SSR = Σ(Ŷi − Ȳ)²   (explained by the regression)
SSE = Σ(Yi − Ŷi)²   (random/unexplained)

SST = SSR + SSE

[Diagram: at each Xi, the deviation Yi − Ȳ splits into the part explained by the line, Ŷi − Ȳ, and the residual Yi − Ŷi]
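The decomposition SST = SSR + SSE can be verified numerically; this sketch reuses the patient data fitted earlier:

```python
# Sketch verifying SST = SSR + SSE for the least-squares fit
# of the patient example.
x = [22.4, 25.1, 32.4, 51.6, 58.1, 65.9, 75.3, 79.7, 85.7, 96.4]
y = [134.0, 80.2, 97.2, 167.0, 132.3, 100.0, 187.2, 139.1, 199.4, 192.3]
n = len(x)
xbar, ybar = sum(x) / n, sum(y) / n

# Least-squares slope and intercept via the deviation form.
b = (sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
     / sum((xi - xbar) ** 2 for xi in x))
a = ybar - b * xbar
yhat = [a + b * xi for xi in x]

sst = sum((yi - ybar) ** 2 for yi in y)                 # total
ssr = sum((yh - ybar) ** 2 for yh in yhat)              # regression
sse = sum((yi - yh) ** 2 for yi, yh in zip(y, yhat))    # error
# For a least-squares fit, sst == ssr + sse (up to rounding).
```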