
Numerical Methods (CISE-301)

Unit-5
(The Method of Least Squares)
Dr. Muhammad Majid Gulzar (CIE-KFUPM)
Contents (Unit-5):
1) Linear Regression (Sec 17.1)

2) Polynomial Regression (Sec 17.2)

3) Multiple Linear Regression (Sec 17.3)

Linear Regression (Sec 17.1)
Regression:

(Figure: the same data set fitted two ways — regression, which follows the general trend of the points, and interpolation, which passes through every point.)
Regression:
 Fits a selected function to the general trend of the data.

 An underlying mathematical model is selected based on the physical situation.

 The coefficients of the model are determined so that the error between the model values and
the given data is minimized.

 Applied when substantial error is associated with the data.


Regression:
 Let the approximation function be → y = f(x) = a0 x + a1

 The value at the k-th point satisfies → f(x_k) = y_k + e_k

 Error/Deviation/Residual:

   Error = Modeled Value − Measured Value → e_k = f(x_k) − y_k

 Average error → E_a = ( Σ_{k=1}^{n} (f(x_k) − y_k) ) / n
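As a minimal sketch of the definitions above (the function name and the data set here are illustrative, not from the slides), the residuals e_k and the average error E_a for a candidate line can be computed as:

```python
# Residuals e_k = f(x_k) - y_k for a candidate line y = f(x) = a0*x + a1.
# Illustrative data and coefficients (not from the course material).
def residuals(x, y, a0, a1):
    return [(a0 * xk + a1) - yk for xk, yk in zip(x, y)]

x = [0, 1, 2, 3]
y = [1.1, 2.9, 5.2, 6.8]
e = residuals(x, y, a0=2.0, a1=1.0)
avg_error = sum(e) / len(e)   # E_a = (sum of e_k) / n
print(e, avg_error)
```

Note that positive and negative residuals cancel in E_a, which is why the average error alone is a poor fitting criterion — here it is essentially zero even though no point lies on the line.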


Regression:
 Infinitely many lines can be drawn through these data points:

   x    y
   0    2
   2    3.2
   4    3.8
   5    4.6
   8    6.2
   9    6.8

(Figure: scatter plot of the data, through which many candidate straight lines could be drawn.)


Types of Regression Analysis:
 Linear Regression: a linear model is used to fit the data.

1) Simple: linear model with one independent variable → y = a0 + a1 x

   (x = data points; a0, a1 = unknown model coefficients)

2) Polynomial: the data are fitted with a higher-order polynomial.

   For a second-order polynomial → y = a0 + a1 x + a2 x²

   For an m-th order polynomial → y = a0 + a1 x + a2 x² + … + a_m x^m

3) Multiple: linear model with two or more independent variables.

   For two independent variables → y = a0 + a1 x1 + a2 x2

   For m independent variables → y = a0 + a1 x1 + a2 x2 + … + a_m x_m
Quantification of Error in Regression:
 Main objective: find the line that best represents the measured data.

   x    y (measured)
   1    0.5
   2    2.5
   3    2
   4    4
   5    3.5
   6    6
   7    5.5

(Figure: scatter plot of the measured data.)


Quantification of Error in Regression:
 Regression model: y = a0 + a1 x + e

 Modeled value: y_{i,model} = a0 + a1 x_i

 Error (residual): e_i = y_{i,measured} − y_{i,model} = y_i − a0 − a1 x_i

 Fitted coefficients: a0 = 0.07, a1 = 0.84

   x    y (measured)    y (model)
   1    0.5             0.91
   2    2.5             1.75
   3    2               2.59
   4    4               3.43
   5    3.5             4.27
   6    6               5.11
   7    5.5             5.95


Quantification of Error in Regression:
 Possible criteria for a "best" fit:

 Minimize the sum of the residuals.

 Minimize the sum of the absolute values of the residuals.

 Minimize the maximum error of any individual point (minimax).

None of these criteria yields a satisfactory, unique best-fit line; minimizing the sum of the squared residuals does, which is what motivates the least-squares approach that follows.
Quantification of Error in Regression:
 Note: judge the quality of a fit with r² (the coefficient of determination), not with r alone.

(Figure: measured points plotted against the fitted model, with a second panel showing the residuals, which scatter between about −1 and 1 around zero.)


Linear Regression:
 Sum of squares of residuals for n data points:

   Min S_r = Σ_{i=1}^{n} e_i² = Σ_{i=1}^{n} (y_{i,measured} − y_{i,model})² = Σ_{i=1}^{n} (y_i − a0 − a1 x_i)²

 Differentiating S_r w.r.t. each unknown coefficient and setting the result to zero:

   ∂S_r/∂a0 = −2 Σ (y_i − a0 − a1 x_i) = 0        ∂S_r/∂a1 = −2 Σ x_i (y_i − a0 − a1 x_i) = 0

 Expanding gives the normal equations:

   n a0 + a1 Σ x_i = Σ y_i
   a0 Σ x_i + a1 Σ x_i² = Σ x_i y_i


Linear Regression:
 The normal equations are two simultaneous linear equations in a0 and a1, written in matrix form as:

   [ n        Σ x_i   ] [ a0 ]   [ Σ y_i     ]
   [ Σ x_i    Σ x_i²  ] [ a1 ] = [ Σ x_i y_i ]
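The 2×2 normal-equation system can be solved in closed form with Cramer's rule, exactly as the worked examples do by hand. A minimal Python sketch (the function name is my own, not from the course material):

```python
def fit_line(x, y):
    """Least-squares line y = a0 + a1*x via the 2x2 normal equations."""
    n = len(x)
    sx = sum(x)
    sy = sum(y)
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    det = n * sxx - sx * sx            # determinant of the normal matrix
    a0 = (sy * sxx - sx * sxy) / det   # Cramer's rule: intercept
    a1 = (n * sxy - sx * sy) / det     # Cramer's rule: slope
    return a0, a1

# Data lying exactly on y = 2 + 3x, so the fit must recover a0 = 2, a1 = 3
a0, a1 = fit_line([0, 1, 2, 3], [2, 5, 8, 11])
print(a0, a1)  # → 2.0 3.0
```

This reproduces by machine exactly the tabulated sums (Σx, Σy, Σx², Σxy) that the examples below compute by hand.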


Linear Regression (Example):

   No.   x    y    x²     xy
   1     1    4    1      4
   2     3    5    9      15
   3     5    6    25     30
   4     7    5    49     35
   5     10   8    100    80
   6     12   7    144    84
   7     13   6    169    78
   8     16   9    256    144
   9     18   12   324    216
   Σ     85   62   1077   686      (n = 9)

Normal equations:

   [ 9     85   ] [ a0 ]   [ 62  ]
   [ 85    1077 ] [ a1 ] = [ 686 ]

Δ = 9(1077) − 85(85) = 9693 − 7225 = 2468

a0 = (62·1077 − 85·686)/Δ = (66774 − 58310)/2468 = 8464/2468 = 3.43

a1 = (9·686 − 85·62)/Δ = (6174 − 5270)/2468 = 904/2468 = 0.37

y = a0 + a1 x = 3.43 + 0.37x


Quantification of Error of LR:
 These quantities measure the "goodness" of a fit.

1) S_t: magnitude of the residual error of the dependent variable about its mean:

   S_t = Σ_{i=1}^{n} (y_i − ȳ)²,   where ȳ = mean(y)

   Total standard deviation: S_y = √( S_t / (n − 1) )

2) S_r: sum of the squares of the residuals around the regression line:

   S_r = Σ_{i=1}^{n} (y_i − y_{i,model})²

   Standard error of the estimate:

   S_{y/x} = √( S_r / (n − 2) )              (linear regression)
   S_{y/x} = √( S_r / (n − (m + 1)) )        (polynomial/multiple regression)

3) (S_t − S_r): quantifies the improvement, or error reduction, from describing the data
   with a straight line rather than an average value.

4) Coefficient of determination (r²): → r² = (S_t − S_r) / S_t
Quantification of Error of LR:
(Figure: two cases compared — a linear regression with large residual error and one with small residual error — contrasting the error w.r.t. the mean value against the error w.r.t. the regression line.)


Linear Regression (Example):

   x    y    y_model   (y − ȳ)²   (y − y_model)²
   1    4    3.80      8.35       0.04
   3    5    4.54      3.57       0.21
   5    6    5.28      0.79       0.52
   7    5    6.02      3.57       1.04
   10   8    7.13      1.23       0.76
   12   7    7.87      0.01       0.76
   13   6    8.24      0.79       5.02
   16   9    9.35      4.45       0.12
   18   12   10.09     26.11      3.65

ȳ = 6.89,   S_t = Σ(y_i − ȳ)² = 48.87,   S_r = Σ(y_i − y_{i,model})² = 12.12

r² = (S_t − S_r)/S_t = (48.87 − 12.12)/48.87 = 0.75

(Figure: measured data with the fitted line, labeled "Linear: y = 0.37x + 3.4, R² = 0.75".)
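The error statistics in this example can be reproduced directly from the fitted coefficients. A small sketch in pure Python (the function name is my own; the data and coefficients are the slide's, and the totals agree with the slide's values to rounding):

```python
import math

def error_stats(x, y, a0, a1):
    """S_t, S_r, r^2 and S_{y/x} for a linear fit y = a0 + a1*x."""
    n = len(y)
    ybar = sum(y) / n
    st = sum((yi - ybar) ** 2 for yi in y)                          # spread about the mean
    sr = sum((yi - (a0 + a1 * xi)) ** 2 for xi, yi in zip(x, y))    # spread about the line
    r2 = (st - sr) / st                      # coefficient of determination
    syx = math.sqrt(sr / (n - 2))            # standard error of the estimate
    return st, sr, r2, syx

x = [1, 3, 5, 7, 10, 12, 13, 16, 18]
y = [4, 5, 6, 5, 8, 7, 6, 9, 12]
st, sr, r2, syx = error_stats(x, y, a0=3.43, a1=0.37)
print(round(st, 2), round(sr, 2), round(r2, 2))  # → 48.89 12.11 0.75
```

The tiny differences from the tabulated 48.87 and 12.12 come from the slide rounding each row before summing.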
Linear Regression (Class Activity):

   No.   x    y      x²    xy
   1     1    0.5    1     0.5
   2     2    2.5    4     5.0
   3     3    2.0    9     6.0
   4     4    4.0    16    16.0
   5     5    3.5    25    17.5
   6     6    6.0    36    36.0
   7     7    5.5    49    38.5
   Σ     28   24.0   140   119.5      (n = 7)

Normal equations:

   [ 7     28  ] [ a0 ]   [ 24    ]
   [ 28    140 ] [ a1 ] = [ 119.5 ]

Δ = 7(140) − 28(28) = 980 − 784 = 196

a0 = (24·140 − 28·119.5)/Δ = (3360 − 3346)/196 = 14/196 = 0.071

a1 = (7·119.5 − 28·24)/Δ = (836.5 − 672)/196 = 164.5/196 = 0.839

y = a0 + a1 x = 0.071 + 0.839x


Linear Regression (Class Activity):

   x    y     y_model   (y − ȳ)²   (y − y_model)²
   1    0.5   0.91      8.58       0.16
   2    2.5   1.75      0.86       0.56
   3    2.0   2.59      2.04       0.34
   4    4.0   3.43      0.33       0.33
   5    3.5   4.27      0.01       0.59
   6    6.0   5.11      6.61       0.80
   7    5.5   5.94      4.29       0.20

ȳ = 3.43,   S_t = 22.71,   S_r = 2.99

Total standard deviation:
   S_y = √( S_t / (n − 1) ) = √( 22.71 / (7 − 1) ) = 1.95

Standard error of the estimate:
   S_{y/x} = √( S_r / (n − 2) ) = √( 2.99 / (7 − 2) ) = 0.77

Coefficient of determination:
   r² = (S_t − S_r)/S_t = (22.71 − 2.99)/22.71 = 0.87


Polynomial Regression (Sec 17.2)
Polynomial Regression:
 Sum of squares of residuals for n data points (second-order polynomial):

   Min S_r = Σ_{i=1}^{n} e_i² = Σ_{i=1}^{n} (y_{i,measured} − y_{i,model})² = Σ_{i=1}^{n} (y_i − a0 − a1 x_i − a2 x_i²)²

 Differentiating S_r w.r.t. each unknown coefficient and setting the result to zero:

   ∂S_r/∂a0 = −2 Σ (y_i − a0 − a1 x_i − a2 x_i²) = 0
   ∂S_r/∂a1 = −2 Σ x_i (y_i − a0 − a1 x_i − a2 x_i²) = 0
   ∂S_r/∂a2 = −2 Σ x_i² (y_i − a0 − a1 x_i − a2 x_i²) = 0

 Expanding gives the normal equations:

   n a0 + a1 Σ x_i + a2 Σ x_i² = Σ y_i
   a0 Σ x_i + a1 Σ x_i² + a2 Σ x_i³ = Σ y_i x_i
   a0 Σ x_i² + a1 Σ x_i³ + a2 Σ x_i⁴ = Σ y_i x_i²


Polynomial Regression:
 For a second-order polynomial and n data points, in matrix form:

   [ n       Σ x_i    Σ x_i²  ] [ a0 ]   [ Σ y_i      ]
   [ Σ x_i   Σ x_i²   Σ x_i³  ] [ a1 ] = [ Σ y_i x_i  ]
   [ Σ x_i²  Σ x_i³   Σ x_i⁴  ] [ a2 ]   [ Σ y_i x_i² ]

 For an m-th order polynomial and n data points:

   n a0 + a1 Σ x_i + a2 Σ x_i² + … + a_m Σ x_i^m = Σ y_i
   a0 Σ x_i + a1 Σ x_i² + a2 Σ x_i³ + … + a_m Σ x_i^{m+1} = Σ y_i x_i
   a0 Σ x_i² + a1 Σ x_i³ + a2 Σ x_i⁴ + … + a_m Σ x_i^{m+2} = Σ y_i x_i²
   ⋮
   a0 Σ x_i^m + a1 Σ x_i^{m+1} + a2 Σ x_i^{m+2} + … + a_m Σ x_i^{2m} = Σ y_i x_i^m

   or in matrix form:

   [ n          Σ x_i        Σ x_i²       …   Σ x_i^m     ] [ a0 ]   [ Σ y_i       ]
   [ Σ x_i      Σ x_i²       Σ x_i³       …   Σ x_i^{m+1} ] [ a1 ]   [ Σ y_i x_i   ]
   [ Σ x_i²     Σ x_i³       Σ x_i⁴       …   Σ x_i^{m+2} ] [ a2 ] = [ Σ y_i x_i²  ]
   [ ⋮          ⋮            ⋮                ⋮           ] [ ⋮  ]   [ ⋮           ]
   [ Σ x_i^m    Σ x_i^{m+1}  Σ x_i^{m+2}  …   Σ x_i^{2m}  ] [ a_m]   [ Σ y_i x_i^m ]


Polynomial Regression (Example):

   No.   x    y      x²   x³    xy      x⁴    x²y
   1     0    2.1    0    0     0       0     0
   2     1    7.7    1    1     7.7     1     7.7
   3     2    13.6   4    8     27.2    16    54.4
   4     3    27.2   9    27    81.6    81    244.8
   5     4    40.9   16   64    163.6   256   654.4
   6     5    61.1   25   125   305.5   625   1527.5
   Σ     15   152.6  55   225   585.6   979   2488.8      (n = 6)

Normal equations:

   [ 6    15    55  ] [ a0 ]   [ 152.6  ]
   [ 15   55    225 ] [ a1 ] = [ 585.6  ]
   [ 55   225   979 ] [ a2 ]   [ 2488.8 ]

Δ = det of the normal matrix = 3920

a0 = 9716/Δ = 2.48      (numerator: det with column 1 replaced by the right-hand side)
a1 = 9248.4/Δ = 2.36    (numerator: det with column 2 replaced by the right-hand side)
a2 = 7294/Δ = 1.86      (numerator: det with column 3 replaced by the right-hand side)

y = a0 + a1 x + a2 x² = 2.48 + 2.36x + 1.86x²
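A quick way to check this worked example is to build the 3×3 normal-equation system from the data and solve it numerically. A minimal sketch in pure Python (the helper names are my own), using Gaussian elimination with partial pivoting instead of Cramer's rule:

```python
def solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]   # augmented matrix
    for k in range(n):
        # pivot: bring the row with the largest entry in column k to row k
        p = max(range(k, n), key=lambda r: abs(M[r][k]))
        M[k], M[p] = M[p], M[k]
        for r in range(k + 1, n):
            f = M[r][k] / M[k][k]
            for c in range(k, n + 1):
                M[r][c] -= f * M[k][c]
    x = [0.0] * n
    for k in range(n - 1, -1, -1):                 # back substitution
        x[k] = (M[k][n] - sum(M[k][c] * x[c] for c in range(k + 1, n))) / M[k][k]
    return x

def polyfit2(x, y):
    """Second-order least-squares fit via the normal equations."""
    n = len(x)
    s = lambda p: sum(xi ** p for xi in x)                      # Σ x_i^p
    t = lambda p: sum((xi ** p) * yi for xi, yi in zip(x, y))   # Σ x_i^p y_i
    A = [[n,    s(1), s(2)],
         [s(1), s(2), s(3)],
         [s(2), s(3), s(4)]]
    b = [t(0), t(1), t(2)]
    return solve(A, b)

a0, a1, a2 = polyfit2([0, 1, 2, 3, 4, 5], [2.1, 7.7, 13.6, 27.2, 40.9, 61.1])
print(round(a0, 2), round(a1, 2), round(a2, 2))  # → 2.48 2.36 1.86
```

Gaussian elimination scales to any polynomial order, whereas the hand computation with 3×3 determinants becomes impractical beyond the quadratic case.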
Polynomial Regression (Example):

   x    y      y_model   (y − ȳ)²   (y − y_model)²
   0    2.1    2.48      544.29     0.14
   1    7.7    6.70      314.35     1.00
   2    13.6   14.64     139.95     1.08
   3    27.2   26.30     3.13       0.81
   4    40.9   41.68     239.32     0.61
   5    61.1   60.78     1272.35    0.10

ȳ = 25.43,   S_t = Σ(y_i − ȳ)² = 2513.39,   S_r = Σ(y_i − y_{i,model})² = 3.74

r² = (S_t − S_r)/S_t = (2513.39 − 3.74)/2513.39 = 0.99

(Figure: measured data with the fitted curve, labeled "Quadratic: y = 1.86x² + 2.36x + 2.48, R² = 0.999".)
Polynomial Regression (Example):
Using S_t = 2513.39 and S_r = 3.74 from the table above:

Standard error of the estimate (m = 2):
   S_{y/x} = √( S_r / (n − (m + 1)) ) = √( 3.74 / (6 − 3) ) = 1.12

Coefficient of determination:
   r² = (S_t − S_r)/S_t = (2513.39 − 3.74)/2513.39 = 0.99


Multiple Linear Regression (Sec 17.3)
Multiple Linear Regression:
 Sum of squares of residuals for n data points (two independent variables):

   Min S_r = Σ_{i=1}^{n} e_i² = Σ_{i=1}^{n} (y_{i,measured} − y_{i,model})² = Σ_{i=1}^{n} (y_i − a0 − a1 x_{1i} − a2 x_{2i})²

 Differentiating S_r w.r.t. each unknown coefficient and setting the result to zero:

   ∂S_r/∂a0 = −2 Σ (y_i − a0 − a1 x_{1i} − a2 x_{2i}) = 0
   ∂S_r/∂a1 = −2 Σ x_{1i} (y_i − a0 − a1 x_{1i} − a2 x_{2i}) = 0
   ∂S_r/∂a2 = −2 Σ x_{2i} (y_i − a0 − a1 x_{1i} − a2 x_{2i}) = 0

 Expanding gives the normal equations:

   n a0 + a1 Σ x_{1i} + a2 Σ x_{2i} = Σ y_i
   a0 Σ x_{1i} + a1 Σ x_{1i}² + a2 Σ x_{1i} x_{2i} = Σ y_i x_{1i}
   a0 Σ x_{2i} + a1 Σ x_{2i} x_{1i} + a2 Σ x_{2i}² = Σ y_i x_{2i}


Multiple Linear Regression:
 For two independent variables and n data points, in matrix form:

   [ n          Σ x_{1i}          Σ x_{2i}        ] [ a0 ]   [ Σ y_i        ]
   [ Σ x_{1i}   Σ x_{1i}²         Σ x_{1i} x_{2i} ] [ a1 ] = [ Σ y_i x_{1i} ]
   [ Σ x_{2i}   Σ x_{2i} x_{1i}   Σ x_{2i}²       ] [ a2 ]   [ Σ y_i x_{2i} ]

 For m independent variables and n data points:

   n a0 + a1 Σ x_{1i} + a2 Σ x_{2i} + … + a_m Σ x_{mi} = Σ y_i
   a0 Σ x_{1i} + a1 Σ x_{1i}² + a2 Σ x_{1i} x_{2i} + … + a_m Σ x_{1i} x_{mi} = Σ y_i x_{1i}
   a0 Σ x_{2i} + a1 Σ x_{2i} x_{1i} + a2 Σ x_{2i}² + … + a_m Σ x_{2i} x_{mi} = Σ y_i x_{2i}
   ⋮
   a0 Σ x_{mi} + a1 Σ x_{1i} x_{mi} + a2 Σ x_{2i} x_{mi} + … + a_m Σ x_{mi}² = Σ y_i x_{mi}

   or in matrix form:

   [ n          Σ x_{1i}          Σ x_{2i}          …   Σ x_{mi}        ] [ a0 ]   [ Σ y_i        ]
   [ Σ x_{1i}   Σ x_{1i}²         Σ x_{1i} x_{2i}   …   Σ x_{1i} x_{mi} ] [ a1 ]   [ Σ y_i x_{1i} ]
   [ Σ x_{2i}   Σ x_{2i} x_{1i}   Σ x_{2i}²         …   Σ x_{2i} x_{mi} ] [ a2 ] = [ Σ y_i x_{2i} ]
   [ ⋮          ⋮                 ⋮                     ⋮               ] [ ⋮  ]   [ ⋮            ]
   [ Σ x_{mi}   Σ x_{1i} x_{mi}   Σ x_{2i} x_{mi}   …   Σ x_{mi}²       ] [ a_m]   [ Σ y_i x_{mi} ]


Multiple Linear Regression (Example):

   No.   x1   x2   y     x1²   x1x2   x1y   x2²   x2y
   1     0    0    14    0     0      0     0     0
   2     0    2    21    0     0      0     4     42
   3     1    2    11    1     2      11    4     22
   4     2    4    12    4     8      24    16    48
   5     0    4    23    0     0      0     16    92
   6     1    6    23    1     6      23    36    138
   Σ     4    18   104   6     16     58    76    342      (n = 6)

Normal equations:

   [ 6    4    18 ] [ a0 ]   [ 104 ]
   [ 4    6    16 ] [ a1 ] = [ 58  ]
   [ 18   16   76 ] [ a2 ]   [ 342 ]

Δ = det of the normal matrix = 344

a0 = 4824/Δ = 14.02
a1 = −2216/Δ = −6.44
a2 = 872/Δ = 2.53

y = a0 + a1 x1 + a2 x2 = 14.02 − 6.44x1 + 2.53x2
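The same machine check works here: assemble the 3×3 normal equations for two regressors and solve them by Cramer's rule, just as the slide does. A minimal pure-Python sketch (the helper names are my own):

```python
def cramer3(A, b):
    """Solve a 3x3 system by Cramer's rule, as done on the slide."""
    det3 = lambda M: (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
                      - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
                      + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))
    d = det3(A)
    xs = []
    for j in range(3):
        Aj = [row[:] for row in A]
        for i in range(3):
            Aj[i][j] = b[i]            # replace column j with the right-hand side
        xs.append(det3(Aj) / d)
    return xs

def fit_mlr2(x1, x2, y):
    """Least-squares fit y = a0 + a1*x1 + a2*x2 via the normal equations."""
    n = len(y)
    dot = lambda u, v: sum(ui * vi for ui, vi in zip(u, v))
    A = [[n,        sum(x1),      sum(x2)],
         [sum(x1),  dot(x1, x1),  dot(x1, x2)],
         [sum(x2),  dot(x1, x2),  dot(x2, x2)]]
    b = [sum(y), dot(x1, y), dot(x2, y)]
    return cramer3(A, b)

a0, a1, a2 = fit_mlr2([0, 0, 1, 2, 0, 1], [0, 2, 2, 4, 4, 6],
                      [14, 21, 11, 12, 23, 23])
print(round(a0, 2), round(a1, 2), round(a2, 2))  # → 14.02 -6.44 2.53
```
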
Multiple Linear Regression (Example):

   x1   x2   y    y_model   (y − ȳ)²   (y − y_model)²
   0    0    14   14.02     11.09      0.00
   0    2    21   19.08     13.47      3.69
   1    2    11   12.64     40.07      2.69
   2    4    12   11.26     28.41      0.55
   0    4    23   24.14     32.15      1.30
   1    6    23   22.76     32.15      0.06

ȳ = 17.33,   S_t = 157.34,   S_r = 8.29

r² = (S_t − S_r)/S_t = (157.34 − 8.29)/157.34 = 0.95


Multiple Linear Regression (Class Activity):

   No.   x1     x2   y    x1²     x1x2   x1y     x2²   x2y
   1     0      0    5    0       0      0       0     0
   2     2      1    10   4       2      20      1     10
   3     2.5    2    9    6.25    5      22.5    4     18
   4     1      3    0    1       3      0       9     0
   5     4      6    3    16      24     12      36    18
   6     7      2    27   49      14     189     4     54
   Σ     16.5   14   54   76.25   48     243.5   54    100      (n = 6)

Normal equations:

   [ 6      16.5    14 ] [ a0 ]   [ 54    ]
   [ 16.5   76.25   48 ] [ a1 ] = [ 243.5 ]
   [ 14     48      54 ] [ a2 ]   [ 100   ]

Δ = det of the normal matrix = 3410.5

a0 = 17052.5/Δ = 5
a1 = 13642/Δ = 4
a2 = −10231.5/Δ = −3

y = a0 + a1 x1 + a2 x2 = 5 + 4x1 − 3x2
Multiple Linear Regression (Class Activity):

   x1    x2   y    y_model   (y − ȳ)²   (y − y_model)²
   0     0    5    5         16         0
   2     1    10   10        1          0
   2.5   2    9    9         0          0
   1     3    0    0         81         0
   4     6    3    3         36         0
   7     2    27   27        324        0

ȳ = 9,   S_t = 458,   S_r = 0

Standard error of the estimate (m = 2):
   S_{y/x} = √( S_r / (n − (m + 1)) ) = √( 0 / (6 − 3) ) = 0

Coefficient of determination:
   r² = (S_t − S_r)/S_t = (458 − 0)/458 = 1

(The data lie exactly on the plane y = 5 + 4x1 − 3x2, so the fit is perfect and all residuals vanish.)
