REGRESSION

AGENDA:

1. What is regression, and is there any relevance to ANOVA?
2. Difference between correlation and regression?
3. Difference between regression analysis and a regression model in ML?
4. Different types of regression?
5. What is the equation of a line?
6. What is sum of squared error?
7. What does the curve of a quadratic equation look like?
8. Optimization in linear regression (LR)
9. Evaluating LR models?
10. Is there any problem/branch in Data Analytics involving regression?

Credits: Presentation also contains slides from DeepLearning.AI.
CORRELATION vs REGRESSION; ANOVA vs REGRESSION; Correlation vs Covariance

Pearson's correlation coefficient, when applied to a population, is commonly represented by the Greek letter ρ (rho) and is defined as

ρ(X, Y) = Cov(X, Y) / (σX σY)

i.e., the covariance of X and Y divided by the product of their standard deviations.
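As a quick illustration, here is a numpy sketch of this definition applied to a sample; the data values are placeholders, not data from the slides:

import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0])
y = np.array([2.1, 3.9, 6.2, 8.0])

# Pearson's correlation: covariance divided by the product of standard deviations.
r = np.cov(x, y, bias=True)[0, 1] / (np.std(x) * np.std(y))
print(r, np.corrcoef(x, y)[0, 1])  # both give the same value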
LINEAR REGRESSION (LR)

A linear regression model can be defined as a function approximation that represents a continuous response variable as a function of one or more predictor variables. While building a linear regression model, the goal is to identify a linear equation that best predicts or models the relationship between the response or dependent variable (y) and one or more predictor or independent variables (here, x).

The formula for a straight line is usually written like this:

y = mx + c

The two variables are x and y, and we have two coefficients, m and c. The coefficient m represents the slope of the line, and the coefficient c represents the y-intercept of the line. Slope: a slope of m means that if you increase the x-value by 1 unit, then the y-value goes up by m units; a negative slope means that the y-value would go down rather than up.

If Y is the outcome variable (the DV) and X is the predictor variable (the IV), then the formula that describes our regression line is written like this:

Ŷi = b0 + b1 Xi

Here b0 always refers to the intercept term, b1 refers to the slope, and Ŷi refers to the estimate or prediction that our regression line is making. We call the difference between the model prediction and the actual data point a residual, and we refer to it as ϵi. Thus our LR model is:

Yi = b0 + b1 Xi + ϵi
What is sum of squared error (SSE or RSS)?

RSS = Σ (yi − ŷi)²

In the case of a perfect fit, the RSS would be 0, meaning every estimated value is the same as the actual value.
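As a one-line illustration in Python (the arrays are hypothetical values, not data from the slides):

import numpy as np

# Hypothetical observed values and model predictions.
y_actual = np.array([2.0, 5.0, 3.0])
y_pred = np.array([2.8, 3.3, 3.8])

# RSS: sum of squared differences; 0 only when every prediction is exact.
rss = np.sum((y_actual - y_pred) ** 2)
print(rss)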
OLS?
The ordinary least squares (OLS) method can be defined as a linear regression technique used to estimate the unknown parameters in a model. The method relies on minimizing the sum of squared residuals between the actual (observed) values of the dependent variable and the values predicted by the model. The residual is defined as the difference between the actual value and the predicted value; another word for residual is error. The sum of the squared differences is also known as the residual sum of squares (RSS). The OLS method minimizes the RSS by finding the values of the coefficients that result in the smallest possible RSS. The resulting line is called the regression line, which represents the best fit for the data.
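As a concrete sketch, here is OLS in numpy on the three points (1, 2), (2, 5), (3, 3) that the powerline example later in this deck uses; np.linalg.lstsq minimizes the residual sum of squares directly:

import numpy as np

x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 5.0, 3.0])

# Design matrix: one column for the slope, one column of ones for the intercept.
X = np.column_stack([x, np.ones_like(x)])

# lstsq returns the coefficients minimizing ||y - X @ [m, b]||², i.e. the RSS.
(m, b), rss, *_ = np.linalg.lstsq(X, y, rcond=None)
print(m, b, rss)  # expected: 0.5, about 2.333 (7/3), RSS ≈ 4.167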
Fitting the line – best fit

Types of regression
ALTERNATIVES TO OLS

While OLS is a popular method for estimating linear regression models, there are several alternative methods that can be used depending on the specific requirements of the analysis. Let's discuss some of the popular alternatives to OLS (a code sketch follows this list):
• Ridge regression
• Lasso regression
• Elastic net regression

Ridge Regression
Ridge regression adds a penalty term to the OLS cost function to prevent overfitting in scenarios where there are many independent variables or the independent variables are highly correlated. The penalty, controlled by a shrinkage parameter, reduces the magnitude of the coefficients and can help prevent the model from being too complex.

Lasso Regression
Lasso regression is similar to ridge regression, but it adds a penalty term that can result in some of the coefficients being set to zero. This can help simplify the model and reduce the risk of overfitting.

Elastic Net Regression
Elastic net regression is a combination of ridge and lasso regression that adds both an L1 and an L2 penalty term to the OLS cost function. This method can help balance the advantages of both methods and can be particularly useful when there are many independent variables with varying degrees of importance.
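For illustration, a minimal scikit-learn sketch of the three alternatives; the toy data, the alpha values, and the l1_ratio are arbitrary assumptions, not values from the slides:

import numpy as np
from sklearn.linear_model import Ridge, Lasso, ElasticNet

# Toy data with two nearly collinear predictors (an arbitrary example).
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
X[:, 2] = X[:, 0] + 0.01 * rng.normal(size=50)
y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(size=50)

# The L2 penalty shrinks coefficients, L1 can zero some out, elastic net mixes both.
for model in (Ridge(alpha=1.0), Lasso(alpha=0.1), ElasticNet(alpha=0.1, l1_ratio=0.5)):
    model.fit(X, y)
    print(type(model).__name__, model.coef_.round(3))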
Evaluating LR model
We need to be able to measure how good our model is (accuracy). There are many methods to achieve this, but we may use Root Mean Squared Error and the coefficient of determination (R² score).

Root Mean Squared Error is the square root of the sum of squared errors divided by the number of values. Mathematically,

RMSE = √( (1/n) Σ (yi − ŷi)² )

Here ŷi is the i-th predicted output value.
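A minimal Python sketch of both metrics, using placeholder arrays rather than data from the slides:

import numpy as np

# Placeholder actual and predicted values for illustration.
y = np.array([2.0, 5.0, 3.0])
y_hat = np.array([2.83, 3.33, 3.83])

# RMSE: square root of the mean of squared errors.
rmse = np.sqrt(np.mean((y - y_hat) ** 2))

# R²: 1 minus the ratio of the model's squared error to a mean-only baseline.
ss_res = np.sum((y - y_hat) ** 2)
ss_tot = np.sum((y - np.mean(y)) ** 2)
r2 = 1.0 - ss_res / ss_tot
print(rmse, r2)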


Evaluating the LR model
After fitting the line, what is the significance of your model?

Hypothesis testing for a regression model:
After fitting the line, there are two different (but related) kinds of hypothesis tests that we use on our regression model:
• those in which we test whether the regression model as a whole is performing significantly better than a null model (F-test); and
• those in which we test whether a particular regression coefficient is significantly different from zero (t-test).
Much more frequently, the reasonableness of the model is indicated by the data – a scatter plot exhibiting a substantial linear pattern.
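As an illustration of both tests, a minimal statsmodels sketch; the data are placeholder values, not data from the slides:

import numpy as np
import statsmodels.api as sm

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 5.0, 3.0, 6.0, 7.0])

# Fit Y = b0 + b1*X by OLS; add_constant supplies the intercept column.
results = sm.OLS(y, sm.add_constant(x)).fit()

print(results.fvalue, results.f_pvalue)   # F-test: whole model vs. null model
print(results.tvalues, results.pvalues)   # t-tests: each coefficient vs. zero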
Applications of regression

Sum of Squared Error – let's start with one powerline, then move on to two powerlines.

Two Powerlines problem:
In order to find the minimum point of a curve, you find the point where the slope of the tangent is 0. To find that, we take the derivative of the cost function and equate it to 0.
Linear Regression: Analytical Approach

Suppose three points sit at locations (1, 2), (2, 5), and (3, 3) on the x-y plane, and we draw an initial line y = mx + b to represent where to pass the powerline connection on the plane. The cost of each connection is the squared vertical distance from the point to the line.

Goal: Find m, b such that you minimize the sum of squares cost.

The line passes through (1, m + b), (2, 2m + b), and (3, 3m + b), so the three squared distances are (m + b − 2)², (2m + b − 5)², and (3m + b − 3)². The sum of squares cost is therefore

E(m, b) = (m + b − 2)² + (2m + b − 5)² + (3m + b − 3)²

Expanding each term:

(m + b − 2)² = m² + b² + 4 + 2mb − 4m − 4b
(2m + b − 5)² = 4m² + b² + 25 + 4mb − 20m − 10b
(3m + b − 3)² = 9m² + b² + 9 + 6mb − 18m − 6b

Adding them up:

E(m, b) = 14m² + 3b² + 38 + 12mb − 42m − 20b

To find the minimum point we look for the point where the slope of the tangent is 0, i.e. we take the partial derivatives of E and equate them to 0.

Quiz: Find the partial derivative of E with respect to m, then with respect to b.

∂E/∂m = 28m + 12b − 42 = 0
∂E/∂b = 6b + 12m − 20 = 0

Doubling the second equation gives 12b + 24m − 40 = 0, and subtracting that from the first leaves

4m − 2 = 0, so m = 2/4 = 0.5

Substituting m = 0.5 back into the second equation:

6b + 12(0.5) − 20 = 0
6b + 6 − 20 = 0
6b − 14 = 0
b = 14/6 = 7/3

Linear Regression: Optimal Solution

m = 1/2, b = 7/3, and the minimum cost is E(m = 1/2, b = 7/3) ≈ 4.167.
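As a quick cross-check, the same m and b fall out of a least-squares fit in numpy:

import numpy as np

# The three points from the powerline example.
x = np.array([1.0, 2.0, 3.0])
y = np.array([2.0, 5.0, 3.0])

# Degree-1 polyfit performs the same least-squares minimization; returns [m, b].
m, b = np.polyfit(x, y, 1)
print(m, b, np.sum((m * x + b - y) ** 2))  # 0.5, 2.333..., cost ≈ 4.167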
Linear Regression: Gradient Descent

Goal: Minimize the sum of squares cost

E(m, b) = 14m² + 3b² + 38 + 12mb − 42m − 20b

∂E/∂m = 28m + 12b − 42 = 0
∂E/∂b = 6b + 12m − 20 = 0

Solving this system by hand worked for three points, but when the analytical approach is impractical, gradient descent comes to the rescue: instead of solving the equations directly, it starts from an initial (m, b) and repeatedly steps opposite the gradient until the cost stops decreasing. A sketch follows below.
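A minimal gradient descent sketch for this specific cost function; the starting point, learning rate, and iteration count are arbitrary assumptions:

# Minimize E(m, b) = 14m² + 3b² + 38 + 12mb − 42m − 20b by gradient descent.
m, b = 0.0, 0.0   # arbitrary starting point
lr = 0.01         # assumed learning rate

for _ in range(10000):
    grad_m = 28 * m + 12 * b - 42   # ∂E/∂m
    grad_b = 12 * m + 6 * b - 20    # ∂E/∂b
    m -= lr * grad_m
    b -= lr * grad_b

print(m, b)  # approaches m = 0.5, b = 7/3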
REGRESSION ANALYSIS WAY OF SOLVING THE PROBLEM

Remember we started with the diagram of the line y = mx + b through the points (1, 2), (2, 5), (3, 3). Let's tabulate the points in the graph:

X   Y
1   2
2   5
3   3

Here we'll establish the relationship between the predictor X and the target outcome Y. We'd need to calculate as given in the next slide.
REGRESSION ANALYSIS (II): SOLVING USING THE REGRESSION COEFFICIENT b(yx) (for slope)

X      Y      X*Y    dx=X−x̅  dy=Y−y̅  (dx)²  (dy)²  dx*dy
1      2      2      −1       −1       1      1      1
2      5      10     0        2        0      4      0
3      3      9      1        0        1      0      0
Total: 6      10     21       0        1      2      5      1

Here n = no. of instances = 3
x̅ (mean of X) = total of X / n = 6/3 = 2
y̅ (mean of Y) = total of Y / n = 10/3 = 3⅓ (let's round it to 3 for calculating dy)

Now write the regression equation.
REGRESSION ANALYSIS USING REGRESSION COEFFICIENTS

N = 3; x̅ = 2; y̅ = 3.33

X      Y      X²     X*Y    dx=X−x̅  dy=Y−y̅  (dx)²  (dy)²  dx*dy
1      2      1      2      −1       −1       1      1      1
2      5      4      10     0        2        0      4      0
3      3      9      9      1        0        1      0      0
Total: 6      10     14     21       0        1      2      5      1

Regression equation of Y on X:

Y − y̅ = b(yx) (X − x̅), where b(yx) = (N ΣXY − ΣX ΣY) / (N ΣX² − (ΣX)²)

When we calculate b(yx) we get:

b(yx) = ((3 * 21) − (6)(10)) / ((3 * 14) − (6)²) = 3/6 = ½ = 0.5

Now substitute this value in the regression equation. We'll get: Y − 3.33 = 0.5 (X − 2)
REGRESSION ANALYSIS USING REGRESSION COEFFICIENTS

Regression equation: Y − 3.33 = 0.5 (X − 2)

On expanding it, we'd get:
Y − 3.33 = 0.5X − 1.0
Y = 0.5X − 1.0 + 3.33
Y = 0.5X + 2.33, which is in the form Y = m*X + b, where m = 0.5 and b = 2.33 ≈ 7/3.
This is what we arrived at previously with the analytical approach.
REGRESSION ANALYSIS (III): SOLVED USING COVARIANCE (for slope)

N = 3; x̅ = 2; y̅ = 3.33 (same table as before)

Regression equation: Y = m*X + b, where

m = Cov(X, Y) / Var(X) = Σ(dx * dy) / Σ(dx)² = 1/2 = 0.5

and the intercept is

b = mean(Y) − m * mean(X) = 3.33 − 0.5 * 2 = 2.33

We'll get: Y = 0.5X + 2.33
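The same arithmetic in numpy, using population (divide-by-N) covariance and variance to match the hand calculation:

import numpy as np

X = np.array([1.0, 2.0, 3.0])
Y = np.array([2.0, 5.0, 3.0])

# Slope m = Cov(X, Y) / Var(X); bias=True gives the population covariance.
m = np.cov(X, Y, bias=True)[0, 1] / np.var(X)
b = Y.mean() - m * X.mean()
print(m, b)  # 0.5 and 2.333... (7/3)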
References
1. https://vitalflux.com/linear-regression-hypothesis-testing-examples/
2. https://stats.libretexts.org/Bookshelves/Applied_Statistics/Learning_Statistics_with_R_-_A_tutorial_for_Psychology_Students_and_other_Beginners_(Navarro)/15%3A_Linear_Regression/15.05%3A_Hypothesis_Tests_for_Regression_Models
3. https://www.slideshare.net/AniruddhaDeshmukh2/regression-vs-anova-71033194
4. https://towardsdatascience.com/linear-regression-from-scratch-cd0dee067f72
5. http://www.kem.edu/wp-content/uploads/2012/06/10-Principles_of_regression_analysis-1.pdf
6. https://www.geeksforgeeks.org/types-of-regression-techniques/
7. DeepLearning.AI
