Linear-Regression ML
So how do we know which of these lines is the best fit line? That is the problem we will solve in this article. For this, we will first look at the cost function.
Simple Linear Regression
But how does linear regression find the best fit line?
• The goal of the linear regression algorithm is to find the best values for B0 and B1, which give the best fit line. The best fit line is the line with the least error, meaning the error between the predicted values and the actual values should be minimal.
• In regression, the difference between the observed value of the dependent variable (yi) and the predicted value (ypredicted) is called the residual (a short numeric sketch follows this list).
• εi = ypredicted – yi
• where ypredicted = B0 + B1·xi
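As a rough illustration, the sketch below computes predictions and residuals for one candidate line; the data values and the coefficients B0 and B1 are made up purely for illustration.

```python
# Minimal sketch: residuals of a simple linear model ypredicted = B0 + B1 * x.
# The data and the coefficients B0, B1 are illustrative, not from the original material.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])   # feature values
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])   # observed target values (yi)

B0, B1 = 0.2, 1.95                         # one candidate intercept and slope

y_predicted = B0 + B1 * x                  # predictions of the candidate line
residuals = y_predicted - y                # epsilon_i = ypredicted - yi

print("predictions:", y_predicted)
print("residuals:  ", residuals)
```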
Cost Function for Linear Regression
• The cost function is used to work out the optimal values of B0 and B1, which give the best fit line for the data points.
• In linear regression, the Mean Squared Error (MSE) cost function is generally used; it is the average of the squared errors between ypredicted and yi (a small numeric sketch follows this list).
• J(B0, B1) = (1/n) Σi (ypredicted,i – yi)²
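A minimal sketch of the MSE cost on the same made-up data as above; the function name mse_cost and the candidate coefficients are illustrative only.

```python
# Minimal sketch of the MSE cost J(B0, B1) = (1/n) * sum((ypredicted - y)^2).
import numpy as np

def mse_cost(B0, B1, x, y):
    """Average squared error between the line B0 + B1*x and the observed y."""
    y_predicted = B0 + B1 * x
    return np.mean((y_predicted - y) ** 2)

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# A line close to the data has a much lower cost than a poor one.
print(mse_cost(0.2, 1.95, x, y))   # good candidate line -> small MSE
print(mse_cost(5.0, -1.0, x, y))   # poor candidate line -> large MSE
```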
[Figure: the MSE cost surface plotted over the parameters B0 and B1]
Gradient Descent
• Convex vs. non-convex cost functions: the MSE cost used in linear regression is convex in B0 and B1, so gradient descent can reach the global minimum without getting stuck in local minima.
Then gradient descent needs two things:
1. Which direction to go (direction of the update), given by the gradient of the cost with respect to B0 and B1.
2. How big a step to take (amount of the update), controlled by the learning rate.
Both pieces appear in the sketch below.
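A minimal sketch of batch gradient descent on the MSE cost, reusing the same toy data; the learning rate and the number of iterations are arbitrary choices made for illustration.

```python
# Minimal sketch: batch gradient descent on the MSE cost for B0 and B1.
# Direction of the update = negative gradient; amount of the update = learning rate.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

B0, B1 = 0.0, 0.0          # start from an arbitrary point
lr = 0.02                   # learning rate (step size), chosen by hand
n = len(x)

for step in range(2000):
    y_predicted = B0 + B1 * x
    error = y_predicted - y
    # Gradients of J = (1/n) * sum(error^2) with respect to B0 and B1.
    dB0 = (2.0 / n) * np.sum(error)
    dB1 = (2.0 / n) * np.sum(error * x)
    # Move against the gradient (direction), scaled by the learning rate (amount).
    B0 -= lr * dB0
    B1 -= lr * dB1

print("B0 ≈", round(B0, 3), "B1 ≈", round(B1, 3))
```

If the learning rate is too large the updates overshoot and the cost can diverge; if it is too small, many more iterations are needed to converge.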
Multiple Linear Regression
• With several input features, the model extends to y = B0 + B1·x1 + B2·x2 + B3·x3 + B4·x4 (a fitting sketch follows below).
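A minimal fitting sketch, assuming scikit-learn is available; the feature matrix, noise level, and "true" coefficients are made up purely to illustrate multiple linear regression.

```python
# Minimal sketch: fitting y = B0 + B1*x1 + B2*x2 + B3*x3 + B4*x4
# with scikit-learn's LinearRegression on made-up data.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))                     # 50 samples, 4 features x1..x4
true_coefs = np.array([1.5, -2.0, 0.7, 3.0])     # assumed "true" B1..B4
y = 4.0 + X @ true_coefs + rng.normal(scale=0.1, size=50)  # B0 = 4.0 plus noise

model = LinearRegression().fit(X, y)
print("B0 (intercept):", model.intercept_)
print("B1..B4 (coefficients):", model.coef_)
```

Because the added noise is small, the recovered intercept and coefficients should land close to the assumed B0 = 4.0 and B1..B4 above.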
• Jobs we lose due to ML: https://www.youtube.com/watch?v=gWmRkYsLzB4&list=PLobzMSC-raKifQd9vHHPkMam_jrQEyzCX&index=7
• How AI can enhance our memory, work and social lives: https://youtu.be/DJMhz7JlPvA