Linear Regression & Logistic Regression
Linear regression
y = mx + c + ε
• y = dependent variable (target variable)
• x = independent variable (predictor variable)
• c = y-intercept of the line (constant)
• m = slope
• ε = error term
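As a minimal sketch of this model, the code below generates synthetic data from the line y = mx + c plus Gaussian noise; the particular values of m, c, and the noise scale are illustrative assumptions, not from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters for illustration: slope m, intercept c
m, c = 2.0, 1.0

x = np.linspace(0.0, 10.0, 50)            # independent (predictor) variable
eps = rng.normal(0.0, 0.5, size=x.shape)  # error term ε
y = m * x + c + eps                       # dependent (target) variable
```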
Linear regression: three cases (figure omitted)
Error Functions
Least Square Error
• The Least-Squares Error (LSE) criterion
defines the curve that best fits a given
set of observations as the one having the
minimum sum of squared residuals
(or deviations, or errors) from the given
data points.
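Using the slide's notation (m for slope, c for intercept, ε for the error), the least-squares objective can be written as:

```latex
\mathrm{LSE}(m, c) \;=\; \sum_{i=1}^{n} \varepsilon_i^{2}
\;=\; \sum_{i=1}^{n} \bigl( y_i - (m x_i + c) \bigr)^{2}
```

Fitting the line means choosing m and c that minimize this sum.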
Mean Square Error
• The Mean Squared Error (MSE) measures how
close a regression line is to a set of
data points.
• It is the risk function corresponding to the
expected value of the squared error loss.
• It is calculated by averaging the squared
errors between the observed data and the
values predicted by the function.
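A short sketch of the MSE computation described above; the function name and the toy inputs are illustrative, not from the slides.

```python
import numpy as np

def mean_squared_error(y_true, y_pred):
    """Average of the squared residuals between targets and predictions."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return np.mean((y_true - y_pred) ** 2)

# Toy check: residuals 1, -1, 0 -> squared 1, 1, 0 -> mean 2/3
print(mean_squared_error([3, 5, 7], [2, 6, 7]))
```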
Linear Regression: Example
(input data figure omitted)
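Since the slide's input data is not recoverable, the following is a hypothetical worked example: it fits m and c by the closed-form least-squares formulas and reports the resulting MSE. The data values are made up for illustration.

```python
import numpy as np

# Hypothetical input data (e.g., hours studied vs. exam score)
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 4.3, 6.2, 7.9, 10.1])

# Closed-form least-squares estimates of slope m and intercept c
m = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
c = y.mean() - m * x.mean()

# Fitted values and their mean squared error
y_hat = m * x + c
mse = np.mean((y - y_hat) ** 2)
print(f"m={m:.3f}, c={c:.3f}, MSE={mse:.4f}")
```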