Linear Regression & Logistic Regression

Error-based learning involves initializing a prediction model with random parameters and iteratively adjusting them, guided by an error function, to improve accuracy. Linear regression, multiple linear regression, and logistic regression are key algorithms in this approach, each suited to a different type of predictive analysis. Error functions such as the least-squares error and the mean squared error are used to assess the model's performance.


Error based Learning

• A parameterised prediction model is initialized with a set of random parameters.
• An error function is used to judge how well the initial model performs, i.e. how good its predictions are on the training dataset.
• Based on the value of the error function, the parameters are iteratively adjusted to produce a more and more accurate model.
Error based Learning
• Linear regression
• Multiple linear regression
• Polynomial regression
• Logistic regression
• SVM
• Neural Networks
Linear Regression
In machine learning,
• Linear regression is a supervised machine learning algorithm.
• It tries to find the best linear relationship that describes the data we have.
• It assumes that there exists a linear relationship between a dependent variable and the independent variable(s).
• The dependent variable of a linear regression model takes continuous values, i.e. real numbers.
Linear regression
• Linear regression is one of the easiest and most popular machine learning algorithms.
• It is a statistical method used for predictive analysis.
• Linear regression makes predictions for continuous/real or numeric variables such as sales, salary, age, product price, etc.
• The linear regression algorithm models a linear relationship between a dependent variable (y) and one or more independent variables (x).
Linear regression
y = mx + c + ε
• y = dependent variable (target variable)
• x = independent variable (predictor variable)
• c = y-intercept of the line (constant)
• m = slope
• ε = error term
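A minimal sketch of fitting this model, assuming a small made-up dataset (the x and y values below are illustrative, not the slides' example data), using NumPy and the closed-form least-squares estimates of the slope and intercept:

# Minimal sketch: fitting y = m*x + c by ordinary least squares.
# The data below is invented for illustration only.
import numpy as np

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

# Closed-form least-squares estimates:
#   m = sum((x - x_mean) * (y - y_mean)) / sum((x - x_mean)**2)
#   c = y_mean - m * x_mean
x_mean, y_mean = x.mean(), y.mean()
m = np.sum((x - x_mean) * (y - y_mean)) / np.sum((x - x_mean) ** 2)
c = y_mean - m * x_mean

# Predict the dependent variable for a new value of x.
x_new = 6.0
y_pred = m * x_new + c
print(f"m = {m:.3f}, c = {c:.3f}, prediction at x = 6: {y_pred:.3f}")

For a single predictor, these closed-form estimates are exactly what minimising the least-squares error (introduced on the next slides) produces.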
Linear regression
Three cases
Error Functions
• The least-squares error (LSE) criterion states that the curve that best fits a given set of observations is the one having the minimum sum of squared residuals (or deviations, or errors) from the given data points.
Least Square Error
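In standard notation, for the line y = mx + c fitted to n data points, the least-squares objective being minimised can be written as:

\mathrm{SSE} = \sum_{i=1}^{n} \bigl( y_i - (m x_i + c) \bigr)^2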
Mean Square Error
• The mean squared error (MSE) measures how close a regression line is to a set of data points.
• It is a risk function corresponding to the expected value of the squared error loss.
• It is calculated by taking the average, i.e. the mean, of the squared errors between the data points and the values predicted by the fitted function.
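With \hat{y}_i denoting the value predicted by the fitted function for the i-th data point, the mean squared error over n points is commonly written as:

\mathrm{MSE} = \frac{1}{n} \sum_{i=1}^{n} \bigl( y_i - \hat{y}_i \bigr)^2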
Linear Regression: Example
Given the input data (table not reproduced here): if x = 6, what is the value of y?
Linear Regression: R Square Value
The worked example gives 0.30769, i.e. about 30.76% error.
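For reference, the usual definition of the R-square value, with \bar{y} the mean of the observed values and \hat{y}_i the fitted values, is:

R^2 = 1 - \frac{\sum_{i=1}^{n} (y_i - \hat{y}_i)^2}{\sum_{i=1}^{n} (y_i - \bar{y})^2}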


Multiple Linear Regression (MLR)
• Multiple linear regression is an extension of simple linear regression: it takes more than one predictor variable to predict the response variable.
• It is one of the important regression algorithms; it models the linear relationship between a single continuous dependent variable and more than one independent variable (see the equation below).
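In standard notation, with p predictor variables and coefficients \beta_0, \beta_1, \dots, \beta_p (symbols assumed here, not taken from the slides), the MLR model is:

y = \beta_0 + \beta_1 x_1 + \beta_2 x_2 + \dots + \beta_p x_p + \varepsilon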
Multiple Linear Regression (MLR)
• For MLR, the dependent or target variable (Y) must be continuous/real, but the predictor or independent variables may be continuous or categorical.
• Each feature variable must have a linear relationship with the dependent variable.
• MLR tries to fit a regression line (a hyperplane) through a multidimensional space of data points, as sketched in the code below.
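A minimal sketch of MLR with two predictors, assuming a small invented dataset and using NumPy's least-squares solver (np.linalg.lstsq); it illustrates the idea only and is not the slides' worked example:

# Minimal sketch of multiple linear regression with two predictor
# variables. The data is invented purely for illustration.
import numpy as np

# Design matrix: a column of ones (for the intercept) plus two predictors.
X = np.array([
    [1.0, 2.0, 50.0],
    [1.0, 3.0, 60.0],
    [1.0, 4.0, 65.0],
    [1.0, 5.0, 80.0],
    [1.0, 6.0, 90.0],
])
y = np.array([12.0, 15.0, 17.0, 21.0, 24.0])

# Solve for [b0, b1, b2] minimising ||X @ beta - y||^2.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
b0, b1, b2 = beta

# Predict the response for a new observation (x1 = 4.5, x2 = 70).
y_hat = b0 + b1 * 4.5 + b2 * 70.0
print(f"coefficients: {beta}, prediction: {y_hat:.2f}")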
Logistic Regression
• It is a supervised machine learning algorithm.
• It is used to solve binary (two-class) classification problems.
• It uses the sigmoid function (an S-shaped curve), given below.
• It outputs a probability that an observation belongs to class 1 rather than class 0.
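In the usual notation, with weights w and bias b as the model parameters (symbols assumed here), the sigmoid function and the resulting class-1 probability are:

\sigma(z) = \frac{1}{1 + e^{-z}}, \qquad P(y = 1 \mid x) = \sigma(w^{\top} x + b)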
Types of Logistic Regression
• Binomial: there can be only two possible types of the dependent variable, such as 0 or 1, or Pass or Fail (a minimal fitting sketch for this case follows the list).
• Multinomial: there can be 3 or more possible unordered types of the dependent variable, such as "cat", "dog", or "sheep".
• Ordinal: there can be 3 or more possible ordered types of the dependent variable, such as "low", "medium", or "high".
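A minimal sketch of the binomial case, assuming an invented hours-studied → pass/fail dataset and training the sigmoid model with plain gradient descent on the log-loss (an illustration only, not a production implementation):

# Minimal sketch of binomial (binary) logistic regression trained with
# gradient descent on the model p = sigmoid(w*x + b).
# The tiny dataset (hours studied -> pass/fail) is invented for illustration.
import numpy as np

x = np.array([0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0])
y = np.array([0,   0,   0,   0,   1,   1,   1,   1  ])  # 0 = fail, 1 = pass

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

w, b, lr = 0.0, 0.0, 0.1
for _ in range(5000):
    p = sigmoid(w * x + b)          # predicted probability of class 1
    grad_w = np.mean((p - y) * x)   # gradient of the log-loss w.r.t. w
    grad_b = np.mean(p - y)         # gradient of the log-loss w.r.t. b
    w -= lr * grad_w
    b -= lr * grad_b

# Probability that a student who studied 2.2 hours passes:
print(f"P(pass | x = 2.2) = {sigmoid(w * 2.2 + b):.3f}")

In practice a library routine such as scikit-learn's LogisticRegression would be used; the explicit loop above simply makes the error-based parameter updates described at the start of this deck visible.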
