Regression

This document provides an overview of linear regression for predicting continuous numeric values. It discusses the learning and prediction processes for linear regression models. The learning process finds the best-fitting linear equation to minimize error between predicted and actual values. This linear equation, with weights and an intercept, is the regression model. The prediction process uses a new data point to predict the output value. Model performance is evaluated using the R-squared score, which indicates how close the predictions are to the actual data.


Artificial Intelligence

Chapter : Regression
Marouane Ben Haj Ayech
Outline
• Presentation
• Linear Regression
• Learning process
• Prediction process
• Evaluation

Presentation
• Prediction
Prediction task: Regression
Description: predicting a continuous numeric value or quantity based on input features, typically used for predicting numerical outcomes.
Output nature: continuous numeric values
Examples:
- House price prediction based on features like size, location, and age.
- Temperature prediction based on historical data.

• Learning
Learning type: Supervised
Dataset type: Labeled
Prediction tasks: Regression
Learning models: Linear Regression, Polynomial Regression
Presentation
Regression problem
• Input: x = house = (Surface, nb rooms); output: y = price value ∈ ℝ
• Learning process: a labeled training dataset of houses, each described by (Surface, nb rooms) and labeled with its Price, is used to build the model.
• Prediction process: the model predicts the price of a new house, e.g. new house = (150, 3), as illustrated in the sketch below.
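As a concrete illustration of these two processes, here is a minimal sketch using scikit-learn's LinearRegression; the library choice and all numeric values are assumptions for illustration, the slides do not specify them:

```python
# Minimal sketch of the learning and prediction processes for the house-price example.
# Assumption: scikit-learn is used; the training data below is hypothetical.
from sklearn.linear_model import LinearRegression

# Labeled training dataset: each row is a house (Surface, nb rooms), each label is its price.
X_train = [[100, 2], [120, 3], [80, 2], [200, 4]]   # hypothetical features
y_train = [200_000, 250_000, 160_000, 400_000]      # hypothetical prices

# Learning process: find the best-fitting linear equation.
model = LinearRegression()
model.fit(X_train, y_train)

# Prediction process: predict the price of a new house (Surface=150, nb rooms=3).
new_house = [[150, 3]]
predicted_price = model.predict(new_house)
print(predicted_price)
```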
Linear Regression
Technique: Linear Regression Model
Learning process: the algorithm finds the best-fitting linear equation (a straight line in simple linear regression) that minimizes the error between predicted and actual values. There are 2 parameters:
- coefficients (weights): a vector denoted w = (w1, w2, w3, …), one coefficient for each feature
- intercept: denoted b
The model is the linear equation: y = w1 * x1 + w2 * x2 + … + b
Prediction process: given a new data point x = (x1, x2, x3, …), the output value y is predicted as y = w1 * x1 + w2 * x2 + … + b (see the sketch below).
Hyperparameters: the basic linear regression model has no crucial hyperparameters.
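The prediction formula maps directly to a dot product between the learned weight vector and the new data point. A minimal sketch follows, where the weight and intercept values are hypothetical stand-ins for parameters learned during training:

```python
# Minimal sketch of the prediction formula y = w1*x1 + w2*x2 + ... + b.
# Assumption: w and b were already learned; the numeric values are hypothetical.
import numpy as np

w = np.array([1500.0, 20000.0])   # one coefficient per feature (hypothetical)
b = 10_000.0                      # intercept (hypothetical)

x = np.array([150, 3])            # new data point (Surface, nb rooms)
y = np.dot(w, x) + b              # y = w1*x1 + w2*x2 + b
print(y)
```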
Evaluation
• The R-squared (R²) score is called the coefficient of determination.
• It is a metric used to evaluate the goodness of fit of a regression model.
• R-squared is a value between 0 and 1:
  • If it is close to 1, it means that the model's predictions closely match the actual data.
  • If it is close to 0, it means that the model is a poor fit for the data.
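R² compares the residual sum of squares to the total sum of squares: R² = 1 - SS_res / SS_tot. Below is a minimal sketch of this computation, checked against scikit-learn's r2_score; the actual and predicted values are hypothetical:

```python
# Minimal sketch of the R-squared (coefficient of determination) computation.
# Assumption: the actual and predicted values below are hypothetical.
import numpy as np
from sklearn.metrics import r2_score

y_true = np.array([200_000, 250_000, 160_000, 400_000])
y_pred = np.array([210_000, 240_000, 170_000, 390_000])

ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squares
ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total sum of squares
r2 = 1 - ss_res / ss_tot

print(r2)                        # manual computation
print(r2_score(y_true, y_pred))  # same value from scikit-learn
```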
