Mathematical explanation for Linear Regression working
Last Updated : 18 Oct, 2022

Suppose we are given a dataset: a company's salary vs. work-experience records, and the task is to predict an employee's salary from their years of work experience. This article explains what actually happens mathematically when a Linear Regression model is trained, i.e. when the pre-defined hypothesis function is fitted to the data and then used for prediction. Let us walk through what happens as the Linear Regression algorithm gets trained.

The hypothesis for simple linear regression is

    hθ(x) = θ0 + θ1 * x

where x is the work experience (input feature) and hθ(x) is the predicted salary.

Iteration 1 -
At the start, the θ0 and θ1 values are chosen randomly. Let us suppose θ0 = 0 and θ1 = 0. Plugging these values into the hypothesis gives the predicted values after iteration 1.

Cost Function - Error
The error is measured with the mean squared error cost function

    J(θ0, θ1) = (1 / 2m) * Σ ( hθ(x(i)) - y(i) )²

where m is the number of training examples.

Gradient Descent - Updating θ0 value
Here, j = 0:

    θ0 := θ0 - α * (1 / m) * Σ ( hθ(x(i)) - y(i) )

Gradient Descent - Updating θ1 value
Here, j = 1:

    θ1 := θ1 - α * (1 / m) * Σ ( hθ(x(i)) - y(i) ) * x(i)

where α is the learning rate.

Iteration 2 -
After the first update, θ0 = 0.005 and θ1 = 0.02657. Plugging these values into the hypothesis gives the predicted values after iteration 2.

Now, just as in iteration 1, we again calculate the cost function and update the θj values using gradient descent. We keep iterating until the cost function no longer decreases. At that point the model has reached its best θ values, and using these θ values in the hypothesis gives the best prediction results. A small Python sketch of this training loop is given below.
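The following sketch shows the same loop in code: start with θ0 = θ1 = 0, compute the cost, and repeatedly apply the two gradient-descent updates until the cost stops decreasing. The toy experience/salary values, the learning rate alpha = 0.01, and the stopping threshold are assumptions for illustration only; the article's actual dataset and figures are not reproduced here.

import numpy as np

# Hypothetical toy data (assumed for illustration):
# x = years of work experience, y = salary (arbitrary units).
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([1.5, 2.0, 3.5, 4.0, 5.5])

m = len(x)                  # number of training examples
theta0, theta1 = 0.0, 0.0   # iteration 1: both parameters start at 0
alpha = 0.01                # learning rate (assumed value)

def predict(xs):
    # Linear regression hypothesis: h(x) = theta0 + theta1 * x
    return theta0 + theta1 * xs

def cost():
    # Mean squared error: J = (1 / 2m) * sum((h(x_i) - y_i)^2)
    return np.sum((predict(x) - y) ** 2) / (2 * m)

prev_cost = float("inf")
for iteration in range(1, 10001):
    error = predict(x) - y             # h(x_i) - y_i for every example
    grad0 = np.sum(error) / m          # partial derivative w.r.t. theta0 (j = 0)
    grad1 = np.sum(error * x) / m      # partial derivative w.r.t. theta1 (j = 1)

    # Simultaneous gradient-descent update of theta0 and theta1
    theta0 -= alpha * grad0
    theta1 -= alpha * grad1

    current_cost = cost()
    # Stop once the cost no longer decreases meaningfully
    if prev_cost - current_cost < 1e-9:
        break
    prev_cost = current_cost

print(f"Stopped after {iteration} iterations: theta0={theta0:.4f}, theta1={theta1:.4f}")

Note that both gradients are computed from the current θ values before either parameter is changed, which is the simultaneous update that gradient descent requires.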