
Heaven’s Light is Our Guide

Rajshahi University of Engineering & Technology

Department of Mechatronics Engineering (MTE)

Course No: CSE 4288


Course Name: Artificial Intelligence Sessional

LAB REPORT

Submitted By:
Dibya Joy Paul
Roll: 1708057
Date of Exp.: 15/03/23
Date of Sub.: 10/05/23

Submitted To:
Md. Mehedi Hasan
Lecturer
Dept. of Mechatronics Engineering, RUET

Md. Hafiz Ahamed
Lecturer
Dept. of Mechatronics Engineering, RUET
Experiment No: 3
Experiment Name: Study of Backpropagation Algorithm.
Objective:
1) To know about the Backpropagation Algorithm.
2) To know how to build a Backpropagation Algorithm in Python.
3) To know about the importance of the Backpropagation Algorithm in Deep Learning.
Theory: For artificial neural networks, the backpropagation algorithm is a supervised learning procedure that adjusts the weights of the network to improve the model's accuracy. The cost function is minimized using gradient descent, which reduces the mean-squared error between the expected and actual outputs. This procedure is typically used when training feed-forward neural networks on a dataset whose classifications are already known. It is called backward propagation because the errors are propagated backward through the network to improve accuracy. Whenever the prediction obtained from a neural network model differs significantly from the actual output, the backpropagation procedure is applied so that the model achieves better accuracy, as illustrated in the short sketch below.
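To make the cost function and the gradient-descent update concrete, the following minimal sketch computes the mean-squared error and applies one weight-update step. The variable names, example values, and learning rate are illustrative assumptions made for this sketch, not values taken from the report.

import numpy as np

# Mean-squared error between the expected (target) and actual (predicted) outputs
def mse(target, predicted):
    return np.mean((target - predicted) ** 2)

# One gradient-descent step: move each weight against the gradient of the cost
def gradient_descent_step(weights, gradient, learning_rate=0.1):
    return weights - learning_rate * gradient

# Illustrative numbers (assumed, not from the report)
target = np.array([1.0, 0.0])
predicted = np.array([0.8, 0.3])
print("MSE:", mse(target, predicted))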
There are mainly three layers in a backpropagation model, i.e., the input layer, the hidden layer, and the output layer. The main steps of the algorithm are as follows (a fuller Python sketch is given after this list):
• Step 1: The input layer receives the input.
• Step 2: The input is weighted and passed forward through the hidden layers.
• Step 3: Each hidden layer processes the signal and the output layer produces a result; the difference between this actual output and the desired output is referred to as the "error".
• Step 4: The algorithm then moves backward through the hidden layers to optimize the weights and reduce the error.
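A minimal Python sketch of these four steps is given below. It assumes a single hidden layer, sigmoid activations, a mean-squared error cost, and the XOR problem as illustrative training data; the layer sizes, learning rate, and variable names are assumptions made for this example and are not prescribed by the report.

import numpy as np

# Sigmoid activation and its derivative (written in terms of the activation value)
def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sigmoid_derivative(a):
    return a * (1.0 - a)

# Illustrative training data: the XOR problem (assumed for this sketch)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))   # input -> hidden weights
b1 = np.zeros((1, 4))          # hidden-layer biases
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))          # output-layer bias
lr = 0.5                       # learning rate (assumed value)

for epoch in range(10000):
    # Steps 1-2: forward pass, the weighted input flows through the hidden layer
    hidden = sigmoid(X @ W1 + b1)
    output = sigmoid(hidden @ W2 + b2)

    # Step 3: the error is the difference between the desired and actual output
    error = y - output

    # Step 4: propagate the error backward and update the weights by gradient descent
    delta_output = error * sigmoid_derivative(output)
    delta_hidden = (delta_output @ W2.T) * sigmoid_derivative(hidden)

    W2 += lr * hidden.T @ delta_output
    b2 += lr * delta_output.sum(axis=0, keepdims=True)
    W1 += lr * X.T @ delta_hidden
    b1 += lr * delta_hidden.sum(axis=0, keepdims=True)

print("Predictions after training:")
print(output.round(3))

Running the sketch shows the error decreasing as the weights are repeatedly corrected by the backward pass, which is the essence of the algorithm described above.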
