Backpropagation in Neural Network - GeeksforGeeks
Table of Contents
What is Backpropagation?
Working of Backpropagation Algorithm
Example of Backpropagation in Machine Learning
Backpropagation Implementation in Python for XOR Problem
Advantages of Backpropagation for Neural Network Training
Challenges with Backpropagation
What is Backpropagation?
Backpropagation is a powerful algorithm in deep learning, primarily
used to train artificial neural networks, particularly feed-forward
networks. It works by propagating the error at the output backward
through the network and adjusting the weights and biases to reduce it.
In the forward pass, the input data is fed into the input layer. These
inputs, combined with their respective weights, are passed to hidden
layers.
For example, in a network with two hidden layers (h1 and h2 as shown
in Fig. (a)), the output from h1 serves as the input to h2. Before applying
an activation function, a bias is added to the weighted inputs.
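The forward pass described above can be sketched in a few lines of NumPy. Everything here is an illustrative assumption rather than taken from the article's figures: the 2-2-1 layout, the input values, and the zero biases are chosen only to make the sketch concrete.

```python
import numpy as np

def forward_layer(x, W, b):
    """One layer's forward step: sigmoid of the bias-shifted weighted sum."""
    a = W @ x + b                      # weighted inputs plus bias
    return 1.0 / (1.0 + np.exp(-a))    # sigmoid activation

# Hypothetical 2-2-1 network: two inputs, two hidden nodes (h1, h2), one output.
x = np.array([0.35, 0.7])              # assumed input values
W1 = np.array([[0.2, 0.2],             # weights into h1
               [0.3, 0.3]])            # weights into h2
b1 = np.zeros(2)                       # biases (zero here for simplicity)
hidden = forward_layer(x, W1, b1)      # hidden-layer outputs, fed to the next layer
```

The `hidden` vector would then be passed through a second weight matrix in exactly the same way to produce the network output.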
In the backward pass, the error (the difference between the predicted
and actual output) is propagated back through the network to adjust
the weights and biases. One common method for error calculation is the
Mean Squared Error (MSE), given by:

MSE = (1/n) ∑ (ytarget − ypredicted)²
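As a quick sketch, MSE averages the squared differences between target and predicted outputs. The numbers below reuse the 0.5 target and 0.67 prediction from the worked example in this article:

```python
def mse(targets, predictions):
    """Mean squared error over a list of outputs."""
    n = len(targets)
    return sum((t - p) ** 2 for t, p in zip(targets, predictions)) / n

error = mse([0.5], [0.67])   # single-output case from the worked example
```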
Forward Propagation
1. Initial Calculation
The weighted sum at each node is calculated using:
aj = ∑i (wi,j × xi)

Where,

aj is the weighted sum of all the inputs and weights at node j,
wi,j represents the weight on the connection from the ith input to the jth node,
xi represents the value of the ith input.
2. Sigmoid Function
The sigmoid function returns a value between 0 and 1, introducing non-
linearity into the model.
yj = 1 / (1 + e^(−aj))
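The sigmoid translates directly into code. The derivative y(1 − y) is included here as well, since it reappears in the backward-pass error terms later in the article; this is a minimal sketch:

```python
import math

def sigmoid(a):
    """Squash the weighted sum a into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-a))

def sigmoid_derivative(y):
    """Derivative of the sigmoid, written in terms of its output y."""
    return y * (1.0 - y)

y = sigmoid(0.0)   # midpoint: sigmoid(0) = 0.5
```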
3. Computing Outputs
At the h1 node:

a1 = (w1,1 × x1) + (w2,1 × x2) = 0.21

y3 = F(a1) = 1 / (1 + e^(−0.21)) = 0.56

At the h2 node, the weighted sum is 0.315:

y4 = F(0.315) = 1 / (1 + e^(−0.315)) ≈ 0.58

At the output node, the weighted sum is 0.702:

y5 = F(0.702) = 1 / (1 + e^(−0.702)) = 0.67
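Recomputing the example's node outputs directly is a good sanity check; expect small differences from the article's two-decimal figures, which are rounded:

```python
import math

def F(a):
    """Sigmoid activation used at every node."""
    return 1.0 / (1.0 + math.exp(-a))

y3 = F(0.21)    # h1 output
y4 = F(0.315)   # h2 output
y5 = F(0.702)   # network output
```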
4. Error Calculation
Note that our target output is 0.5, but we obtained 0.67.

Errorj = ytarget − y5 = 0.5 − 0.67 = −0.17
Backpropagation
1. Calculating Gradients
The change in each weight is calculated as:

Δwi,j = η × δj × Oi

Where:

η is the learning rate,
δj is the error term for node j,
Oi is the output of node i at the input end of the connection.

For the output node:

δ5 = y5 (1 − y5)(ytarget − y5)
For h1:

δ3 = y3 (1 − y3)(w1,3 × δ5)

For h2:

δ4 = y4 (1 − y4)(w2,3 × δ5)
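The delta formulas can be written directly as small functions. The numeric check below plugs in the example's y5 = 0.67 and the 0.5 target:

```python
def delta_output(y, target):
    """Error term for a sigmoid output node: y(1 - y)(target - y)."""
    return y * (1.0 - y) * (target - y)

def delta_hidden(y, w_to_output, delta_out):
    """Error term for a hidden node, scaled by its outgoing weight."""
    return y * (1.0 - y) * (w_to_output * delta_out)

d5 = delta_output(0.67, 0.5)   # negative: the output must decrease toward 0.5
```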
4. Weight Updates
For the weights from the hidden layer to the output layer, each weight is
updated by adding its computed change:

w(new) = w(old) + Δw

For example:

w1,3 (new) = 0.08567
Repeating the forward pass with the updated weights gives:

y3 = 0.57
y4 = 0.56
y5 = 0.61

Since y5 = 0.61 is still not the target output of 0.5, the process of calculating
the error and updating the weights is repeated until the output converges to
the target.
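The iterate-until-converged loop can be sketched as below. The random initialization, the learning rate, and the single training pair (inputs 0.35 and 0.7, target 0.5) are illustrative assumptions, and biases are omitted for brevity:

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(a):
    return 1.0 / (1.0 + np.exp(-a))

x = np.array([0.35, 0.7])          # assumed inputs
target = 0.5                       # desired output
W1 = rng.normal(0.0, 0.5, (2, 2))  # input -> hidden weights
W2 = rng.normal(0.0, 0.5, (1, 2))  # hidden -> output weights
eta = 1.0                          # learning rate

for _ in range(1000):
    # Forward pass
    h = sigmoid(W1 @ x)            # hidden activations
    y = sigmoid(W2 @ h)[0]         # network output (scalar)
    # Backward pass: deltas as in the formulas above
    d_out = y * (1 - y) * (target - y)
    d_hid = h * (1 - h) * (W2[0] * d_out)
    # Weight updates: w(new) = w(old) + eta * delta * upstream output
    W2 += eta * d_out * h          # (1, 2) broadcast against (2,)
    W1 += eta * np.outer(d_hid, x)
```

Printing `y` inside the loop shows it drifting toward the 0.5 target, much like the 0.67 → 0.61 sequence in the text.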