CS601 - Machine Learning - Unit 2 - Back-Propagation

Back propagation is a technique for training neural networks that calculates the gradient of the error at each layer and propagates the error backwards through the network to adjust the weights. It computes the gradient of the error function for the final layer first, then for earlier layers, reusing partial results so that each layer's gradient is obtained efficiently rather than calculated from scratch.


Introduction to Back Propagation Network

• Back propagation is a supervised learning technique for neural networks that calculates the gradient of the error function with respect to the network's weights, which gradient descent then uses to adjust them.
• The name is short for "backward propagation of errors", since the error is computed at the output and distributed backwards through the network's layers.
• When the network produces an error, the algorithm calculates the gradient of the error function with respect to each of the network's weights.
• The gradient for the final layer of weights is calculated first, and the first layer's weight gradient is calculated last; partial calculations of the gradient from one layer are reused to determine the gradient for the previous layer.
• The point of this backwards method is to calculate the gradient at each layer more efficiently than the naive approach of computing every layer's gradient independently; the chain-rule sketch after this list makes the reuse explicit.
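
To make the reuse concrete, here is the standard chain-rule form of these gradients for a fully connected network with pre-activations z^(l) = W^(l) a^(l-1) + b^(l), activations a^(l) = σ(z^(l)), output layer L, and error E. The notation is the conventional textbook one, assumed here rather than taken from the slides:

\[ \delta^{(L)} = \nabla_{a^{(L)}} E \odot \sigma'\!\left(z^{(L)}\right) \]
\[ \delta^{(l)} = \left( \left(W^{(l+1)}\right)^{\top} \delta^{(l+1)} \right) \odot \sigma'\!\left(z^{(l)}\right) \]
\[ \frac{\partial E}{\partial W^{(l)}} = \delta^{(l)} \left(a^{(l-1)}\right)^{\top}, \qquad \frac{\partial E}{\partial b^{(l)}} = \delta^{(l)} \]

Each δ^(l) is the partial calculation that gets reused: computing it for layer l requires only δ^(l+1), which is why the final layer's gradient comes first and the first layer's comes last.
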
Back Propagation Network Algorithm
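
A minimal sketch of one training step of this algorithm, assuming a single hidden layer, sigmoid activations, and squared-error loss. The network shape, the learning rate, and every name below are illustrative assumptions, not definitions from the course material:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_step(x, t, W1, b1, W2, b2, lr=0.5):
    # Forward pass: compute and cache each layer's activations.
    z1 = W1 @ x + b1
    a1 = sigmoid(z1)
    z2 = W2 @ a1 + b2
    a2 = sigmoid(z2)

    # Squared error at the output: E = 0.5 * ||a2 - t||^2.
    error = 0.5 * np.sum((a2 - t) ** 2)

    # Backward pass: the output layer's delta is computed first...
    delta2 = (a2 - t) * a2 * (1.0 - a2)            # dE/dz2
    # ...and reused to get the hidden layer's delta (chain rule).
    delta1 = (W2.T @ delta2) * a1 * (1.0 - a1)     # dE/dz1

    # Gradient-descent updates, one per weight matrix and bias.
    W2 -= lr * (delta2 @ a1.T)
    b2 -= lr * delta2
    W1 -= lr * (delta1 @ x.T)
    b1 -= lr * delta1
    return W1, b1, W2, b2, error

The backward pass mirrors the equations above: delta2 is computed from the output error alone, and delta1 reuses delta2 instead of re-deriving the hidden layer's gradient from scratch.
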
Example Back Propagation Network

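As a stand-in for a worked example, the train_step sketch above can be exercised on a single training pair. The 2-2-1 network shape and all numbers here are arbitrary illustrative choices:

rng = np.random.default_rng(0)
# Hypothetical 2-2-1 network with small random initial weights.
W1, b1 = rng.normal(0.0, 0.5, size=(2, 2)), np.zeros((2, 1))
W2, b2 = rng.normal(0.0, 0.5, size=(1, 2)), np.zeros((1, 1))

x = np.array([[0.05], [0.10]])   # input pattern
t = np.array([[0.99]])           # target output

for step in range(3):
    W1, b1, W2, b2, err = train_step(x, t, W1, b1, W2, b2)
    print(f"step {step}: squared error = {err:.6f}")

For a small enough learning rate the printed error should shrink from step to step, which is the observable effect of the backwards gradient updates.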
