Exp 3
Theory: Backpropagation Algorithm
Backpropagation is an algorithm that propagates the error from the output nodes back to the
input nodes; hence it is simply referred to as the backward propagation of errors. It is used
in many applications of neural networks in data mining, such as character recognition and
signature verification.
Neural Network:
Neural networks are an information-processing paradigm inspired by the human nervous
system. Just as the human nervous system is built from biological neurons, neural networks
are built from artificial neurons: mathematical functions modeled on biological neurons.
The human brain is estimated to have about 10 billion neurons, each connected to an average
of 10,000 other neurons. Each neuron receives signals through synapses, which control the
effect of the signal on the neuron.
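As a concrete illustration, an artificial neuron can be modeled as a weighted sum of its inputs plus a bias, passed through an activation function such as the sigmoid. The following is a minimal sketch; the weights, bias, and inputs are made-up numbers for illustration only:

```python
import math

def neuron(inputs, weights, bias):
    # Weighted sum of inputs plus bias, squashed by a sigmoid activation
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1.0 / (1.0 + math.exp(-z))

# A neuron with two inputs: z = 0.4*1.0 + (-0.2)*0.5 + 0.1 = 0.4
out = neuron([1.0, 0.5], [0.4, -0.2], 0.1)
print(round(out, 4))  # sigmoid(0.4) ~ 0.5987
```

Each synapse's "control of the effect of the signal" corresponds to a weight in this model.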
Backpropagation:
Backpropagation is a widely used algorithm for training feedforward neural networks. It
computes the gradient of the loss function with respect to the network weights, and it does
so far more efficiently than naively computing the gradient for each weight separately. This
efficiency makes it practical to use gradient methods, such as gradient descent or stochastic
gradient descent, to train multi-layer networks and update the weights to minimize the loss.
The backpropagation algorithm works by computing the gradient of the loss function with
respect to each weight via the chain rule, computing the gradient layer by layer, and iterating
backward from the last layer to avoid redundant computation of intermediate terms in the
chain rule.
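As a small worked example of this chain-rule computation (a sketch with made-up numbers, reduced to a single neuron so every term is visible), the gradient of a squared-error loss with respect to one weight can be written as the product of three factors, and checked against a numerical finite-difference estimate:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def loss(w, x, y):
    # Forward pass: prediction a = sigmoid(w*x), then squared error
    return (sigmoid(w * x) - y) ** 2

def grad(w, x, y):
    # Chain rule: dL/dw = dL/da * da/dz * dz/dw
    a = sigmoid(w * x)
    return 2 * (a - y) * a * (1 - a) * x

w, x, y = 0.5, 2.0, 1.0
analytic = grad(w, x, y)
eps = 1e-6
numeric = (loss(w + eps, x, y) - loss(w - eps, x, y)) / (2 * eps)
print(abs(analytic - numeric) < 1e-6)  # the two gradient estimates agree
```

In a multi-layer network, backpropagation reuses the factors computed for later layers when differentiating earlier ones, which is exactly how the redundant work is avoided.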
Features of Backpropagation:
1. It is a gradient descent method, as used in the case of a simple perceptron network with a
differentiable unit.
2. It differs from other networks in the process by which the weights are calculated during
the learning period of the network.
3. Training is done in three stages:
the feed-forward of the input training pattern
the calculation and backpropagation of the error
the updating of the weights
Working of Backpropagation:
Neural networks use supervised learning: the network generates an output vector from an
input vector and compares it with the desired output. If the generated output does not match
the desired output, an error signal is produced. The network then adjusts the weights
according to this error so that the output moves closer to the desired output.
Backpropagation Algorithm:
Step 1: Inputs X arrive through the preconnected path.
Step 2: The input is modeled using real weights W, which are usually chosen randomly.
Step 3: Calculate the output of each neuron from the input layer, through the hidden layers,
to the output layer.
Step 4: Calculate the error in the outputs: Error = Actual Output - Desired Output.
Step 5: Travel back from the output layer to the hidden layer, adjusting the weights so that
the error decreases.
Step 6: Repeat the process until the desired output is achieved.
Advantages:
It is simple, fast, and easy to program.
It has no parameters to tune other than the number of inputs.
It is flexible, as no prior knowledge of the network is required.
It is a standard method that generally works well.
Disadvantages:
It is sensitive to noisy data and irregularities; noisy data can lead to inaccurate results.
Performance is highly dependent on the input data.
Training can take a long time.
A matrix-based approach is needed rather than a mini-batch approach.
CODE
from math import exp
from random import seed, random

# Initialize a network: one hidden layer; the bias is stored as the last weight
def initialize_network(n_inputs, n_hidden, n_outputs):
    network = list()
    hidden_layer = [{'weights': [random() for _ in range(n_inputs + 1)]} for _ in range(n_hidden)]
    network.append(hidden_layer)
    output_layer = [{'weights': [random() for _ in range(n_hidden + 1)]} for _ in range(n_outputs)]
    network.append(output_layer)
    return network

# Neuron activation: weighted sum of the inputs plus the bias (the last weight)
def activate(weights, inputs):
    return weights[-1] + sum(w * x for w, x in zip(weights[:-1], inputs))

# Sigmoid transfer function and its derivative (in terms of the neuron output)
def transfer(activation):
    return 1.0 / (1.0 + exp(-activation))
def transfer_derivative(output):
    return output * (1.0 - output)

# Feed-forward stage: each layer's outputs become the next layer's inputs
def forward_propagate(network, row):
    inputs = row
    for layer in network:
        for neuron in layer:
            neuron['output'] = transfer(activate(neuron['weights'], inputs))
        inputs = [neuron['output'] for neuron in layer]
    return inputs

# Backward stage: iterate from the last layer to the first, storing a 'delta'
def backward_propagate_error(network, expected):
    for i in reversed(range(len(network))):
        layer = network[i]
        if i != len(network) - 1:  # hidden layer: error comes from the layer above
            errors = [sum(n['weights'][j] * n['delta'] for n in network[i + 1])
                      for j in range(len(layer))]
        else:  # output layer: error is output minus target
            errors = [neuron['output'] - expected[j] for j, neuron in enumerate(layer)]
        for j, neuron in enumerate(layer):
            neuron['delta'] = errors[j] * transfer_derivative(neuron['output'])

# Update stage: gradient-descent step for every weight and bias
def update_weights(network, row, l_rate):
    for i in range(len(network)):
        inputs = row[:-1] if i == 0 else [n['output'] for n in network[i - 1]]
        for neuron in network[i]:
            for j in range(len(inputs)):
                neuron['weights'][j] -= l_rate * neuron['delta'] * inputs[j]
            neuron['weights'][-1] -= l_rate * neuron['delta']

# Train with stochastic gradient descent: forward, backward, update per row
def train_network(network, train, l_rate, n_epoch, n_outputs):
    for epoch in range(n_epoch):
        sum_error = 0
        for row in train:
            outputs = forward_propagate(network, row)
            expected = [0] * n_outputs
            expected[int(row[-1])] = 1  # one-hot encode the class label
            sum_error += sum((e - o) ** 2 for e, o in zip(expected, outputs))
            backward_propagate_error(network, expected)
            update_weights(network, row, l_rate)
        print('>epoch=%d, lrate=%.3f, error=%.3f' % (epoch, l_rate, sum_error))

# Demo on a tiny two-class dataset (the last column is the class label)
seed(1)
dataset = [[2.781, 2.550, 0], [1.465, 2.362, 0], [3.396, 4.400, 0],
           [7.627, 2.759, 1], [5.332, 2.088, 1], [6.922, 1.771, 1]]
n_inputs = len(dataset[0]) - 1
n_outputs = len(set(row[-1] for row in dataset))
network = initialize_network(n_inputs, 2, n_outputs)
train_network(network, dataset, 0.5, 20, n_outputs)
for layer in network:
    print(layer)
OUTPUT