NN-2nd
Unit – 2nd
McCulloch and Pitts Neural Network (MCP Model):
Architecture, solution of AND, OR functions using the MCP model.
Hebb Model: Architecture, training and testing, Hebb network
for the AND function. Perceptron Network: Architecture, training,
testing, single- and multi-output models, perceptron for the AND
function. Linear function, applications of the linear model, linear
separability, solution of the OR function using the linear
separability model.
McCulloch and Pitts Neural Network:
The first computational model of a neuron was proposed by
Warren McCulloch (a neuroscientist) and Walter Pitts (a logician)
in 1943.
Architecture: The motivation behind the McCulloch-Pitts
model is the biological neuron. A biological neuron receives
input signals through its dendrites, processes them, and, if the
combined signal is strong enough, passes an output on to the
connected neurons through its axon and synapses. This basic
behaviour of a biological neuron is what the McCulloch-Pitts
model interprets and mimics.
The McCulloch-Pitts model of a neuron is a fairly simple model
consisting of some number (n) of binary inputs, each with an
associated weight. An input is called an ‘inhibitory input’ if its
associated weight is negative, and an ‘excitatory input’ if its
associated weight is positive. Since the inputs are binary, each
can take only one of the two values, 0 or 1.
If X ≥ θ (the threshold value)
Output = 1
Else Output = 0
where X = x_1 + x_2 + … + x_n is the aggregated input.
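The thresholding rule above can be written as a short program. This is a minimal sketch; the function name and argument names are illustrative, not from the text:

```python
# Minimal sketch of a McCulloch-Pitts neuron: binary inputs,
# fixed weights, and a hard threshold theta.
def mcp_neuron(inputs, weights, theta):
    """Fire (return 1) when the weighted sum reaches the threshold."""
    x = sum(i * w for i, w in zip(inputs, weights))
    return 1 if x >= theta else 0

# OR: unit weights with threshold 1; AND: unit weights with threshold 2.
print(mcp_neuron([0, 1], [1, 1], theta=1))  # OR(0,1)  -> 1
print(mcp_neuron([0, 1], [1, 1], theta=2))  # AND(0,1) -> 0
```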
Geometric Interpretation of the McCulloch-Pitts Model
OR Function
We know that the thresholding parameter for the OR function is 1,
i.e. θ = 1. The possible combinations of inputs are: (0,0),
(0,1), (1,0), and (1,1). Considering the OR function’s
aggregation equation, i.e. x_1 + x_2 ≥ 1, let us plot the graph.
The graph shows that all inputs lying ON or ABOVE the line
x_1 + x_2 = 1 give an output of 1 (positive) when passed through
the OR-function M-P neuron, while all inputs lying BELOW the
line give an output of 0 (negative).
Therefore, the McCulloch-Pitts model has drawn a linear
decision boundary which splits the inputs into two classes,
positive and negative.
AND Function
Similar to the OR function, we can plot the graph for the AND
function, whose aggregation equation is x_1 + x_2 ≥ 2.
In this case, the decision boundary equation is x_1 + x_2 = 2.
Here, the only input point that lies ON or ABOVE the line, namely
(1,1), outputs 1 when passed through the AND-function M-P neuron.
It fits! The decision boundary works!
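The two decision boundaries above can be checked against the full truth table. A sketch, with an illustrative helper name:

```python
# Classify the four binary input points against the linear decision
# boundary x1 + x2 >= theta (theta = 1 for OR, theta = 2 for AND).
points = [(0, 0), (0, 1), (1, 0), (1, 1)]

def on_or_above(x1, x2, theta):
    """1 if the point lies on or above the boundary line, else 0."""
    return 1 if x1 + x2 >= theta else 0

for x1, x2 in points:
    print((x1, x2),
          "OR:", on_or_above(x1, x2, 1),
          "AND:", on_or_above(x1, x2, 2))
```

For OR, only (0,0) falls below the line; for AND, only (1,1) lies on or above it, matching the truth tables.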
4. Training Iteration 2:
Final Weights and Bias:
After training, the final weights and bias are w₁ = 2, w₂ =
1, and b = −3.
Testing the Perceptron for AND Function:
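The testing step can be sketched as follows. The weights used here (w₁ = 2, w₂ = 1, b = −3) are one example of a set that separates the AND truth table; treat them as an assumption, not the only possible result:

```python
# Testing a trained perceptron on the AND truth table.
# Assumed weights: w1 = 2, w2 = 1, b = -3 (one separating solution).
def perceptron(x1, x2, w1=2, w2=1, b=-3):
    net = w1 * x1 + w2 * x2 + b
    return 1 if net >= 0 else 0   # step (hard-limit) activation

for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(f"AND({x1},{x2}) = {perceptron(x1, x2)}")  # 0, 0, 0, 1
```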
Characteristics of Perceptron
1. Perceptron is a machine learning algorithm for supervised
learning of binary classifiers.
2. In Perceptron, the weight coefficient is automatically
learned.
3. The weights are multiplied with the input features, and a
decision is made on whether or not the neuron fires.
4. The activation function applies a step rule to check whether
the weighted sum is greater than zero.
5. A linear decision boundary is drawn, enabling the
distinction between the two linearly separable classes +1
and -1.
6. If the weighted sum of all input values exceeds the
threshold value, the neuron produces an output signal;
otherwise, no output is produced.
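The learning rule described in the list above can be sketched as a training loop. Details such as the learning rate, initial weights of zero, and the epoch limit are assumptions, not values from the text:

```python
# Sketch of the perceptron learning rule: weights start at zero and
# are nudged by (target - output) * input after each misclassification.
def train_perceptron(samples, lr=1.0, epochs=20):
    w1 = w2 = b = 0.0
    for _ in range(epochs):
        errors = 0
        for (x1, x2), t in samples:
            y = 1 if w1 * x1 + w2 * x2 + b >= 0 else 0  # step activation
            if y != t:
                w1 += lr * (t - y) * x1
                w2 += lr * (t - y) * x2
                b  += lr * (t - y)
                errors += 1
        if errors == 0:   # converged: every sample classified correctly
            break
    return w1, w2, b

# Train on the AND truth table (binary targets).
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
w1, w2, b = train_perceptron(and_data)
print(w1, w2, b)  # -> 2.0 1.0 -3.0 for this sample order
```

The exact weights found depend on the sample order and learning rate; any set that separates the AND truth table is a valid solution.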
The perceptron model has the following limitations:
o The output of a perceptron can only be a binary number (0
or 1) due to the hard-limit transfer function.
o A perceptron can only classify linearly separable sets of
input vectors. If the input vectors are not linearly
separable, it cannot classify them properly.