Implementation of Perceptron Algorithm for XOR Logic Gate with 2-bit Binary Input
Last Updated: 22 Dec, 2022
In the field of Machine Learning, the Perceptron is a Supervised Learning Algorithm for binary classifiers. The Perceptron Model implements the following function: \[ \begin{array}{c} \hat{y}=\Theta\left(w_{1} x_{1}+w_{2} x_{2}+\ldots+w_{n} x_{n}+b\right) \\ =\Theta(\mathbf{w} \cdot \mathbf{x}+b) \\ \text { where } \Theta(v)=\left\{\begin{array}{cc} 1 & \text { if } v \geqslant 0 \\ 0 & \text { otherwise } \end{array}\right. \end{array} \] For a particular choice of the weight vector $\boldsymbol{w}$ and bias parameter $\boldsymbol{b}$, the model predicts output $\boldsymbol{\hat{y}}$ for the corresponding input vector $\boldsymbol{x}$. The truth table of the XOR logical function for 2-bit binary inputs, i.e., the input vector $\boldsymbol{x} : (\boldsymbol{x_{1}}, \boldsymbol{x_{2}})$ and the corresponding output $\boldsymbol{y}$, is:
$\boldsymbol{x_{1}}$ | $\boldsymbol{x_{2}}$ | $\boldsymbol{y}$ |
---|---|---|
0 | 0 | 0 |
0 | 1 | 1 |
1 | 0 | 1 |
1 | 1 | 0 |
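As a quick worked instance of the prediction rule above, take the AND-gate parameters used later in this article, $w_{1} = 1$, $w_{2} = 1$, $b = -1.5$ (one convenient choice, not the only valid one): \[ \Theta(1 \cdot 1 + 1 \cdot 1 - 1.5) = \Theta(0.5) = 1, \qquad \Theta(1 \cdot 1 + 1 \cdot 0 - 1.5) = \Theta(-0.5) = 0 \] so a single perceptron reproduces $AND(1, 1) = 1$ and $AND(1, 0) = 0$. No single perceptron of this form can reproduce the XOR table above on its own, which is why the network designed below combines several of them.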
We can observe that $XOR(\boldsymbol{x_{1}}, \boldsymbol{x_{2}}) = AND(NOT(AND(\boldsymbol{x_{1}}, \boldsymbol{x_{2}})), OR(\boldsymbol{x_{1}}, \boldsymbol{x_{2}}))$.
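As a check of this identity against the truth table, consider the inputs $(1, 1)$ and $(1, 0)$: \[ \begin{aligned} XOR(1, 1) &= AND(NOT(AND(1, 1)), OR(1, 1)) = AND(NOT(1), 1) = AND(0, 1) = 0 \\ XOR(1, 0) &= AND(NOT(AND(1, 0)), OR(1, 0)) = AND(NOT(0), 1) = AND(1, 1) = 1 \end{aligned} \] both of which match the table; the remaining two rows can be verified the same way.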
Designing the Perceptron Network:
- Step 1: For the weight vector $\boldsymbol{w} : (\boldsymbol{w_{1}}, \boldsymbol{w_{2}})$ applied to the input vector $\boldsymbol{x} : (\boldsymbol{x_{1}}, \boldsymbol{x_{2}})$ at the AND and OR nodes, the associated Perceptron Functions can be defined as: \[\boldsymbol{\hat{y}_{1}} = \Theta\left(w_{1} x_{1}+w_{2} x_{2}+b_{AND}\right)\] \[\boldsymbol{\hat{y}_{2}} = \Theta\left(w_{1} x_{1}+w_{2} x_{2}+b_{OR}\right)\]
- Step 2: The output ($\boldsymbol{\hat{y}_{1}}$) of the AND node is fed to the NOT node with weight $\boldsymbol{w_{NOT}}$, and the associated Perceptron Function can be defined as: \[\boldsymbol{\hat{y}_{3}} = \Theta\left(w_{NOT}\, \hat{y}_{1}+b_{NOT}\right)\]
- Step 3: The output ($\boldsymbol{\hat{y}_{2}}$) of the OR node and the output ($\boldsymbol{\hat{y}_{3}}$) of the NOT node from Step 2 are fed to the final AND node with weights $(\boldsymbol{w_{AND1}}, \boldsymbol{w_{AND2}})$. The corresponding output $\boldsymbol{\hat{y}}$ is the final output of the XOR logic function, and the associated Perceptron Function can be defined as: \[\boldsymbol{\hat{y}} = \Theta\left(w_{AND1}\, \hat{y}_{3}+w_{AND2}\, \hat{y}_{2}+b_{AND}\right)\]
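As a sanity check of the three steps, trace the input $(x_{1}, x_{2}) = (1, 0)$ through the network using the parameter values chosen below ($w_{1} = w_{2} = 1$, $w_{NOT} = -1$, $w_{AND1} = w_{AND2} = 1$, $b_{AND} = -1.5$, $b_{OR} = -0.5$, $b_{NOT} = 0.5$): \[ \begin{aligned} \hat{y}_{1} &= \Theta(1 \cdot 1 + 1 \cdot 0 - 1.5) = \Theta(-0.5) = 0 \\ \hat{y}_{2} &= \Theta(1 \cdot 1 + 1 \cdot 0 - 0.5) = \Theta(0.5) = 1 \\ \hat{y}_{3} &= \Theta(-1 \cdot 0 + 0.5) = \Theta(0.5) = 1 \\ \hat{y} &= \Theta(1 \cdot 1 + 1 \cdot 1 - 1.5) = \Theta(0.5) = 1 \end{aligned} \] which agrees with $XOR(1, 0) = 1$ in the truth table.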
For the implementation, the weight parameters are considered to be $\boldsymbol{w_{1}} = 1$, $\boldsymbol{w_{2}} = 1$, $\boldsymbol{w_{NOT}} = -1$, $\boldsymbol{w_{AND1}} = 1$, $\boldsymbol{w_{AND2}} = 1$, and the bias parameters are $\boldsymbol{b_{AND}} = -1.5$, $\boldsymbol{b_{OR}} = -0.5$, $\boldsymbol{b_{NOT}} = 0.5$.
Python Implementation:
Python3
# importing Python library
import numpy as np

# define Unit Step Function
def unitStep(v):
    if v >= 0:
        return 1
    else:
        return 0

# design Perceptron Model
def perceptronModel(x, w, b):
    v = np.dot(w, x) + b
    y = unitStep(v)
    return y

# NOT Logic Function
# wNOT = -1, bNOT = 0.5
def NOT_logicFunction(x):
    wNOT = -1
    bNOT = 0.5
    return perceptronModel(x, wNOT, bNOT)

# AND Logic Function
# here w1 = wAND1 = 1,
# w2 = wAND2 = 1, bAND = -1.5
def AND_logicFunction(x):
    w = np.array([1, 1])
    bAND = -1.5
    return perceptronModel(x, w, bAND)

# OR Logic Function
# w1 = 1, w2 = 1, bOR = -0.5
def OR_logicFunction(x):
    w = np.array([1, 1])
    bOR = -0.5
    return perceptronModel(x, w, bOR)

# XOR Logic Function
# with AND, OR and NOT
# function calls in sequence
def XOR_logicFunction(x):
    y1 = AND_logicFunction(x)
    y2 = OR_logicFunction(x)
    y3 = NOT_logicFunction(y1)
    final_x = np.array([y2, y3])
    finalOutput = AND_logicFunction(final_x)
    return finalOutput

# testing the Perceptron Model
test1 = np.array([0, 1])
test2 = np.array([1, 1])
test3 = np.array([0, 0])
test4 = np.array([1, 0])

print("XOR({}, {}) = {}".format(0, 1, XOR_logicFunction(test1)))
print("XOR({}, {}) = {}".format(1, 1, XOR_logicFunction(test2)))
print("XOR({}, {}) = {}".format(0, 0, XOR_logicFunction(test3)))
print("XOR({}, {}) = {}".format(1, 0, XOR_logicFunction(test4)))
Output:
XOR(0, 1) = 1
XOR(1, 1) = 0
XOR(0, 0) = 0
XOR(1, 0) = 1
Here, the model's predicted output ($\boldsymbol{\hat{y}}$) for each of the test inputs exactly matches the conventional output ($\boldsymbol{y}$) of the XOR logic gate according to the truth table. Hence, it is verified that the perceptron algorithm for the XOR logic gate is correctly implemented.