Implementation of Perceptron Algorithm for OR Logic Gate with 2-bit Binary Input
Last Updated: 08 Jun, 2020
In the field of Machine Learning, the Perceptron is a Supervised Learning Algorithm for binary classifiers. The Perceptron Model implements the following function:
\[
\hat{y} = \Theta\left(w_{1} x_{1} + w_{2} x_{2} + \ldots + w_{n} x_{n} + b\right) = \Theta(\mathbf{w} \cdot \mathbf{x} + b),
\quad \text{where } \Theta(v) = \begin{cases} 1 & \text{if } v \geq 0 \\ 0 & \text{otherwise} \end{cases}
\]
For a particular choice of the weight vector $\boldsymbol{w}$ and bias parameter $\boldsymbol{b}$, the model predicts the output $\boldsymbol{\hat{y}}$ for the corresponding input vector $\boldsymbol{x}$.
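For instance, with the illustrative values $\mathbf{w} = (2, -1)$, $b = 0.5$ and input $\mathbf{x} = (1, 1)$ (arbitrary numbers chosen only to show the thresholding, not the parameters used below):

\[
v = 2 \cdot 1 + (-1) \cdot 1 + 0.5 = 1.5 \geq 0 \quad \Rightarrow \quad \hat{y} = \Theta(1.5) = 1
\]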
The truth table of the OR logic function for 2-bit binary variables, i.e., the input vector $\boldsymbol{x} : (\boldsymbol{x_{1}}, \boldsymbol{x_{2}})$ and the corresponding output $\boldsymbol{y}$, is:
| $\boldsymbol{x_{1}}$ | $\boldsymbol{x_{2}}$ | $\boldsymbol{y}$ |
|---|---|---|
| 0 | 0 | 0 |
| 0 | 1 | 1 |
| 1 | 0 | 1 |
| 1 | 1 | 1 |
Now, for the corresponding weight vector $\boldsymbol{w} : (\boldsymbol{w_{1}}, \boldsymbol{w_{2}})$ of the input vector $\boldsymbol{x} : (\boldsymbol{x_{1}}, \boldsymbol{x_{2}})$, the associated Perceptron Function can be defined as:

\[
\hat{y} = \Theta\left(w_{1} x_{1} + w_{2} x_{2} + b\right)
\]

Since the OR function is linearly separable, a single perceptron with suitably chosen weights and bias can realize it exactly.

For the implementation, the chosen weight parameters are $\boldsymbol{w_{1}} = 1$, $\boldsymbol{w_{2}} = 1$ and the bias parameter is $\boldsymbol{b} = -0.5$.
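These parameters can be verified by hand against every row of the truth table: the weighted sum $w_{1} x_{1} + w_{2} x_{2} + b$ is negative only for the all-zero input:

\[
\begin{aligned}
(0, 0)&: \Theta(0 + 0 - 0.5) = \Theta(-0.5) = 0 \\
(0, 1)&: \Theta(0 + 1 - 0.5) = \Theta(0.5) = 1 \\
(1, 0)&: \Theta(1 + 0 - 0.5) = \Theta(0.5) = 1 \\
(1, 1)&: \Theta(1 + 1 - 0.5) = \Theta(1.5) = 1
\end{aligned}
\]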
Python Implementation:
Python3
# importing the Python library
import numpy as np

# define the unit step function
def unitStep(v):
    if v >= 0:
        return 1
    else:
        return 0

# design the Perceptron Model
def perceptronModel(x, w, b):
    v = np.dot(w, x) + b
    y = unitStep(v)
    return y

# OR Logic Function
# w1 = 1, w2 = 1, b = -0.5
def OR_logicFunction(x):
    w = np.array([1, 1])
    b = -0.5
    return perceptronModel(x, w, b)

# testing the Perceptron Model
test1 = np.array([0, 1])
test2 = np.array([1, 1])
test3 = np.array([0, 0])
test4 = np.array([1, 0])

print("OR({}, {}) = {}".format(0, 1, OR_logicFunction(test1)))
print("OR({}, {}) = {}".format(1, 1, OR_logicFunction(test2)))
print("OR({}, {}) = {}".format(0, 0, OR_logicFunction(test3)))
print("OR({}, {}) = {}".format(1, 0, OR_logicFunction(test4)))
Output:
OR(0, 1) = 1
OR(1, 1) = 1
OR(0, 0) = 0
OR(1, 0) = 1
Here, the model's predicted output ($\boldsymbol{\hat{y}}$) for each of the test inputs exactly matches the conventional OR logic gate output ($\boldsymbol{y}$) from the truth table for 2-bit binary input.
Hence, the perceptron algorithm for the OR logic gate is verified to be correctly implemented.
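As a compact alternative, all four inputs can be evaluated in a single vectorized NumPy call. This is a minimal sketch assuming the same weights and bias as above; the helper OR_all is a hypothetical name, not part of the original code:
Python3
import numpy as np

def OR_all(X, w=np.array([1, 1]), b=-0.5):
    # hypothetical helper: applies the unit step to w.x + b
    # for every row of X at once (same w and b as above)
    return (X @ w + b >= 0).astype(int)

# all four 2-bit inputs as rows
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
print(OR_all(X))  # expected: [0 1 1 1]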