
ML

LECTURE-21
BY
Dr. Ramesh Kumar Thakur
Assistant Professor (II)
School Of Computer Engineering
v An Artificial Neural Network (ANN) models the relationship between a set of input signals and an output signal
using a model derived from our understanding of how a biological brain responds to stimuli from sensory inputs.
v Just as a brain uses a network of interconnected cells called neurons to create a massive parallel processor, an ANN uses
a network of artificial neurons or nodes to solve learning problems.
v Biological motivation
v In the cell, the incoming signals are received by the cell’s dendrites through a biochemical process.
v The process allows the impulse to be weighted according to its relative importance or frequency.
v As the cell body begins accumulating the incoming signals, a threshold is reached at which the cell fires and the
output signal is transmitted via an electrochemical process down the axon.
v At the axon’s terminals, the electric signal is again processed as a chemical signal to be passed to the neighboring
neurons across a tiny gap known as a synapse.
v The MP neuron is mankind’s first simplified mathematical model of the neuron.
v This model was developed by McCulloch and Pitts in 1943.
v The MP neuron model is also known as the linear threshold gate model.
v It is widely used to realize simple Boolean logic functions such as AND, OR, and NOT.

v Now let's look at the model. It has 4 basic components:


v The model takes inputs (x1, x2, …, xm),
v applies the Adder function (g), and
v takes a decision in the Activation function (f),
v which gives an output y.
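v In symbols, these two stages compose as y = f(g(x1, …, xm)), where g produces the weighted sum of the inputs and f thresholds it to produce the output y.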
v When MP neurons are combined into a neural network, they are connected by directed, weighted paths.
v When we pass the Adder function value into the Activation function of an MP neuron, there are 2 possibilities: the
neuron may fire (label 1) or not fire (label 0).
v The activation function is based on the threshold value.
v There is a fixed threshold for each neuron: if the net input to the neuron is greater than or equal to the threshold, the
neuron fires.
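v For example (with illustrative numbers, not from the slides): with unit weights, the inputs x = (1, 0, 1) give g(x) = 1 + 0 + 1 = 2; with threshold b = 2 the neuron fires (y = 1), while with b = 3 it does not (y = 0).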
v Weights (w): the parameters that express how much each input feature contributes to the output.
v A low weight gives the corresponding input little influence on the output, while a high weight gives it a more significant influence.
v Adder function (g): an aggregation function that computes the sum of the products of the inputs with the weights,
g(x) = w1x1 + w2x2 + ⋯ + wmxm.
v Activation function (f): the mathematical function that decides whether the neuron's input is relevant for the model's
prediction or not.
v Threshold value (b): the value in the activation function against which the adder output is compared; the neuron fires when g(x) ≥ b.
v Initially, we consider every candidate value from 0 to n (where n is the maximum adder value) and compute the total
loss of the model for each.
v We then choose the threshold value for which the loss is minimum. This is the brute-force method by which we
calculate the threshold value (a minimal sketch follows below).
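A minimal Python sketch of this brute-force search (illustrative code, not from the lecture; the names mp_neuron and find_threshold and the AND-gate example are assumptions made here):

```python
# A minimal sketch of an MP neuron and the brute-force threshold search
# described above. Inputs are binary; weights are supplied by the caller.

def mp_neuron(x, w, b):
    """Fire (1) when the adder value g(x) = sum(wi * xi) reaches threshold b."""
    g = sum(wi * xi for wi, xi in zip(w, x))  # Adder function g
    return 1 if g >= b else 0                 # Activation function f

def find_threshold(data, w):
    """Try every integer threshold from 0 to n (the maximum adder value)
    and keep the one with minimum loss (number of misclassified points)."""
    n = sum(w)  # maximum adder value for binary inputs and these weights
    best_b, best_loss = 0, float("inf")
    for b in range(n + 1):
        loss = sum(mp_neuron(x, w, b) != y for x, y in data)
        if loss < best_loss:
            best_b, best_loss = b, loss
    return best_b, best_loss

# Example: the two-input AND function with unit weights.
and_data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
b, loss = find_threshold(and_data, w=(1, 1))
print(b, loss)  # 2 0 -> fires only when both inputs are 1, with zero loss
```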
v In the model described above:
v g(x) is the aggregated sum of all weighted inputs,
v y is the final value predicted by the activation function (f), and
v b is the threshold value, calculated by the brute-force method.
v Limitations of MP Neuron
v What about non-boolean (say, real) inputs?
v Do we always need to hand code the threshold?
v Are all inputs equal? What if we want to assign more importance to some inputs?
v What about functions which are not linearly separable, say the XOR function? (illustrated below)
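To make the last limitation concrete (reusing the hypothetical sketch above, with unit weights): the adder values for the inputs (0,0), (0,1), (1,0), (1,1) are 0, 1, 1, 2. XOR must fire on 1 but stay silent on both 0 and 2, and no rule of the form g(x) ≥ b can do that, because once the neuron fires at g = 1 it must also fire at g = 2.

```python
# Continuing the sketch above: no threshold reaches zero loss on XOR.
xor_data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]
b, loss = find_threshold(xor_data, w=(1, 1))
print(b, loss)  # 1 1 -> the best threshold still misclassifies (1, 1)
```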

v Solution to the limitations of the MP Neuron


v Overcoming the limitations of the M-P neuron, Frank Rosenblatt, an American psychologist, proposed
the classical perceptron model, the mighty artificial neuron, in 1957.
v It is a more generalized computational model than the McCulloch-Pitts neuron, in which the weights and
thresholds can be learnt over time.
v A perceptron is an artificial neuron in which the activation function is the threshold function.
v Consider an artificial neuron having x1, x2, ⋯ , xn as the input signals and w1, w2, ⋯ , wn as the
associated weights.
v Let w0 be some constant (known as bias).
v The neuron is called a perceptron if the output of the neuron is given by the following function:
      y = 1 if w0 + w1x1 + w2x2 + ⋯ + wnxn ≥ 0
      y = 0 otherwise

v The figure below shows the schematic representation of a perceptron.


v Characteristics of Perceptron:
v Perceptron is a machine learning algorithm for supervised learning of binary classifiers.
v In the Perceptron, the weight coefficients are learned automatically from the data (a minimal training sketch follows this list).
v Initially, the weights are multiplied with the input features, and a decision is made as to whether the neuron
fires or not.
v The activation function applies a step rule to check whether the weighted sum is greater than zero.
v The linear decision boundary is drawn, enabling the distinction between the two linearly separable
classes +1 and -1.
v If the weighted sum of all the input values exceeds the threshold value, the neuron fires and produces an output signal;
otherwise, no output is produced.
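To show how such weights are learned, here is a minimal Python sketch of the classic perceptron learning rule (illustrative code, not the lecture's; the names predict and train, the learning-rate parameter, and the OR-gate example are assumptions; labels are taken in {+1, -1} to match the classes above):

```python
# A minimal sketch of the perceptron learning rule with a bias weight w0
# and labels in {+1, -1}.

def predict(w, x):
    """Threshold activation: +1 if w0 + sum(wi * xi) >= 0, else -1."""
    s = w[0] + sum(wi * xi for wi, xi in zip(w[1:], x))
    return 1 if s >= 0 else -1

def train(data, epochs=20, lr=1.0):
    """Perceptron rule: on each mistake, nudge the weights toward the
    correct side of the decision boundary."""
    n = len(data[0][0])
    w = [0.0] * (n + 1)             # w[0] is the bias, learned like any weight
    for _ in range(epochs):
        for x, y in data:
            if predict(w, x) != y:  # mistake-driven update
                w[0] += lr * y
                for i, xi in enumerate(x):
                    w[i + 1] += lr * y * xi
    return w

# Example: learn the (linearly separable) OR function with labels +1/-1.
or_data = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w = train(or_data)
print([predict(w, x) for x, _ in or_data])  # [-1, 1, 1, 1]
```

On the linearly separable OR data this rule converges to a zero-error weight vector; on non-separable data such as XOR it would never converge, which is exactly the limitation discussed next.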

v Limitations of Perceptron Model


v The output of a perceptron can only be a binary number (0 or 1) due to the hard limit transfer
function.
v The Perceptron can only be used to classify linearly separable sets of input vectors.
v If the input vectors are not linearly separable, the Perceptron cannot classify them correctly.
