Neural Networks
Abdullatif Baba
Kuwait College of Science and Technology (Private University)
All content following this page was uploaded by Abdullatif Baba on 05 January 2017.
12/15/2016 [email protected] 1
Neural Networks
Machine Learning
Biological Neural Network
• A neural network can be defined as a model of
reasoning based on the human brain.
• The brain consists of a densely interconnected set of
neurons.
Artificial Neural Nets Model The Brain
Primitive concepts
• An ANN consists of a number of very simple and highly interconnected
processors, also called neurons.
• The neurons are connected by weighted links passing signals from one neuron
to another. Each neuron receives a number of input signals through its
connections.
• It produces a single output signal which is transmitted through the neuron’s
outgoing connection that splits into a number of branches to transmit the same
signal (the signal is not divided among these branches in any way). The
outgoing branches terminate at the incoming connections of other neurons in
the network.
How does an artificial neural network ‘learn’?
• The neurons are connected by links, and each link has a numerical weight
associated with it.
• Weights are the basic means of long-term memory in ANNs. They express the
strength, or in other words importance, of each neuron input.
• A neural network ‘learns’ through repeated adjustments of these weights.
How to build an ANN ?
Decide how many neurons are to be used and how the neurons are to be connected to form a network, i.e. choose the network architecture.
The neuron as a simple computing element
A neuron
• Receives several signals from its input links
• Computes a new activation level
• Sends it as an output signal through the output links.
The input signal can be raw data or outputs of other neurons.
The output signal can be either a final solution to the problem or an input to other
neurons.
How does the neuron determine its output?
The neuron computes the weighted sum of the input signals and compares the result
with a predefined threshold value, θ.
• If the net input is less than the threshold, the neuron output is -1.
• If the net input is greater than or equal to the threshold, the neuron becomes activated
and its output attains a value +1
In other words, the neuron uses the following transfer or activation function:
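The formula itself appears only as an image in the original slides; restating it from the description above, with n inputs:

```latex
X = \sum_{i=1}^{n} x_i w_i , \qquad
Y = \begin{cases} +1 & \text{if } X \ge \theta \\ -1 & \text{if } X < \theta \end{cases}
```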
The activation function
This type of activation function is called a sign function. Therefore, the actual output of the neuron with a sign activation function can be represented as Y = sign(X).
Four common choices of activation function, the step, sign, linear and sigmoid functions, are illustrated here:
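The illustration is not reproduced in this text; the four functions can be written as:

```latex
Y^{step} = \begin{cases} 1 & X \ge 0 \\ 0 & X < 0 \end{cases} \qquad
Y^{sign} = \begin{cases} +1 & X \ge 0 \\ -1 & X < 0 \end{cases} \qquad
Y^{linear} = X \qquad
Y^{sigmoid} = \frac{1}{1 + e^{-X}}
```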
The perceptron (The aim)
• The aim of the perceptron is to classify inputs : x1;x2; . . .;xn, into one of two classes
( A1 and A2). In this case the n-dimensional space is divided by a hyperplane into
two decision regions.
• The hyperplane is defined by the linearly separable function
• Let’s suppose the case of two inputs, x1 and x2, the decision boundary takes the
form of a straight line.
• Point 1, which lies above the boundary line, belongs to
class A1
• Point 2, which lies below the line belongs to class A2.
• The threshold θ can be used to shift the decision
boundary.
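The separating function referenced above appears only as an image in the original; in standard form the hyperplane, and its two-input special case (the straight-line boundary), are:

```latex
\sum_{i=1}^{n} x_i w_i - \theta = 0 , \qquad
x_1 w_1 + x_2 w_2 - \theta = 0
```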
How does the perceptron learn its classification tasks ?
The initial weights and the threshold are randomly assigned, usually in the range [-0.5, 0.5].
Then, small adjustments are made in the weights to reduce the difference between the actual and desired outputs of the perceptron.
At iteration p, the error is given by the difference between the desired output and the actual output of the perceptron.
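The update formulas appear only as images in the original slides; the standard perceptron learning rule consistent with this description is:

```latex
e(p) = Y_d(p) - Y(p), \qquad
w_i(p+1) = w_i(p) + \alpha \cdot x_i(p) \cdot e(p)
```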
Training algorithm for classification tasks.
Operational Amplifier Remember !
Inverting Amplifier : The operational amplifier has a very high impedance between its input terminals; for a 741, about 2 MΩ. In the ideal case, Rin = ∞.
Since no current flows into the op-amp input, there is no potential difference between the point X and the earth. Therefore, X is a virtual earth: VX = 0.
So : Vin - VX = I1 * R1, hence Vin = I1 * R1 (1)
The same current I1 flows through R2, thus : VX - Vout = I1 * R2, hence Vout = -I1 * R2 (2)
Dividing (2) by (1) gives the gain : Vout / Vin = -R2 / R1
Operational Amplifier Remember !
Non-Inverting Amplifier
Rin = ∞ in the ideal case. No current flows into the op-amp input, so there is no potential difference between the point X and Vin. Thus VX = Vin.
The feedback divider gives : VX = I * R1 and Vout = I * (R1 + R2)
So : Vout / Vin = (R1 + R2) / R1 = 1 + R2 / R1
Operational Amplifier Remember !
Non-inverting amplifier with a voltage divider R3, R4 at the input :
VY = (R4 / (R3 + R4)) * Vin+
Vo = (1 + R2/R1) * (R4 / (R3 + R4)) * Vin+
Vo / Vin+ = (1 + R2/R1) * (R4 / (R3 + R4))
Electronic neuron model design using a saturable operational amplifier
Example:
Activation function :
S = Σ wi- * xi- + Σ wi+ * xi+
The activation function of the op-amp saturates at Us+ = 15 V and Us- = -15 V.
Typically : F1 = Us+ - 1 = 14 V ; F2 = Us- + 1 = -14 V
Example (train a perceptron to perform basic logical operations)
The operation AND
Suppose the threshold: θ= 0.2; and the learning rate: α= 0.1.
• The perceptron must be trained to classify the input patterns.
• The perceptron is activated by the sequence of four input patterns representing an epoch (the step activation function is used).
• The perceptron weights are updated after each activation.
• This process is repeated until all the weights converge to a uniform set of
values.
Initial weights : w1(0) = 0.3 and w2(0) = -0.1
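The training run described above can be sketched in Python. The threshold θ = 0.2, learning rate α = 0.1, the 0/1 step activation and the initial weights w1(0) = 0.3, w2(0) = -0.1 come from the slides; the small EPS tolerance is an implementation detail added here so floating-point round-off cannot flip the threshold comparison:

```python
EPS = 1e-9  # tolerance added here: keeps float round-off from flipping the test

def step(net, theta=0.2):
    """0/1 step activation with threshold theta, as in the AND example."""
    return 1 if net >= theta - EPS else 0

def train_and(w=(0.3, -0.1), alpha=0.1, theta=0.2, max_epochs=100):
    """Train a two-input perceptron on the AND truth table."""
    data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
    w = list(w)
    for epoch in range(1, max_epochs + 1):
        errors = 0
        for (x1, x2), yd in data:
            y = step(x1 * w[0] + x2 * w[1], theta)
            e = yd - y
            if e != 0:
                errors += 1
                # perceptron learning rule: w_i(p+1) = w_i(p) + alpha * x_i * e
                w[0] += alpha * x1 * e
                w[1] += alpha * x2 * e
        if errors == 0:  # an epoch with no errors means the weights converged
            return w, epoch
    return w, max_epochs

weights, epochs = train_and()
print([round(v, 6) for v in weights])  # [0.1, 0.1]
```

The run converges to w1 = w2 = 0.1, the classic textbook result: the boundary 0.1·x1 + 0.1·x2 = 0.2 separates (1, 1) from the other three patterns.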
In a similar manner, the perceptron can learn the operation OR.
A single-layer perceptron cannot be trained to perform the operation Exclusive-
OR. The following geometry represents the AND, OR and Exclusive-OR
functions as two-dimensional plots based on the values of the two inputs.
• Experimental neural networks may have five or even six layers, including
three or four hidden layers, and utilise millions of neurons.
• Most practical applications use only three layers, because each additional
layer increases the computational burden exponentially.
Back-propagation neural network
The learning algorithm has two phases:
• First, a training input pattern is presented to the network input layer. The network then propagates the input pattern from layer to layer until the output pattern is generated by the output layer.
• If this pattern is different from the desired output, an error is calculated and then propagated backwards through the network from the output layer to the input layer. The weights are modified as the error is propagated.
The learning law used in the back-propagation networks
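The law itself appears only as an image in the original; for a connection from neuron j to neuron k it is normally written as the generalised delta rule:

```latex
\Delta w_{jk}(p) = \alpha \cdot y_j(p) \cdot \delta_k(p), \qquad
w_{jk}(p+1) = w_{jk}(p) + \Delta w_{jk}(p)
```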
Forward-propagation phase
Every neuron in each layer is connected to every other neuron in the adjacent
forward layer. Each neuron computes the net weighted input as :
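The formula appears only as an image in the original; with n inputs to neuron j and a sigmoid activation (the function used in the later example), the net weighted input and output are:

```latex
X_j(p) = \sum_{i=1}^{n} x_i(p) \, w_{ij}(p) - \theta_j, \qquad
y_j(p) = \frac{1}{1 + e^{-X_j(p)}}
```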
The learning law used in the back-propagation networks
To propagate error signals, we start at the output layer and work backwards to the hidden layer.
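The equations appear only as images in the original; for sigmoid neurons, the error gradient at output neuron k is:

```latex
\delta_k(p) = y_k(p) \, [1 - y_k(p)] \, e_k(p), \qquad
e_k(p) = y_{d,k}(p) - y_k(p)
```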
Updating weights at the hidden layer, where n is the number of neurons in the input layer.
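The hidden-layer formulas appear only as images in the original; with l neurons in the output layer, the gradient and update for hidden neuron j are:

```latex
\delta_j(p) = y_j(p) \, [1 - y_j(p)] \sum_{k=1}^{l} \delta_k(p) \, w_{jk}(p), \qquad
\Delta w_{ij}(p) = \alpha \cdot x_i(p) \cdot \delta_j(p)
```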
Example
• Consider the three-layer back-propagation network. Suppose that the network is required to perform the logical operation Exclusive-OR.
• Neurons 1 and 2 in the input layer accept inputs x1 and x2, respectively, and redistribute these inputs to the neurons in the hidden layer without any processing.
• The effect of the threshold applied to a neuron in the hidden or output layer is represented by its weight, θ, connected to a fixed input equal to -1.
The initial weights and threshold levels are set randomly as follows:
Consider a training set where inputs x1 and x2 are equal to 1 and desired output yd,5 is 0.
The actual outputs of neurons 3 and 4 in the hidden layer are calculated as
First, we calculate the error gradient for neuron 5 in the output layer:
Then we determine the weight corrections assuming that the learning rate
parameter, α = 0.1:
Next we calculate the error gradients for neurons 3 and 4 in the hidden layer:
At last, we update all weights and threshold levels in our network:
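The whole iteration above can be sketched in Python. The initial weights and thresholds are not reproduced in this text; the values below (w13 = 0.5, w14 = 0.9, w23 = 0.4, w24 = 1.0, w35 = -1.2, w45 = 1.1, θ3 = 0.8, θ4 = -0.1, θ5 = 0.3) are the ones used in Negnevitsky's well-known version of this example and are an assumption here:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# Assumed initial weights/thresholds (Negnevitsky's textbook values,
# not shown in this text).
w13, w14, w23, w24 = 0.5, 0.9, 0.4, 1.0   # input -> hidden weights
w35, w45 = -1.2, 1.1                      # hidden -> output weights
t3, t4, t5 = 0.8, -0.1, 0.3               # thresholds (fixed input of -1)
alpha = 0.1                               # learning rate (from the slides)

x1, x2, yd5 = 1, 1, 0                     # training pattern from the slides

# Forward pass
y3 = sigmoid(x1 * w13 + x2 * w23 - t3)    # ~ 0.5250
y4 = sigmoid(x1 * w14 + x2 * w24 - t4)    # ~ 0.8808
y5 = sigmoid(y3 * w35 + y4 * w45 - t5)    # ~ 0.5097

# Error gradient for output neuron 5
e5 = yd5 - y5
d5 = y5 * (1 - y5) * e5                   # ~ -0.1274

# Error gradients for hidden neurons 3 and 4
d3 = y3 * (1 - y3) * d5 * w35             # ~  0.0381
d4 = y4 * (1 - y4) * d5 * w45             # ~ -0.0147

# Weight and threshold corrections (a threshold is a weight on a -1 input)
w35 += alpha * y3 * d5;  w45 += alpha * y4 * d5;  t5 += alpha * (-1) * d5
w13 += alpha * x1 * d3;  w23 += alpha * x2 * d3;  t3 += alpha * (-1) * d3
w14 += alpha * x1 * d4;  w24 += alpha * x2 * d4;  t4 += alpha * (-1) * d4

print(round(y5, 4), round(d5, 4), round(w35, 4), round(t5, 4))
# 0.5097 -0.1274 -1.2067 0.3127
```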
The following set of final weights and threshold levels satisfied the chosen error
criterion:
The training process is repeated until the sum of squared errors is less than 0.001.
The network in the following figure is also trained to perform the Exclusive-OR
operation (Haykin, 2008).
The positions of the decision boundaries constructed by neurons 3 and 4 in the hidden layer
are shown in (a) and (b), respectively. Neuron 5 in the output layer performs a linear
combination of the decision boundaries formed by the two hidden neurons, as shown in
Figure (c).
The back-propagation training algorithm
What is a Bias in Neural Networks?
A bias value allows the activation function to be shifted to the left or right, which may be critical for successful learning.
Having a weight of -5 for w1 shifts the curve to the right, which allows us to have a network that outputs 0 when x is 2.
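A tiny sketch of the effect described above, assuming a single sigmoid neuron computing sigmoid(w0*x + w1) with w0 = 1 (w0 and the sigmoid choice are assumptions here, since the slide's figure is not reproduced):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

w0, w1 = 1.0, 0.0
print(round(sigmoid(w0 * 2 + w1), 3))   # no bias: sigmoid(2) ~ 0.881

w1 = -5.0                               # bias weight of -5 shifts the curve right
print(round(sigmoid(w0 * 2 + w1), 3))   # sigmoid(-3) ~ 0.047, close to 0
```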
Homework
Determine all the weights and thresholds for the following network using the backpropagation training algorithm, for one iteration only.
Given inputs 0.05 and 0.10, we want the neural network to output 0.01 and 0.99.
Suppose α = 0.1, with thresholds θh1 = 0.3, θh2 = 0.9, θO1 = 0.1, θO2 = -0.2 (each connected to a fixed input of -1).
Train ANN using MATLAB
https://youtu.be/2Z4959acjKs
(a 4-minute video)