Artificial Neural Network - Hopfield Networks - Tutorialspoint
Hopfield neural network was invented by Dr. John J. Hopfield in 1982. It consists of a single layer
which contains one or more fully connected recurrent neurons. The Hopfield network is commonly
used for auto-association and optimization tasks.
Discrete Hopfield Network
A Hopfield network that operates in a discrete fashion, i.e. the input and output patterns are discrete vectors that can be either binary (0, 1) or bipolar (+1, -1) in nature. The network has symmetrical weights with no self-connections i.e., wij = wji and wii = 0.
Architecture
Following are some important points to keep in mind about the discrete Hopfield network −
This model consists of neurons with one inverting and one non-inverting output.
The output of each neuron should be an input to the other neurons, but not an input to itself.
Weight/connection strength is represented by wij.
Connections can be excitatory as well as inhibitory: a connection is excitatory if the output of a neuron is the same as its input, otherwise it is inhibitory.
Weights should be symmetrical, i.e. wij = wji
The output from Y1 going to Y2, Yi and Yn carries the weights w12, w1i and w1n respectively. Similarly, the other arcs carry their corresponding weights.
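As a quick illustration of these constraints, the following sketch (not part of the original tutorial; the weight values are made up) builds a small weight matrix in NumPy and checks the two conditions wij = wji and wii = 0.

```python
import numpy as np

# Hypothetical weight matrix for a 4-neuron discrete Hopfield network;
# the values are chosen purely for illustration.
W = np.array([
    [ 0.0,  1.0, -2.0,  0.5],
    [ 1.0,  0.0,  3.0, -1.0],
    [-2.0,  3.0,  0.0,  2.0],
    [ 0.5, -1.0,  2.0,  0.0],
])

# The two structural constraints of the architecture:
assert np.allclose(W, W.T), "weights must be symmetric (w_ij = w_ji)"
assert np.allclose(np.diag(W), 0.0), "no self-connections (w_ii = 0)"
```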
Training Algorithm
During training of the discrete Hopfield network, the weights are updated. The input vectors can be either binary or bipolar. In both cases, the weight updates are given by the following relations −
Case 1 − Binary input patterns
Here, s(p) = s1(p), s2(p), ..., si(p), ..., sn(p), and the weight matrix is given by

$$w_{ij} = \sum_{p=1}^{P} \, [2s_i(p) - 1]\,[2s_j(p) - 1] \qquad \text{for } i \neq j$$
Case 2 − Bipolar input patterns

Here, s(p) = s1(p), s2(p), ..., si(p), ..., sn(p), and the weight matrix is given by

$$w_{ij} = \sum_{p=1}^{P} \, [s_i(p)]\,[s_j(p)] \qquad \text{for } i \neq j$$
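A minimal NumPy sketch of both weight rules might look as follows; the function name hopfield_weights and the example patterns are assumptions made for illustration, not part of the tutorial.

```python
import numpy as np

def hopfield_weights(patterns, binary=True):
    """Hebbian weight matrix for a discrete Hopfield network.

    patterns : array of shape (P, n) holding the P stored patterns;
               entries in {0, 1} if binary=True, else in {-1, +1}.
    """
    S = np.asarray(patterns, dtype=float)
    if binary:
        S = 2.0 * S - 1.0        # Case 1: use 2*s_i(p) - 1 for binary patterns
    W = S.T @ S                  # Case 2 form: sum over p of s_i(p) * s_j(p)
    np.fill_diagonal(W, 0.0)     # the "for i != j" condition: w_ii = 0
    return W

# Two hypothetical 4-unit binary patterns.
W = hopfield_weights([[1, 0, 1, 0],
                      [1, 1, 0, 0]], binary=True)
print(W)
```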
Testing Algorithm
Step 1 − Initialize the weights, which are obtained from the training algorithm by using the Hebbian principle.
Step 2 − Perform steps 3-9, if the activations of the network have not converged.
Step 3 − For each input vector X, perform steps 4-8.
Step 4 − Make the initial activation of the network equal to the external input vector X as follows −
$$y_i = x_i \qquad \text{for } i = 1 \text{ to } n$$
Step 5 − For each unit Yi, perform steps 6-9.
Step 6 − Calculate the net input of the network as follows −

$$y_{in_i} = x_i + \sum_{j} y_j \, w_{ji}$$
Step 7 − Apply the activation as follows over the net input to calculate the output −
$$y_i = \begin{cases} 1 & \text{if } y_{in_i} > \theta_i \\ y_i & \text{if } y_{in_i} = \theta_i \\ 0 & \text{if } y_{in_i} < \theta_i \end{cases}$$

Here θi is the threshold.
Step 8 − Broadcast this output yi to all the other units.
Step 9 − Test the network for convergence.
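Putting steps 1-9 together, a recall pass might be sketched as below. It assumes binary (0/1) activations, a weight matrix W produced by the training rule above, and a hypothetical name (hopfield_recall) chosen only for this example.

```python
import numpy as np

def hopfield_recall(W, x, theta=None, max_sweeps=100, seed=0):
    """Asynchronous recall sketch for a discrete (binary) Hopfield network."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x, dtype=float)
    n = x.size
    theta = np.zeros(n) if theta is None else np.asarray(theta, dtype=float)
    y = x.copy()                                   # Step 4: y_i = x_i
    for _ in range(max_sweeps):                    # Step 2: repeat until converged
        changed = False
        for i in rng.permutation(n):               # Step 5: update units one at a time
            y_in = x[i] + y @ W[:, i]              # Step 6: y_in_i = x_i + sum_j y_j w_ji
            if y_in > theta[i]:                    # Step 7: threshold activation
                new = 1.0
            elif y_in < theta[i]:
                new = 0.0
            else:
                new = y[i]
            if new != y[i]:
                y[i] = new                         # Step 8: broadcast the new output
                changed = True
        if not changed:                            # Step 9: stop once no unit changes
            break
    return y

# e.g. y = hopfield_recall(W, [1, 0, 1, 0]) with W from the training sketch above
```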
Energy Function Evaluation
An energy function is defined as a function that is bounded and non-increasing with respect to the state of the system.
The energy function Ef, also called the Lyapunov function, determines the stability of the discrete Hopfield network and is characterized as follows −
$$E_f = -\frac{1}{2}\sum_{i=1}^{n}\sum_{j=1}^{n} y_i \, y_j \, w_{ij} \;-\; \sum_{i=1}^{n} x_i \, y_i \;+\; \sum_{i=1}^{n} \theta_i \, y_i$$
Condition − In a stable network, whenever the state of a node changes, the above energy function will decrease.
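A direct NumPy transcription of this energy function could look like the sketch below; the function name hopfield_energy is an assumption.

```python
import numpy as np

def hopfield_energy(W, y, x, theta):
    """Energy (Lyapunov) function E_f of a discrete Hopfield network state y."""
    y = np.asarray(y, dtype=float)
    return (-0.5 * (y @ W @ y)        # -1/2 * sum_i sum_j y_i y_j w_ij
            - np.dot(x, y)            # -     sum_i x_i y_i
            + np.dot(theta, y))       # +     sum_i theta_i y_i
```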
Suppose node i changes state from yi(k) to yi(k+1); then the energy change ΔEf is given by the following relation

$$\Delta E_f = E_f\!\left(y_i^{(k+1)}\right) - E_f\!\left(y_i^{(k)}\right)$$

$$= -\left(\sum_{j=1}^{n} w_{ij}\, y_j^{(k)} + x_i - \theta_i\right)\left(y_i^{(k+1)} - y_i^{(k)}\right)$$

$$= -(\text{net}_i)\,\Delta y_i$$

Here Δyi = yi(k+1) − yi(k)
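This ΔEf ≤ 0 property can be spot-checked numerically; in the sketch below the symmetric weights, inputs, thresholds, and initial state are all randomly generated assumptions used only to exercise the update rule.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 6
A = rng.standard_normal((n, n))
W = A + A.T                          # symmetric weights ...
np.fill_diagonal(W, 0.0)             # ... with w_ii = 0
x = rng.standard_normal(n)           # external inputs
theta = np.zeros(n)                  # thresholds

def energy(y):
    return -0.5 * (y @ W @ y) - x @ y + theta @ y

y = rng.integers(0, 2, n).astype(float)          # random binary state y^(k)
for i in range(n):
    y_in = x[i] + y @ W[:, i]                    # net input to unit i
    new = 1.0 if y_in > theta[i] else (0.0 if y_in < theta[i] else y[i])
    y_next = y.copy()
    y_next[i] = new                              # candidate state y^(k+1)
    delta = energy(y_next) - energy(y)           # Delta E_f for this single update
    assert delta <= 1e-9, "energy must not increase under the discrete update"
```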