Hopfield Networks and Boltzmann Machines - Part 2
Boltzmann machines are also called stochastic Hopfield networks with hidden units. The hidden units enable the learning of higher-order relationships and abstractions. As in Hopfield networks, the connections between units are symmetric.
Finding an optimal state involves relaxation: letting the network settle into a
configuration that maximizes a goodness function. This is done by annealing.
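As a small illustration of what a "goodness" function and a temperature-dependent unit update can look like, here is a minimal sketch assuming binary states s, symmetric zero-diagonal weights W, and biases b (all names and conventions are illustrative, not taken from the slides):

```python
import numpy as np

def goodness(s, W, b):
    # Goodness of a binary configuration s (higher is better);
    # the network's energy is simply the negative of this value.
    return 0.5 * s @ W @ s + b @ s

def stochastic_update(s, W, b, i, T, rng):
    # Energy gap for unit i: how much the goodness rises if s[i] is set to 1.
    # (Assumes W has a zero diagonal, so the self-term contributes nothing.)
    gap = W[i] @ s + b[i]
    # Boltzmann rule: switch the unit on with a sigmoid probability
    # that becomes sharper (more deterministic) as T is lowered.
    p_on = 1.0 / (1.0 + np.exp(-gap / T))
    s[i] = 1 if rng.random() < p_on else 0
    return s
```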
Annealing in metals
Step 1: Initialize. Start with a random initial configuration and a very high "temperature".
Step 2: Move. Make a small random change to the current configuration (e.g., flip the state of one unit).
Step 3: Calculate score. Calculate the score (ΔE in our case) due to the move made.
Step 4: Choose. Depending on the score (ΔE), accept or reject the move. The probability of acceptance depends on the current "temperature".
Step 5: Update and repeat. Lower the temperature and repeat from Step 2.
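Put together, the steps amount to the loop sketched below, assuming the energy E = -(1/2) s·W·s - b·s of a network with symmetric, zero-diagonal weights W and biases b; the starting temperature, geometric cooling factor, and one-flip move proposal are illustrative choices, not prescribed here.

```python
import numpy as np

def simulated_annealing(W, b, T=10.0, cooling=0.95, T_min=0.01, seed=0):
    rng = np.random.default_rng(seed)
    s = rng.integers(0, 2, len(b))          # Step 1: random configuration, high temperature
    while T > T_min:
        for _ in range(len(s)):             # one sweep of moves per temperature
            i = rng.integers(len(s))        # Step 2: propose a move (flip one unit)
            delta_e = (2 * s[i] - 1) * (W[i] @ s + b[i])  # Step 3: Delta E caused by the flip
            # Step 4: always accept downhill moves; accept uphill moves with a
            # probability exp(-Delta E / T) that shrinks as T falls.
            if delta_e <= 0 or rng.random() < np.exp(-delta_e / T):
                s[i] = 1 - s[i]
        T *= cooling                        # Step 5: lower the temperature and repeat
    return s
```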
[Figure: energy landscape in which a greedy algorithm gets stuck at a locally optimal solution.]
- a pair of nodes, one from each of the two groups of units (commonly referred to as the "visible" and "hidden" units respectively), may have a symmetric connection between them;
- there are no connections between nodes within a group.
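Because of this bipartite restriction (the layout usually called a restricted Boltzmann machine), all weights fit in a single visible-by-hidden matrix, and each group can be sampled in one block given the other. A minimal sketch; the layer sizes, random initialization, and variable names are assumptions made for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 6, 3                       # illustrative sizes
W = rng.normal(0.0, 0.1, (n_visible, n_hidden))  # symmetric visible-hidden weights only
b_v = np.zeros(n_visible)                        # visible biases
b_h = np.zeros(n_hidden)                         # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hidden(v):
    # With no hidden-hidden connections, the hidden units are conditionally
    # independent given the visible units and can be sampled in one shot.
    return (rng.random(n_hidden) < sigmoid(v @ W + b_h)).astype(int)

def sample_visible(h):
    # Likewise, there are no visible-visible connections.
    return (rng.random(n_visible) < sigmoid(W @ h + b_v)).astype(int)

v = rng.integers(0, 2, n_visible)   # random visible configuration
h = sample_hidden(v)                # one half of a block-Gibbs sweep
v = sample_visible(h)               # the other half
```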
Consider two associations, A1:B1 and A2:B2. These are transformed into the bipolar forms X1, Y1 and X2, Y2:
A1 = (1, 0, 1, 0, 1, 0), B1 = (1, 1, 0, 0)  =>  X1 = (1, -1, 1, -1, 1, -1), Y1 = (1, 1, -1, -1)
A2 = (1, 1, 1, 0, 0, 0), B2 = (1, 0, 1, 0)  =>  X2 = (1, 1, 1, -1, -1, -1), Y2 = (1, -1, 1, -1)
A matrix M can be calculated as M = X^T Y, where X^T is the transpose of X. Here that means summing the outer products of the bipolar pairs, M = X1^T Y1 + X2^T Y2:

X1^T Y1           X2^T Y2            M
 1  1 -1 -1        1 -1  1 -1        2  0  0 -2
-1 -1  1  1        1 -1  1 -1        0 -2  2  0
 1  1 -1 -1   +    1 -1  1 -1   =    2  0  0 -2
-1 -1  1  1       -1  1 -1  1       -2  0  0  2
 1  1 -1 -1       -1  1 -1  1        0  2 -2  0
-1 -1  1  1       -1  1 -1  1       -2  0  0  2
Applying A1 = (1, 0, 1, 0, 1, 0) to M gives A1 M = (4, 2, -2, -4). If (4, 2, -2, -4) is run through a threshold => (1, 1, 0, 0), which is B1: the stored association is recalled.
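The same construction and recall can be checked with a few lines of numpy (a sketch; the threshold maps positive sums to 1 and the rest to 0, reproducing the recall of B1 above):

```python
import numpy as np

X = np.array([[1, -1, 1, -1, 1, -1],    # X1
              [1,  1, 1, -1, -1, -1]])  # X2
Y = np.array([[1,  1, -1, -1],          # Y1
              [1, -1,  1, -1]])         # Y2

M = X.T @ Y                 # M = X1'Y1 + X2'Y2, a 6x4 correlation matrix

A1 = np.array([1, 0, 1, 0, 1, 0])
net = A1 @ M                # -> [ 4  2 -2 -4]
B1 = (net > 0).astype(int)  # threshold -> [1 1 0 0], which is B1
print(net, B1)
```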
Convolutional Networks