
NN – lecture, week 9 (23.04)

Hopfield Networks

- Also called autoassociative networks.


- Do not confuse this with an autoencoder: in an autoencoder, the input and the output are the same.

Autoencoder for signal compression?

- We take this part of the network and consider it the encoder/compressor.
- We feed in the first 1024 samples.
- The result will be 100 values, because we have 100 neurons in the middle (bottleneck) layer.
- The compression ratio is about 10, just like MP3 (see the sketch below).
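
A minimal sketch of such a compressing autoencoder, assuming a simple fully connected PyTorch model (the 1024/100 sizes are from the lecture; everything else is an illustrative assumption):

```python
import torch
import torch.nn as nn

# Autoencoder: 1024 input samples -> 100-neuron bottleneck -> 1024 outputs.
# Compression ratio ~ 1024 / 100, i.e. about 10.
class Autoencoder(nn.Module):
    def __init__(self, n_in=1024, n_hidden=100):
        super().__init__()
        self.encoder = nn.Linear(n_in, n_hidden)   # compressor
        self.decoder = nn.Linear(n_hidden, n_in)   # decompressor

    def forward(self, x):
        code = torch.tanh(self.encoder(x))   # 100-value compressed representation
        return self.decoder(code)

model = Autoencoder()
x = torch.randn(1, 1024)            # one frame of 1024 signal samples (dummy data)
loss = nn.MSELoss()(model(x), x)    # train the output to reproduce the input
```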

How is information stored in machine learning, and in neural networks in particular?

In sub-symbolic form: the information is not stored in a symbolic form.

What is a Hopfield network? It is called an autoassociative network because it associates patterns with the network directly, storing them in its weights.

Up until now, what kind of networks did we have?

For binary networks, there were only 2 output values:

- For the perceptron, which were the values? 0 for false and 1 for true => unipolar.
- There are also bipolar networks, with bipolar outputs: there, 1 is true and -1 is false.
ENIAC -> had a full room of vacuum tubes; each digit corresponded to a tube. It was used to recompute the firing tables for the US artillery.

A Hopfield network is a network of neurons that is fully connected and symmetric:

$w_{ij} = w_{ji}$ and $w_{ii} = 0$

How many different connections are there? For $n$ neurons:

$$\frac{n(n-1)}{2}$$
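
For the 4-node example used below, this gives:

$$\frac{4 \cdot 3}{2} = 6 \text{ distinct weights}$$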

The activation of a neuron is the sum of the activations of the other neurons, each multiplied by the corresponding weight.
The weights are computed with the Hebbian rule:

$$W = \frac{1}{n} \sum_{i=1}^{D} \xi_i^{T} \xi_i$$

where $\xi_i$ are the $D$ patterns to be stored (as row vectors) and $n$ is the number of neurons; the diagonal is then set to zero so that $w_{ii} = 0$.

Let's take a network with 4 nodes and try to reconstruct the inputs:

(0 1 1 0)

(1 0 0 1)

These are 2 patterns that we want this network to be able to remember, to reconstruct.

Compute the weights W using the Hebbian rule above (a sketch of the computation follows below).
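
A minimal numpy sketch of this computation, assuming the 0/1 patterns are first mapped to bipolar values (0 → -1, 1 → +1) before applying the Hebbian rule:

```python
import numpy as np

# The two patterns from the lecture, converted to bipolar form (0 -> -1).
patterns = np.array([[0, 1, 1, 0],
                     [1, 0, 0, 1]]) * 2 - 1   # shape (D, n) = (2, 4)

D, n = patterns.shape

# Hebbian rule: W = (1/n) * sum_i xi^T xi  (sum of outer products of the patterns).
W = sum(np.outer(xi, xi) for xi in patterns) / n
np.fill_diagonal(W, 0)   # enforce w_ii = 0

print(W)
```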

$$x(t+1) = \operatorname{sign}[W x(t)]$$

In the Hopfield network, the neurons can be updated asynchronously: they are activated randomly and independently of one another (see the recall sketch below).
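
A minimal sketch of such a random, independent recall, reusing the same patterns and weights as above; the noisy starting state is just an assumed example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Rebuild W as in the previous sketch.
patterns = np.array([[0, 1, 1, 0], [1, 0, 0, 1]]) * 2 - 1
n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)

x = np.array([1, 1, 1, -1])   # assumed noisy starting state (one bit flipped)

# Asynchronous recall: pick one neuron at random and update it from the others.
# Synchronous variant would be: x = np.where(W @ x >= 0, 1, -1)
for _ in range(20):
    i = rng.integers(n)
    x[i] = 1 if W[i] @ x >= 0 else -1

print(x)   # settles into one of the stored patterns
```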

You can compute the energy of a network state using this formula:

$$E(x) = -\frac{1}{2} x W x^{T} + \sum_{i=1}^{m} \theta_i x_i$$

where $\theta_i$ is the threshold of neuron $i$.
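
A small sketch of evaluating this energy for the 4-node example, assuming zero thresholds ($\theta_i = 0$) so only the quadratic term remains:

```python
import numpy as np

patterns = np.array([[0, 1, 1, 0], [1, 0, 0, 1]]) * 2 - 1
n = patterns.shape[1]
W = sum(np.outer(p, p) for p in patterns) / n
np.fill_diagonal(W, 0)

def energy(x, W, theta=None):
    """E(x) = -1/2 x W x^T + sum_i theta_i x_i (thresholds assumed 0 if not given)."""
    e = -0.5 * x @ W @ x
    if theta is not None:
        e += theta @ x
    return e

print(energy(np.array([-1, 1, 1, -1]), W))  # stored pattern: low energy
print(energy(np.array([1, 1, 1, 1]), W))    # non-stored state: higher energy
```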
