Information Theory: C. E. Shannon, "A Mathematical Theory of Communication"
The Bell System Technical Journal, Vol. 27, pp. 379–423, July 1948
“If the rate of information is less than the channel capacity, then there exists a coding technique such that the information can be transmitted over the channel with an arbitrarily small probability of error, despite the presence of noise.”
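As a quick numeric companion to the theorem, the sketch below evaluates the standard capacity formula C = 1 − H2(p) for a binary symmetric channel (an assumed example channel, not one named on the slide); the theorem then guarantees that any code rate below this C is achievable with arbitrarily small error probability.

```python
import math

def h2(p):
    """Binary entropy function H2(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p):
    """Capacity C = 1 - H2(p) of a binary symmetric channel
    with crossover probability p."""
    return 1.0 - h2(p)

for p in (0.0, 0.01, 0.11, 0.5):
    print(f"crossover p = {p:.2f}: C = {bsc_capacity(p):.3f} bits/use")
```

At p = 0.5 the output is independent of the input and the capacity drops to zero, so no coding technique can help; at p = 0.11 roughly half a bit per channel use is still achievable.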
Figure: block diagram of a digital communication system. Transmitter chain: message source → source coder → error-control coder → line coder → pulse shaping/modulator; the resulting electrical signal passes through the channel to the receiver (example bit streams 1011, 110010 shown on the diagram).
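A minimal end-to-end simulation of the diagram above, assuming a binary symmetric channel and using a rate-1/3 repetition code as a stand-in for the error-control block (the slide does not name a specific code); the source/line-coding and modulation stages are abstracted away to bits.

```python
import random
random.seed(0)

def bsc(bits, p):
    """Binary symmetric channel: flip each bit independently with probability p."""
    return [b ^ (random.random() < p) for b in bits]

def encode_rep3(bits):
    """Rate-1/3 repetition code standing in for the error-control block."""
    return [b for b in bits for _ in range(3)]

def decode_rep3(bits):
    """Majority-vote decoding of the repetition code."""
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

msg = [random.randint(0, 1) for _ in range(10_000)]   # message source output
p = 0.05                                              # channel crossover probability
uncoded = bsc(msg, p)                                 # no error control
decoded = decode_rep3(bsc(encode_rep3(msg), p))       # with error control

ber = lambda a, b: sum(x != y for x, y in zip(a, b)) / len(a)
print("uncoded BER:", ber(msg, uncoded))   # about p = 0.05
print("rep-3 BER  :", ber(msg, decoded))   # about 3p^2(1-p) + p^3 ~ 0.007
```

Even this crude code cuts the bit error rate by roughly an order of magnitude at the cost of a threefold bandwidth expansion; Shannon's theorem promises far better codes exist at any rate below capacity.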
Figure: source entropy plotted against symbol probability p (axis ticks at 0, 0.5, 1); for a binary source the curve peaks at p = 0.5.
Entropy of an M-ary source
• There is a known mathematical inequality: ln V ≤ V − 1, with equality only at V = 1.
• Applying it term by term with V = 1/(M·p_i) shows that H ≤ log2 M, with equality exactly when every p_i = 1/M.
• Conclusion: “A source which generates equally likely symbols will have maximum avg. information,” i.e., H_max = log2 M bits/symbol.
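A quick numerical check of this conclusion, a sketch with M = 4 assumed for illustration: the uniform source attains H = log2 M, and randomly drawn distributions stay below it.

```python
import math
import random

def entropy(probs):
    """Average information H = -sum(p_i * log2 p_i), in bits/symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

M = 4
print("uniform source: H =", entropy([1 / M] * M), "= log2(M) =", math.log2(M))

random.seed(1)
for _ in range(3):
    raw = [random.random() for _ in range(M)]
    dist = [v / sum(raw) for v in raw]          # normalize to a valid pmf
    print(f"random pmf: H = {entropy(dist):.3f} <= log2(M) = {math.log2(M):.3f}")
```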
• Shannon said, “Von Neumann told me, ‘You should call it (average
information) entropy, for two reasons.
– In the first place your uncertainty function has been used in statistical
mechanics under that name, so it already has a name.
– In the second place, and more important, nobody knows what entropy
really is, so in a debate you will always have the advantage.’”
http://hyperphysics.phy-astr.gsu.edu/hbase/therm/entrop.html
Coding for a memoryless source
• Generally, the information source is not of the designer's choice; source coding is therefore applied so that the coded symbols appear (nearly) equally likely to the channel, maximizing the entropy of the transmitted stream (see the sketch below).
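As an illustration of this idea, the sketch below builds a Huffman code (one common choice; the slide does not prescribe a specific method) for an assumed dyadic source, then checks that the average codeword length meets the entropy and that the coded bit stream carries roughly equal numbers of 0s and 1s, i.e., it "appears equally likely" to the channel.

```python
import heapq
import math
import random

def huffman_code(probs):
    """Build a binary Huffman code for a {symbol: probability} dict."""
    # Heap entries: (probability, tie-breaker, {symbol: partial codeword})
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)   # merge the two least probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + w for s, w in c1.items()}
        merged.update({s: "1" + w for s, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, tie, merged))
        tie += 1
    return heap[0][2]

# Assumed example source with dyadic probabilities, so Huffman is exactly optimal.
probs = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
code = huffman_code(probs)
H = -sum(p * math.log2(p) for p in probs.values())
L = sum(p * len(code[s]) for s, p in probs.items())
print(code, f"| entropy H = {H:.3f}, average length L = {L:.3f} bits/symbol")

random.seed(0)
symbols = random.choices(list(probs), weights=probs.values(), k=100_000)
stream = "".join(code[s] for s in symbols)
print("fraction of 1s in coded stream:", round(stream.count("1") / len(stream), 3))
```

For this dyadic source the average length equals the entropy (1.75 bits/symbol) and the coded stream is very nearly half ones, so the skewed source has been transformed into an approximately maximum-entropy bit stream for the channel.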