Information Theory and Source Coding
What is Information?
VLC 2
Letter  Codeword    Letter  Codeword
A       0           E       10
B       1           F       11
C       00          G       000
D       01          H       111
Source Coding (Example)
Message: “A BAD CAB” (spaces are not coded)
FLC (fixed-length code, 3 bits/symbol): 000 001 000 011 010 000 001 (21 bits)
VLC 1 (variable-length code): 00 010 00 100 011 00 010 (18 bits)
VLC 2: 0 1 0 01 00 0 1 (9 bits)
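The bit counts above can be checked with a short sketch. The FLC codewords are implied by the example encoding (A=000, B=001, ...); the VLC 1 codewords are not listed in the text and are inferred from its encoding of the message; the VLC 2 codewords come from the table above.

```python
# Bit counts for the "A BAD CAB" example (spaces are not coded).
flc  = {c: format(i, "03b") for i, c in enumerate("ABCDEFGH")}  # A=000, B=001, ...
vlc1 = {"A": "00", "B": "010", "C": "011", "D": "100"}          # inferred from the example
vlc2 = {"A": "0", "B": "1", "C": "00", "D": "01",
        "E": "10", "F": "11", "G": "000", "H": "111"}           # table above

def encode(msg, code):
    """Concatenate the codewords for each letter, skipping spaces."""
    return "".join(code[ch] for ch in msg if ch != " ")

msg = "A BAD CAB"
for name, code in [("FLC", flc), ("VLC 1", vlc1), ("VLC 2", vlc2)]:
    bits = encode(msg, code)
    print(name, bits, len(bits))  # 21, 18, and 9 bits respectively
```

Note that VLC 2, while shortest, is not uniquely decodable: the codeword for A (0) is a prefix of the codeword for C (00).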
Huffman Coding (a VLC for symbols that are not equally probable)
Steps for Huffman Coding
1) Arrange the source symbols in decreasing order of their probabilities.
2) Take the bottom two symbols and tie them together; add their two probabilities and write the sum on the combined node. Label the two branches '1' and '0'.
3) Treat this sum of probabilities as the probability of a new symbol, and repeat step 2.
4) Continue until only one probability is left (it should be 1). This completes the construction of the Huffman code tree.
5) To find the prefix codeword for any symbol, follow the branches from the final node back to that symbol; the branch labels read off along the way give the codeword.
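The steps above can be sketched in Python. A min-heap plays the role of the sorted list (its top holds the least probable nodes, i.e. the "bottom two" of step 2), and tracing from the root back to each symbol (step 5) is done by prefixing a branch label onto every codeword under a merged node. The probabilities in the usage example are illustrative, not taken from the text.

```python
import heapq

def huffman(probs):
    """Build Huffman codewords from a {symbol: probability} map (steps 1-5 above)."""
    # Step 1: put all symbols on a min-heap, least probable on top.
    # Each entry is (probability, tiebreak, list of symbols under this node).
    heap = [(p, i, [s]) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    codes = {s: "" for s in probs}
    # Steps 2-4: repeatedly merge the two least probable nodes until one remains.
    while len(heap) > 1:
        p0, _, syms0 = heapq.heappop(heap)
        p1, _, syms1 = heapq.heappop(heap)
        # Label the two branches '0' and '1'. Prefixing (rather than appending)
        # mirrors step 5: codewords are read from the final node back to the symbol.
        for s in syms0:
            codes[s] = "0" + codes[s]
        for s in syms1:
            codes[s] = "1" + codes[s]
        # Step 3: the sum of probabilities becomes a new combined node.
        heapq.heappush(heap, (p0 + p1, tiebreak, syms0 + syms1))
        tiebreak += 1
    return codes
```

For example, `huffman({"A": 0.4, "B": 0.3, "C": 0.2, "D": 0.1})` assigns shorter codewords to the more probable symbols: A receives a 1-bit codeword and the two rarest symbols receive 3-bit codewords, and no codeword is a prefix of another.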
CYCLIC CODES