PTSP-VI-Part-2
1. Huffman Code.
2. Lempel-Ziv Code.
3. Fano Code.
4. Shannon Code.
5. Arithmetic Code.
Huffman Code
ADVANTAGES:
• uniquely decodable code
• smallest average codeword length
DISADVANTAGES:
• large code tables add complexity
• sensitive to channel errors
Example: a source emits five symbols with the probabilities
s2 0.4
s1 0.2
s3 0.2
s0 0.1
s4 0.1
Solution A
At each stage the two entries with the lowest probabilities are combined into
one, and the combined probability is carried into the next stage. A binary 0 is
assigned to the upper member and a binary 1 to the lower member of each combined
pair; the codeword of a symbol is read off by tracing these bits back from the
last stage to the symbol.

Symbol sk   Stage I   Stage II   Stage III   Stage IV   Code
s2          0.4       0.4        0.4         0.6        00
s1          0.2       0.2        0.4         0.4        10
s3          0.2       0.2        0.2                    11
s0          0.1       0.2                               010
s4          0.1                                         011
Solution A (Cont'd)

Source       Probability   Codeword
Symbol sk    pk            ck
s0           0.1           010
s1           0.2           10
s2           0.4           00
s3           0.2           11
s4           0.1           011

H(S) = 2.12193 bits/symbol
L̄ = 0.4(2) + 0.2(2) + 0.2(2) + 0.1(3) + 0.1(3) = 2.2 bits/symbol
H(S) ≤ L̄ < H(S) + 1

THIS IS NOT THE ONLY SOLUTION!
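The construction above can be checked programmatically. Below is a minimal sketch in Python (the function name `huffman_lengths` is ours, not from the notes); it rebuilds the merge stages with a heap and reports the codeword lengths, the average length, and the source entropy:

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Return {symbol: codeword length} for a Huffman code on `probs`."""
    # Heap entries: (probability, tie-break counter, set of symbols in the group).
    heap = [(p, i, {s}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    lengths = {s: 0 for s in probs}
    count = len(heap)
    while len(heap) > 1:
        p1, _, g1 = heapq.heappop(heap)   # two lowest-probability entries
        p2, _, g2 = heapq.heappop(heap)
        for s in g1 | g2:                 # every symbol in a merged group gains one bit
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, count, g1 | g2))
        count += 1
    return lengths

probs = {"s0": 0.1, "s1": 0.2, "s2": 0.4, "s3": 0.2, "s4": 0.1}
lengths = huffman_lengths(probs)
avg_len = sum(probs[s] * lengths[s] for s in probs)
entropy = -sum(p * log2(p) for p in probs.values())
print(lengths, round(avg_len, 4), round(entropy, 5))
```

Any valid Huffman code for this source attains the same minimum average length, 2.2 bits/symbol, even though the individual codeword lengths can differ (as Solution B below shows).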
Another Solution B

Symbol sk   Stage I   Stage II   Stage III   Stage IV   Code
s2          0.4       0.4        0.4         0.6        1
s1          0.2       0.2        0.4         0.4        01
s3          0.2       0.2        0.2                    000
s0          0.1       0.2                               0010
s4          0.1                                         0011
Another Solution B (Cont'd)

Source       Probability   Codeword
Symbol sk    pk            ck
s0           0.1           0010
s1           0.2           01
s2           0.4           1
s3           0.2           000
s4           0.1           0011

H(S) = 2.12193 bits/symbol
L̄ = 0.4(1) + 0.2(2) + 0.2(3) + 0.1(4) + 0.1(4) = 2.2 bits/symbol
H(S) ≤ L̄ < H(S) + 1
What is the difference between the two solutions?
• They have the same average length.
• They differ in the variance of the codeword lengths:

  σ² = Σ_{k=0}^{K−1} pk (lk − L̄)²

• Solution A: σ² = 0.16
• Solution B: σ² = 1.36
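The two variances can be verified directly; a minimal sketch, with the codeword lengths read off Solutions A and B:

```python
# Probabilities of s0..s4 and the codeword lengths of the two solutions
probs = [0.1, 0.2, 0.4, 0.2, 0.1]
len_A = [3, 2, 2, 2, 3]   # Solution A: 010, 10, 00, 11, 011
len_B = [4, 2, 1, 3, 4]   # Solution B: 0010, 01, 1, 000, 0011

def avg_and_var(p, l):
    """Average codeword length L and variance of the lengths around L."""
    L = sum(pk * lk for pk, lk in zip(p, l))
    var = sum(pk * (lk - L) ** 2 for pk, lk in zip(p, l))
    return L, var

L_A, var_A = avg_and_var(probs, len_A)
L_B, var_B = avg_and_var(probs, len_B)
print(L_A, var_A)   # both averages are 2.2; the variances differ
print(L_B, var_B)
```

A smaller variance is usually preferred in practice, since it keeps the instantaneous output rate of the encoder closer to the average.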
Shannon – Fano Encoding:
First the messages are ranked in a table in descending order of probability. The table is then progressively
divided into subsections of probability as near equal as possible. Binary 0 is attributed to the upper subsection
and binary 1 to the lower subsection. This process continues until it is impossible to divide any further. The steps are:
1-Rank the messages in descending order of probability.
2-Divide the table into as near as possible two equal values of probability.
3-Allocate binary 0 to the upper section and binary 1 to the lower section.
4-Divide both the upper section and the lower section into two.
5-Allocate binary 0 to the top half of each section and binary 1 to the lower half.
6-Repeat steps (4) and (5) until it is not possible to go any further.
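The steps above can be sketched as a short recursive routine (a minimal sketch; the example probabilities reuse the Huffman source from earlier, and the split point is chosen to make the two halves' probabilities as near equal as possible):

```python
def shannon_fano(items):
    """items: list of (symbol, probability). Returns {symbol: codeword}."""
    codes = {s: "" for s, _ in items}

    def split(group):
        if len(group) <= 1:
            return
        total = sum(p for _, p in group)
        # Step 2: find the split giving halves of as near equal probability as possible
        best_i, best_diff, run = 1, float("inf"), 0.0
        for i in range(1, len(group)):
            run += group[i - 1][1]
            diff = abs(run - (total - run))
            if diff < best_diff:
                best_diff, best_i = diff, i
        upper, lower = group[:best_i], group[best_i:]
        for s, _ in upper:          # step 3/5: 0 to the upper section
            codes[s] += "0"
        for s, _ in lower:          # 1 to the lower section
            codes[s] += "1"
        split(upper)                # steps 4-6: recurse until singletons
        split(lower)

    # Step 1: rank in descending order of probability
    split(sorted(items, key=lambda sp: -sp[1]))
    return codes

codes = shannon_fano([("s2", 0.4), ("s1", 0.2), ("s3", 0.2), ("s0", 0.1), ("s4", 0.1)])
print(codes)
```

For this particular source the routine also achieves an average length of 2.2 bits/symbol, matching the Huffman code; in general Shannon–Fano can be slightly worse.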
For this example we can evaluate the efficiency of this system:
L = ? digits / symbol
H = ? bits / symbol
η = (H / L) *100% = ?
Example 2:
Mutual Information
Source symbols from some finite alphabet are mapped into some sequence of channel
symbols, which then produces the output sequence of the channel. The output
sequence is random but has a distribution that depends on the input sequence.
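The dependence between the input and output sequences is measured by the mutual information. The standard definition (stated here for completeness, since the notes do not write it out) is:

```latex
I(X;Y) = H(X) - H(X \mid Y)
       = \sum_{x}\sum_{y} p(x,y)\,\log_2 \frac{p(x,y)}{p(x)\,p(y)}
```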
Discrete Channel Capacity
• Each of the possible input sequences induces a probability distribution on the output sequences.
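For a discrete memoryless channel this leads to the standard definition of capacity as the mutual information maximized over all input distributions:

```latex
C = \max_{p(x)} I(X;Y) \qquad \text{(bits per channel use)}
```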
Shannon–Hartley
Theorem
R=Information Rate
C=Channel Capacity
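The theorem relates these two quantities for a band-limited AWGN channel: reliable transmission is possible when R ≤ C, where (standard statement, with bandwidth B, signal power S, and noise power N):

```latex
C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bits/s}, \qquad R \le C
```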
Trade-off between bandwidth and SNR
To increase the information rate, bandwidth and signal-to-noise ratio can be
traded against each other.
Definition of bandwidth:
• Bandwidth is the range of frequencies a channel can carry; it limits the
maximum rate at which data can be transferred.
Definition of signal-to-noise ratio:
• The signal-to-noise ratio (SNR) compares the power of the desired signal to
the level of the background noise.
Trade between bandwidth and SNR:
• For a given channel capacity, bandwidth and SNR work against each other:
lowering one must be compensated by raising the other.
• When the bandwidth is fixed and cannot be changed, the channel capacity can
only be increased through the SNR, with which it grows logarithmically.
• Conversely, within an allocated bandwidth, the SNR required is set by the
capacity the system must achieve.
• Widening the bandwidth also admits more noise power, so the SNR seen at the
receiver falls as the bandwidth increases.
• Hence bandwidth and SNR act jointly on the channel capacity: capacity rises
with both, and a deficit in one can be made up by the other.
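The trade-off can be made concrete with the Shannon–Hartley formula C = B·log2(1 + S/N); the sketch below (illustrative numbers, not from the notes) reaches the same capacity with a narrow band at high SNR and a wide band at low SNR:

```python
from math import log2

def capacity(bandwidth_hz, snr):
    """Shannon-Hartley capacity in bits/s; snr is a linear power ratio."""
    return bandwidth_hz * log2(1 + snr)

# Two ways to reach the same 30 kb/s capacity:
c_narrow = capacity(3_000, 1023)   # 3 kHz at SNR = 1023 (about 30 dB)
c_wide   = capacity(10_000, 7)     # 10 kHz at SNR = 7 (about 8.5 dB)
print(c_narrow, c_wide)
```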
Trade-off between bandwidth and SNR
The message rate is limited by the channel capacity

C = B log2(1 + S/N) bits/s.

Letting the bandwidth B grow without limit, with noise power spectral density
N0 (so that N = N0 B), the capacity approaches (S/N0) log2(e) ≈ 1.44 S/N0.
**This gives the maximum information transmission rate possible for a system of
given power but no bandwidth limitations.
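Numerically, if the noise has power spectral density N0 so that N = N0·B, the capacity B·log2(1 + S/(N0·B)) grows with B but saturates at (S/N0)·log2(e), the infinite-bandwidth limit described above. A minimal sketch with illustrative values:

```python
from math import e, log2

S, N0 = 1.0, 1e-3            # illustrative signal power (W) and noise PSD (W/Hz)
limit = (S / N0) * log2(e)   # infinite-bandwidth Shannon limit, ~1442.7 bits/s

for B in (1e3, 1e5, 1e7):    # widening the bandwidth
    C = B * log2(1 + S / (N0 * B))
    print(B, C)              # C rises toward, but never exceeds, the limit
```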