Joint and Conditional Entropy
$$
H(C) = \sum_{C_{i,j} \in C} P_{i,j} \log_2\!\frac{1}{P_{i,j}}
$$

$$
H(C) = \sum_{i=0}^{M_A-1} \sum_{j=0}^{M_B-1} P_{i,j} \log_2\!\frac{1}{P_{i,j}} \qquad \text{(A)}
$$
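As a quick numerical check of equation (A), here is a minimal Python sketch; the joint probability matrix below is made up purely for illustration.

```python
import numpy as np

def joint_entropy(P):
    """Joint entropy H(C) in bits, from a joint probability matrix P[i, j]."""
    P = np.asarray(P, dtype=float)
    nz = P[P > 0]                      # skip zero-probability pairs (0 * log 0 = 0)
    return float(np.sum(nz * np.log2(1.0 / nz)))

# Hypothetical joint distribution, for illustration only
P = [[0.25, 0.10],
     [0.15, 0.50]]
print(joint_entropy(P))                # about 1.74 bits
```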
Example #2: Source B is derived from source A by the mapping

$$
b_j =
\begin{cases}
0 & \text{if } a_i = a_0 \text{ or } a_i = a_3\\[2pt]
1 & \text{if } a_i = a_1 \text{ or } a_i = a_2
\end{cases}
$$
What are H(A), H(B), and H(A,B)?
Solution of Example #2
Since we know that
$$
H(A) = \sum_{m=0}^{M-1} P_m \log_2\!\frac{1}{P_m}
$$
Do it yourself.
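The symbol probabilities of source A in Example #2 are not shown in this excerpt, so the sketch below plugs in a made-up distribution P_A just to illustrate the computation. Because B is a deterministic function of A, H(A,B) = H(A).

```python
import numpy as np

def H(p):
    """Entropy in bits of a probability vector, ignoring zero entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(np.sum(p * np.log2(1.0 / p)))

P_A = np.array([0.4, 0.3, 0.2, 0.1])   # hypothetical P(a0)..P(a3), not from the slides
b_of_a = [0, 1, 1, 0]                  # b = 0 for a0/a3, b = 1 for a1/a2

P_B = np.array([P_A[0] + P_A[3], P_A[1] + P_A[2]])

# Joint probabilities P(a, b): each a maps to exactly one b
P_AB = np.zeros((4, 2))
for i, b in enumerate(b_of_a):
    P_AB[i, b] = P_A[i]

print(H(P_A))           # H(A)
print(H(P_B))           # H(B)
print(H(P_AB.ravel()))  # H(A,B), equal to H(A) since B is determined by A
```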
Source Coding
The entropy of the source is the average information
carried per symbol.
The encoding process can be denoted by a mapping

$$
C : A \to B
$$
Since the encoded sequence must eventually be decoded, the function C must be invertible.
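A minimal sketch of what invertibility means in practice; the prefix-free codewords below are assumptions for illustration, not taken from the slides.

```python
# An invertible (here: prefix-free) code for a 4-ary source
code = {"a0": "0", "a1": "10", "a2": "110", "a3": "111"}
decode_table = {cw: sym for sym, cw in code.items()}   # inverse mapping

def encode(symbols):
    return "".join(code[s] for s in symbols)

def decode(bits):
    symbols, buffer = [], ""
    for bit in bits:
        buffer += bit
        if buffer in decode_table:      # prefix-free, so a match is unambiguous
            symbols.append(decode_table[buffer])
            buffer = ""
    return symbols

msg = ["a1", "a0", "a3", "a2"]
assert decode(encode(msg)) == msg       # C is invertible on symbol sequences
```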
The set of all possible pairs ⟨a_i, a_j⟩ is called the Cartesian product of set A with itself and is denoted by A × A.
The encoding process then becomes a function of two variables and can be denoted by a mapping

$$
C : A \times A \to B
$$

or by a function

$$
C(a_i, a_j) = b
$$
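A small sketch of encoding pairs; the fixed-length codeword assignment below is an assumption made only to show the mapping C : A × A → B as a lookup over the Cartesian product.

```python
from itertools import product

# Hypothetical fixed-length binary code over pairs of symbols from A x A
A = ["a0", "a1", "a2", "a3"]
pair_code = {pair: format(k, "04b")           # 16 pairs -> 4-bit codewords
             for k, pair in enumerate(product(A, A))}

print(pair_code[("a0", "a1")])   # '0001'
print(pair_code[("a3", "a2")])   # '1110'
```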
Example 4
Let A be a 4-ary memoryless source with symbol probabilities

$$
P_A = \{0.5,\ 0.3,\ 0.15,\ 0.05\}
$$
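The rest of Example 4's statement is not included in this excerpt; as a starting point, the entropy of this source can be computed directly from the formula above.

```python
import numpy as np

P_A = np.array([0.5, 0.3, 0.15, 0.05])
H_A = float(np.sum(P_A * np.log2(1.0 / P_A)))
print(H_A)   # about 1.648 bits per symbol
```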