lect03
[Figure: distributed lossless source coding setup — encoder 1 maps $X_1^n$ to an index $M_1$, encoder 2 maps $X_2^n$ to an index $M_2$, and a common decoder outputs the estimates $(\hat{X}_1^n, \hat{X}_2^n)$]
Bounds on the optimal rate region
[Figure: inner and outer bounds on the optimal rate region in the $(R_1, R_2)$ plane; the $R_1$ axis is marked at $H(X_1|X_2)$ and $H(X_1)$, the $R_2$ axis at $H(X_2|X_1)$ and $H(X_2)$, with the gap between the two bounds indicated by a question mark]
Slepian–Wolf theorem
[Figure: the Slepian–Wolf rate region in the $(R_1, R_2)$ plane; the $R_1$ axis is marked at $H(X_1|X_2)$ and $H(X_1)$, the $R_2$ axis at $H(X_2|X_1)$ and $H(X_2)$]
The optimal rate region $\mathcal{R}^*$ is the set of rate pairs $(R_1, R_2)$ such that
$R_1 \ge H(X_1|X_2)$,
$R_2 \ge H(X_2|X_1)$,
$R_1 + R_2 \ge H(X_1, X_2)$
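As a quick companion to the theorem (an added illustration, not from the original slides), the sketch below computes the three entropy thresholds from a joint pmf and checks whether a given rate pair satisfies the Slepian–Wolf inequalities; the joint pmf in the usage lines is an arbitrary illustrative choice (a doubly symmetric binary source with crossover probability 0.1).

```python
import numpy as np

def sw_region_contains(p_joint, R1, R2):
    """Check whether (R1, R2) lies in the Slepian-Wolf region of a 2-DMS.

    p_joint: 2-D array with p_joint[i, j] = P{X1 = i, X2 = j}.
    """
    p = np.asarray(p_joint, dtype=float)

    def H(q):  # entropy in bits, ignoring zero-probability entries
        q = q[q > 0]
        return -np.sum(q * np.log2(q))

    H12 = H(p.ravel())        # H(X1, X2)
    H1 = H(p.sum(axis=1))     # H(X1)
    H2 = H(p.sum(axis=0))     # H(X2)
    return (R1 >= H12 - H2          # R1 >= H(X1 | X2)
            and R2 >= H12 - H1      # R2 >= H(X2 | X1)
            and R1 + R2 >= H12)     # R1 + R2 >= H(X1, X2)

# Illustrative joint pmf: P{X1 != X2} = 0.1, uniform marginals.
p = np.array([[0.45, 0.05],
              [0.05, 0.45]])
print(sw_region_contains(p, R1=1.0, R2=0.5))   # True: just outside the corner point
print(sw_region_contains(p, R1=0.4, R2=0.4))   # False: violates individual and sum-rate constraints
```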
Example
∙ Let p = .
∙ Individual compression: $H(X_1) + H(X_2)$ bits/symbol-pair
∙ Slepian–Wolf coding: $H(X_1, X_2)$ bits/symbol-pair
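The numerical values in this example did not survive extraction. Assuming, purely for illustration, a doubly symmetric binary source with crossover probability $p$ (this specific model and the value $p = 0.1$ below are assumptions, not taken from the slide), the comparison can be reproduced as follows:

```python
from math import log2

def h2(p):
    """Binary entropy function in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

# Assumed model: X1 ~ Bern(1/2), X2 = X1 xor Z with Z ~ Bern(p) independent of X1.
p = 0.1  # illustrative value only

H1, H2 = 1.0, 1.0        # H(X1) = H(X2) = 1 bit
H12 = 1.0 + h2(p)        # H(X1, X2) = H(X1) + H(X2 | X1) = 1 + h2(p)

print(f"Individual compression: {H1 + H2:.3f} bits/symbol-pair")
print(f"Slepian-Wolf coding:    {H12:.3f} bits/symbol-pair")
```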
Lossless source coding via random binning
∙ Codebook generation:
  Randomly assign an index $m(x^n) \in [1 : 2^{nR}]$ to each sequence $x^n \in \mathcal{X}^n$
  The set of sequences with the same index $m$ forms a bin $\mathcal{B}(m)$, $m \in [1 : 2^{nR}]$
  Bin assignments are revealed to the encoder and decoder
∙ Encoding: upon observing $x^n \in \mathcal{B}(m)$, send the bin index $m$
∙ Decoding: find the unique typical sequence $\hat{x}^n \in \mathcal{B}(m)$ (a toy simulation of this scheme is sketched below)
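To make the binning scheme concrete, here is a small self-contained simulation (an added illustration, not from the slides) for an i.i.d. Bern(q) source: every sequence gets a bin index, the encoder sends that index, and the decoder looks for a unique weakly typical sequence in the bin. A salted hash stands in for the shared random bin assignment, and the blocklength is far too small for the asymptotics, so the error rates only indicate the trend.

```python
import itertools
import random
from math import log2

def typical_set(n, q, eps):
    """All length-n binary sequences whose empirical entropy is within eps of H(q)."""
    H = -q * log2(q) - (1 - q) * log2(1 - q)
    good_k = [k for k in range(n + 1)
              if abs(-(k * log2(q) + (n - k) * log2(1 - q)) / n - H) <= eps]
    seqs = []
    for k in good_k:
        for ones in itertools.combinations(range(n), k):
            x = [0] * n
            for i in ones:
                x[i] = 1
            seqs.append(tuple(x))
    return seqs

def bin_index(x, num_bins, salt):
    # A salted hash stands in for the shared random assignment of x^n to a bin B(m).
    return hash((salt, x)) % num_bins

def simulate(n=16, q=0.2, R=1.0, eps=0.25, trials=100, seed=1):
    rng = random.Random(seed)
    num_bins = 2 ** int(n * R)
    T = typical_set(n, q, eps)
    errors = 0
    for _ in range(trials):
        x = tuple(1 if rng.random() < q else 0 for _ in range(n))
        m = bin_index(x, num_bins, seed)                      # encoder: send the bin index
        candidates = [y for y in T                            # decoder: unique typical
                      if bin_index(y, num_bins, seed) == m]   # sequence in bin m
        if candidates != [x]:
            errors += 1
    return errors / trials

# For R above H(0.2) ~= 0.72 most residual errors come from atypical source
# sequences at this tiny blocklength; for R well below H(0.2) bin collisions
# dominate and decoding fails almost always.
print(simulate(R=1.0), simulate(R=0.5))
```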
Analysis of the probability of error
∙ Error events:
  $\mathcal{E}_1 = \{X^n \notin \mathcal{T}_\epsilon^{(n)}\}$,
  $\mathcal{E}_2 = \{\tilde{x}^n \in \mathcal{B}(M) \text{ for some } \tilde{x}^n \ne X^n,\ \tilde{x}^n \in \mathcal{T}_\epsilon^{(n)}\}$
  Thus, by the union of events bound, $\mathrm{P}(\mathcal{E}) \le \mathrm{P}(\mathcal{E}_1) + \mathrm{P}(\mathcal{E}_2)$
∙ $\mathrm{P}(\mathcal{E}_1) \to 0$ by the LLN
∙ Consider
\begin{align*}
\mathrm{P}(\mathcal{E}_2 \mid X^n \in \mathcal{B}(1))
&= \sum_{x^n} \mathrm{P}\{X^n = x^n \mid X^n \in \mathcal{B}(1)\} \\
&\quad \cdot \mathrm{P}\{\tilde{x}^n \in \mathcal{B}(1) \text{ for some } \tilde{x}^n \ne x^n,\ \tilde{x}^n \in \mathcal{T}_\epsilon^{(n)} \mid x^n \in \mathcal{B}(1),\ X^n = x^n\} \\
&\le \sum_{x^n} p(x^n) \sum_{\substack{\tilde{x}^n \in \mathcal{T}_\epsilon^{(n)} \\ \tilde{x}^n \ne x^n}} \mathrm{P}\{\tilde{x}^n \in \mathcal{B}(1) \mid x^n \in \mathcal{B}(1),\ X^n = x^n\} \\
&\le |\mathcal{T}_\epsilon^{(n)}| \cdot 2^{-nR} \\
&\le 2^{n(H(X)+\delta(\epsilon))} \, 2^{-nR},
\end{align*}
which tends to $0$ as $n \to \infty$ if $R > H(X) + \delta(\epsilon)$
Achievability via linear binning
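The body of this slide was lost in extraction. As a hedged sketch only (one standard way to realize linear binning is syndrome encoding with a shared random binary matrix; the construction and parameters below are illustrative assumptions, not the slide's own): each sequence $x^n$ is mapped to the $nR$-bit syndrome $A x^n$ over GF(2), and each syndrome value plays the role of a bin.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
n, k = 16, 14   # blocklength n and syndrome length k = nR bits (illustrative sizes)
A = rng.integers(0, 2, size=(k, n), dtype=np.uint8)   # shared random binary matrix

def bin_index(x):
    """Linear binning: the bin of x^n is its syndrome A x^n over GF(2)."""
    return tuple((A @ x) % 2)

def decode(syndrome, candidates):
    """Return the unique candidate sequence in bin `syndrome`, else None."""
    matches = [x for x in candidates if bin_index(x) == syndrome]
    return matches[0] if len(matches) == 1 else None

# Toy usage: all sequences of Hamming weight <= 3 serve as a crude stand-in
# for the typical set of a low-entropy binary source.
candidates = []
for w in range(4):
    for ones in itertools.combinations(range(n), w):
        x = np.zeros(n, dtype=np.uint8)
        x[list(ones)] = 1
        candidates.append(x)

x = np.zeros(n, dtype=np.uint8)
x[[2, 7]] = 1                      # a low-weight source sequence
x_hat = decode(bin_index(x), candidates)
# True unless another low-weight sequence happens to land in the same bin.
print(x_hat is not None and np.array_equal(x_hat, x))
```

In practice the brute-force search over the bin would be replaced by the decoder of a good linear code; it is exhaustive here only because the example is tiny.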
Time sharing
Proposition.
If $(R_1', R_2'), (R_1'', R_2'') \in \mathcal{R}^*$, then $(R_1, R_2) = (\alpha R_1' + \bar{\alpha} R_1'',\ \alpha R_2' + \bar{\alpha} R_2'') \in \mathcal{R}^*$ for $\alpha \in [0, 1]$, where $\bar{\alpha} = 1 - \alpha$
[Figure: the rate pairs $(R_1', R_2')$, $(R_1'', R_2'')$, and their convex combination $(R_1, R_2)$ shown in the $(R_1, R_2)$ plane]
∙ Proof: use the first code for the first $\alpha n$ symbols and the second code for the remaining $\bar{\alpha} n$ symbols. By the union of events bound, $P_e^{(n)} \le P_e^{(\alpha n)} + P_e^{(\bar{\alpha} n)} \to 0$
∙ Remarks:
  Time division is a special case of time sharing (between $(R_1, 0)$ and $(0, R_2)$)
  The rate (capacity) region of any source (channel) coding problem is convex
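As a worked instance of the proposition in this lecture's setting (added for concreteness): time sharing between the two Slepian–Wolf corner points sweeps out the whole sum-rate boundary, since for every $\alpha \in [0, 1]$
$$
\alpha\bigl(H(X_1) + H(X_2|X_1)\bigr) + \bar{\alpha}\bigl(H(X_1|X_2) + H(X_2)\bigr)
= \alpha H(X_1, X_2) + \bar{\alpha} H(X_1, X_2) = H(X_1, X_2),
$$
while the individual rates $\alpha H(X_1) + \bar{\alpha} H(X_1|X_2) \ge H(X_1|X_2)$ and $\alpha H(X_2|X_1) + \bar{\alpha} H(X_2) \ge H(X_2|X_1)$ still satisfy the single-rate constraints.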
Achievability proof of the S–W theorem (Cover 1975)
∙ We show that the corner point $(H(X_1), H(X_2|X_1))$ is achievable
∙ Achievability of the other corner point $(H(X_1|X_2), H(X_2))$ follows similarly
∙ The rest of the region is achieved using time sharing
[Figure: the Slepian–Wolf region with the corner points $(H(X_1), H(X_2|X_1))$ and $(H(X_1|X_2), H(X_2))$ marked on its boundary]
Achievability of $(H(X_1), H(X_2|X_1))$
∙ Codebook generation:
  Assign a distinct index $m_1 \in [1 : 2^{nR_1}]$ to each $x_1^n \in \mathcal{T}_\epsilon^{(n)}(X_1)$, and $m_1 = 1$ otherwise
  Randomly assign an index $m_2(x_2^n) \in [1 : 2^{nR_2}]$ to each $x_2^n \in \mathcal{X}_2^n$
  The sequences with the same index $m_2$ form a bin $\mathcal{B}(m_2)$
∙ Encoding: encoder 1 sends the index $m_1(x_1^n)$; encoder 2 sends the bin index $m_2(x_2^n)$
[Figure: encoder 1 indexes the typical $x_1^n$ sequences by $m_1 \in [1 : 2^{nR_1}]$, while the $x_2^n$ sequences are partitioned into bins $\mathcal{B}(1), \ldots, \mathcal{B}(2^{nR_2})$]
Achievability of $(H(X_1), H(X_2|X_1))$ (continued)
∙ Decoding: the sources are recovered successively
  Declare $\hat{x}_1^n = x_1^n(m_1)$ for the unique $x_1^n(m_1) \in \mathcal{T}_\epsilon^{(n)}(X_1)$
  Find the unique $\hat{x}_2^n \in \mathcal{B}(m_2) \cap \mathcal{T}_\epsilon^{(n)}(X_2 | \hat{x}_1^n)$
  If there is none or more than one, the decoder declares an error (a toy simulation of this scheme is sketched below)
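To see the successive-recovery scheme end to end, here is a small simulation (an added illustration with simplified choices, not the slide's own construction): the pair is a doubly symmetric binary source, encoder 1 sends $x_1^n$ uncompressed (rate $H(X_1) = 1$), encoder 2 sends only a bin index obtained from a salted hash, and the decoder recovers $x_2^n$ from the bin and the side information $x_1^n$ by searching over sequences that are jointly typical with $x_1^n$ (here: within a small Hamming distance). All parameter values are illustrative.

```python
import itertools
import random

n = 20          # blocklength (tiny, for illustration)
p = 0.1         # P{X2 != X1}: X2 = X1 xor Z with Z ~ Bern(p)
R2 = 0.85       # bin-index rate of encoder 2, above H(p) ~= 0.47
w_max = 4       # conditional typicality: accept x2 with d(x1, x2) <= w_max
num_bins = 2 ** int(n * R2)
seed = 0
rng = random.Random(seed)

def bin_index(x2):
    # A salted hash stands in for the shared random bin assignment of x2^n.
    return hash((seed, x2)) % num_bins

def decode(x1, m2):
    """Recover x2^n from the side information x1^n and the bin index m2."""
    matches = []
    for w in range(w_max + 1):
        for flips in itertools.combinations(range(n), w):
            cand = list(x1)
            for i in flips:
                cand[i] ^= 1
            cand = tuple(cand)
            if bin_index(cand) == m2:
                matches.append(cand)
    return matches[0] if len(matches) == 1 else None

trials, errors = 200, 0
for _ in range(trials):
    x1 = tuple(rng.randrange(2) for _ in range(n))                # X1 ~ Bern(1/2)
    x2 = tuple(b ^ (1 if rng.random() < p else 0) for b in x1)    # X2 = X1 xor Z
    # Encoder 1 sends x1^n itself; encoder 2 sends only the bin index m2.
    m2 = bin_index(x2)
    if decode(x1, m2) != x2:
        errors += 1

print(f"(R1, R2) = (1.0, {R2}); empirical error rate = {errors / trials:.3f}")
```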
∙ Let $M_1$ and $M_2$ denote the random indices for $X_1^n$ and $X_2^n$; $M_2$ is the bin index of $X_2^n$
∙ Error events:
  $\mathcal{E}_1 = \{(X_1^n, X_2^n) \notin \mathcal{T}_\epsilon^{(n)}\}$,
  $\mathcal{E}_2 = \{\tilde{x}_2^n \in \mathcal{B}(M_2) \text{ for some } \tilde{x}_2^n \ne X_2^n,\ (X_1^n, \tilde{x}_2^n) \in \mathcal{T}_\epsilon^{(n)}\}$
  Then, $\mathrm{P}(\mathcal{E}) \le \mathrm{P}(\mathcal{E}_1) + \mathrm{P}(\mathcal{E}_2)$
∙ $\mathrm{P}(\mathcal{E}_1) \to 0$ by the LLN
Analysis of the probability of error
∙ By symmetry,
\begin{align*}
\mathrm{P}(\mathcal{E}_2) &= \mathrm{P}(\mathcal{E}_2 \mid X_2^n \in \mathcal{B}(1)) \\
&= \sum_{(x_1^n, x_2^n)} \mathrm{P}\{(X_1^n, X_2^n) = (x_1^n, x_2^n) \mid X_2^n \in \mathcal{B}(1)\} \\
&\quad \cdot \mathrm{P}\{\tilde{x}_2^n \in \mathcal{B}(1) \text{ for some } \tilde{x}_2^n \ne x_2^n,\ (x_1^n, \tilde{x}_2^n) \in \mathcal{T}_\epsilon^{(n)} \mid x_2^n \in \mathcal{B}(1),\ (X_1^n, X_2^n) = (x_1^n, x_2^n)\} \\
&\le \sum_{(x_1^n, x_2^n)} p(x_1^n, x_2^n) \sum_{\substack{\tilde{x}_2^n \in \mathcal{T}_\epsilon^{(n)}(X_2 | x_1^n) \\ \tilde{x}_2^n \ne x_2^n}} \mathrm{P}\{\tilde{x}_2^n \in \mathcal{B}(1)\} \\
&\le 2^{n(H(X_2|X_1)+\delta(\epsilon))} \cdot 2^{-nR_2},
\end{align*}
which tends to $0$ as $n \to \infty$ if $R_2 > H(X_2|X_1) + \delta(\epsilon)$
Theorem.
The optimal rate region $\mathcal{R}^*(X_1, \ldots, X_k)$ for the $k$-DMS $(X_1, \ldots, X_k)$ is the set of $(R_1, \ldots, R_k)$ such that
$$\sum_{j \in \mathcal{J}} R_j \ge H\bigl(X(\mathcal{J}) \,\big|\, X(\mathcal{J}^c)\bigr) \quad \text{for every } \mathcal{J} \subseteq [1:k],$$
where $X(\mathcal{J}) = (X_j : j \in \mathcal{J})$
Summary
∙ Slepian–Wolf theorem
∙ Lossless source coding via random binning
∙ Time sharing
References
Cover, T. M. (1975). A proof of the data compression theorem of Slepian and Wolf for ergodic sources. IEEE Trans. Inf. Theory, 21(2), 226–228.
Slepian, D. and Wolf, J. K. (1973). Noiseless coding of correlated information sources. IEEE Trans. Inf. Theory, 19(4), 471–480.