Lecture "Channel Coding" Chapter 2: Channel Coding Principles
1. Transmission Model
2. Channel Models: BSC & QSC, BEC & QEC, AWGN
3. Decoding Principles: MAP, ML, symbol-by-symbol MAP
[Figure: transmission model. source → encoder → channel (+) → decoder → sink; u: information word, c: codeword, r: received word, ĉ: codeword estimate, û: information word estimate]
• The information symbols u0, . . . , uk−1 are from a finite alphabet A and are statistically independent and equiprobable.
• The encoder maps the k information symbols on a valid codeword of length n (one-to-one mapping).
• We consider only additive discrete memoryless channels (DMCs): P(r|c) = ∏_{i=0}^{n−1} P(r_i | c_i) (a minimal sketch follows after this list).
• The additive DMC summarizes filtering, modulation, demodulation, sampling and quantization (D/A and
A/D conversion).
• Due to the one-to-one mapping of u to c, once we have ĉ, we also know û at the sink.
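[Python sketch: the product form of the DMC likelihood, assuming a BSC with crossover probability p as the additive DMC; the function name dmc_likelihood is illustrative, not from any library.]

def dmc_likelihood(r, c, p):
    """P(r|c) for a memoryless BSC: each position is flipped independently
    with probability p, so the likelihood factors into a product."""
    prob = 1.0
    for ri, ci in zip(r, c):
        prob *= p if ri != ci else (1 - p)
    return prob

# Example: one of three positions differs at p = 0.1
print(dmc_likelihood((0, 1, 0), (0, 0, 0), p=0.1))  # 0.9 * 0.1 * 0.9 = 0.081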
Remark:
Vectors such as a = (a0 , a1 , . . . , an−1 ) denote row vectors.
Column vectors are denoted by aT .
=⇒ The MAP decoder chooses ĉ = argmax_c P(c|r); the ML decoder chooses ĉ = argmax_c P(r|c). By Bayes' rule, P(c|r) = P(r|c) · P(c)/P(r), so both rules coincide if P(c) = const. MAP and ML decoders therefore choose the most likely codeword and minimize the block error probability.
Drawback: large complexity (we have to go through all codewords) =⇒ not practical!
[Examples for MAP and ML decoders: see blackboard]
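[Python sketch: a brute-force ML decoder for the BSC, reusing dmc_likelihood from above, to make the complexity drawback concrete; the 3-fold repetition code is a hypothetical toy example.]

def ml_decode(r, code, p):
    """Brute-force ML decoding: evaluate P(r|c) for every codeword and
    return a maximizer. The cost grows linearly in |code|, i.e.,
    exponentially in k, hence 'not practical' for long codes."""
    return max(code, key=lambda c: dmc_likelihood(r, c, p))

code = [(0, 0, 0), (1, 1, 1)]             # toy 3-fold repetition code
print(ml_decode((0, 1, 0), code, p=0.1))  # -> (0, 0, 0)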
Decoding Principles: Symbol-by-Symbol MAP
• The MAP decoding rule can also be applied to each symbol instead of the whole word.
• To decide on the symbol ĉ_i, ∀i ∈ {0, . . . , n − 1}, we choose the most likely symbol by summing up the probabilities of all codewords with 0 or 1 at position i:
ĉ_i = argmax_{a ∈ {0,1}} Σ_{c : c_i = a} P(c|r), for i = 0, . . . , n − 1.
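[Python sketch: symbol-by-symbol MAP over the BSC, assuming equiprobable codewords so that P(c|r) is proportional to P(r|c); reuses dmc_likelihood from above.]

def symbolwise_map(r, code, p):
    """Symbol-by-symbol MAP: for each position i, sum P(r|c) over all
    codewords with c_i = 0 and with c_i = 1, then keep the larger sum."""
    c_hat = []
    for i in range(len(r)):
        score = {0: 0.0, 1: 0.0}
        for c in code:
            score[c[i]] += dmc_likelihood(r, c, p)
        c_hat.append(max(score, key=score.get))
    return tuple(c_hat)

Note that the word ĉ assembled position by position need not itself be a valid codeword.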
• Hamming distance: d(a, b) = wt(a − b), i.e., the number of positions in which a and b differ (wt(·) denotes the Hamming weight; see the sketch after this list).
• A code with minimum Hamming distance d is a set of words where any two codewords have Hamming
distance at least d.
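[Python sketch: Hamming distance and minimum distance of a code; helper names are illustrative.]

from itertools import combinations

def hamming_distance(a, b):
    """d(a, b): number of positions in which a and b differ."""
    return sum(ai != bi for ai, bi in zip(a, b))

def minimum_distance(code):
    """Smallest Hamming distance over all pairs of distinct codewords."""
    return min(hamming_distance(a, b) for a, b in combinations(code, 2))

print(minimum_distance([(0, 0, 0), (1, 1, 1)]))  # -> 3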
[Figure: codewords c(1), c(2), c(3) with pairwise Hamming distance at least d; a received word r is decoded to the closest codeword]
• A nearest codeword decoder decides for the "closest" codeword, i.e., the codeword within smallest Hamming distance of r.
• If more than one codeword is at the same distance, it randomly decides for one (a sketch follows below).
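[Python sketch: a nearest codeword decoder with the random tie-breaking described above; reuses hamming_distance.]

import random

def nearest_codeword(r, code):
    """Decide for the codeword closest to r in Hamming distance;
    if several codewords are equally close, pick one at random."""
    d_min = min(hamming_distance(r, c) for c in code)
    candidates = [c for c in code if hamming_distance(r, c) == d_min]
    return random.choice(candidates)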
The decoder outputs a word ĉ for which one of the following three cases holds.
• Correct decoding: The decoder output ĉ is the channel input c.
• Miscorrection: The decoder output ĉ is a valid codeword, but not the channel input. This case cannot be
detected at the receiver side.
(Its probability will be denoted by Perr )
• Decoding failure: The decoder output is not a valid codeword. The receiver therefore detects that the
output is erroneous.
(Its probability will be denoted by Pfail )
Practically, some errors beyond ⌊(d−1)/2⌋ might be correctable.
• Symbol error probability (codeword): P_sym(c_i) = Σ_{r : ĉ_i ≠ c_i} P(r|c) (evaluated exhaustively in the sketch below).
Remark
The symbol (bit) error probability depends on the explicit mapping between information words and codewords, whereas the block error probability does not.
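[Python sketch: evaluating P_sym(c_i) exhaustively for a toy code, reusing ml_decode and dmc_likelihood from above; ties are broken deterministically here, and the sum over all 2^n received words is feasible only for small n.]

from itertools import product

def symbol_error_prob(c, i, code, p):
    """P_sym(c_i): sum of P(r|c) over all received words r whose
    decoding result differs from c in position i."""
    total = 0.0
    for r in product((0, 1), repeat=len(c)):
        if ml_decode(r, code, p)[i] != c[i]:
            total += dmc_likelihood(r, c, p)
    return total

For the 3-fold repetition code this reproduces the majority-vote error probability 3p²(1 − p) + p³, e.g., 0.028 at p = 0.1.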