Lecture 9: Viterbi Decoding of Convolutional Codes
\[
\sum_{i=1}^{p} (u_i - v_i)^2, \qquad (9.2)
\]
where $u = u_1, u_2, \ldots, u_p$ are the expected $p$ parity bits (each a 0 or 1). Figure 9-5 shows the soft decision branch metric for $p = 2$ when $u$ is 00.
With soft decision decoding, the decoding algorithm is identical to the one previously described for hard decision decoding, except that the branch metric is no longer an integer Hamming distance but a nonnegative real number (if the voltages are all between 0 and 1, then each squared term is between 0 and 1, so the branch metric lies between 0 and $p$).
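As a concrete illustration, here is a minimal Python sketch of this branch metric; the function name and the list-of-floats representation of the received voltages are assumptions for the example, not notation from the lecture.

```python
def soft_branch_metric(expected_bits, received_volts):
    """Soft decision branch metric of Equation (9.2): the sum of squared
    differences between the expected parity bits (each 0 or 1) and the
    received voltage samples."""
    return sum((u - v) ** 2 for u, v in zip(expected_bits, received_volts))

# Expected parity bits 00, received voltages 0.1 and 0.2:
# the metric is 0.1**2 + 0.2**2 = 0.05.
print(soft_branch_metric([0, 0], [0.1, 0.2]))
```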
It turns out that this soft decision metric is closely related to the probability of the decoding being correct when the noise is Gaussian. To keep the math simple, let's look at the simple case of one parity bit (the more general case is a straightforward extension). Given a soft decision metric of $x_i$ for the $i$th bit, the probability density that $x_i$ corresponds to the expected parity is the Gaussian
\[
f(x_i) = \frac{e^{-x_i^2/2\sigma^2}}{\sqrt{2\pi\sigma^2}}.
\]
The logarithm of this quantity, $\log f(x_i)$, is proportional to $-x_i^2$, up to an additive constant.
The probability density of the entire decoding path being correct, assuming that the noise process is independent (and identically distributed), is the product of the individual PDFs, $\prod_i f(x_i)$. Observe, however, that
\[
\log \prod_i f(x_i) = \sum_i \log f(x_i) \propto -\sum_i x_i^2.
\]
The branch metric of Equation (9.2) leads to a path metric that is directly proportional to $-\log \prod_i f(x_i)$. Minimizing this quantity is exactly the same as maximizing the PDF of a correct decoding! Hence, for soft decision decoding with the metric described above, the path metric is proportional to the negative log-likelihood of the chosen path being correct when the noise is Gaussian.
This connection with the logarithm of the probability is the reason why we chose the
sum of squares as the branch metric in Equation (9.2). A different noise distribution (other than Gaussian) may entail a different soft decision branch metric to obtain an analogous connection to the PDF of a correct decoding.
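Spelling out the logarithm with its constants makes this explicit (a standard expansion of the Gaussian above; here $\sigma^2$ is the noise variance and $C$ collects the terms that do not depend on the received voltages):
\[
\log \prod_i f(x_i) = \sum_i \left( -\frac{x_i^2}{2\sigma^2} - \frac{1}{2}\log(2\pi\sigma^2) \right) = -\frac{1}{2\sigma^2} \sum_i x_i^2 + C.
\]
Since $\sigma^2$ and $C$ are the same for every candidate path, ranking paths by $\sum_i x_i^2$ is the same as ranking them by likelihood.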
9.4 Summary
From its relatively modest, though hugely impactful, beginnings as a method to decode convolutional codes, Viterbi decoding has become one of the most widely used algorithms in a wide range of fields and engineering systems. Modern disk drives (which use PRML technology to speed up accesses), speech recognition systems, natural language systems, and a variety of communication networks use this scheme or its variants.
In fact, a more modern view of the soft decision decoding technique described in this
lecture is to think of the procedure as finding the most likely set of traversed states in
a Hidden Markov Model (HMM). Some underlying phenomenon is modeled as a Markov
state machine with probabilistic transitions between its states; we see noisy observations
from each state, and would like to piece together the observations to determine the most
likely sequence of states traversed. It turns out that the Viterbi decoder is an excellent
starting point to solve this class of problems (and sometimes the complete solution).
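To make that concrete, below is a minimal Python sketch of the Viterbi algorithm in its general HMM form. It works with log-probabilities so that the product of PDFs becomes a sum, mirroring the path metric discussed above; the function name, the dictionary-based model representation, and the toy two-state example are assumptions for illustration, and all probabilities are assumed nonzero.

```python
import math

def viterbi(states, start_p, trans_p, emit_p, observations):
    """Return the most likely sequence of hidden states given the
    observations. start_p[s], trans_p[s][t], and emit_p[s][o] are
    (nonzero) probabilities; we sum logs instead of multiplying."""
    # path_metric[s]: best log-probability of any path ending in state s.
    path_metric = {s: math.log(start_p[s]) + math.log(emit_p[s][observations[0]])
                   for s in states}
    backpointers = []  # backpointers[i][s]: best predecessor of s at step i + 1
    for obs in observations[1:]:
        prev, new_metric = {}, {}
        for s in states:
            # Pick the best incoming arc -- the "remember the arc" step
            # described in the caption of Figure 9-4.
            best = max(states, key=lambda r: path_metric[r] + math.log(trans_p[r][s]))
            prev[s] = best
            new_metric[s] = (path_metric[best] + math.log(trans_p[best][s])
                             + math.log(emit_p[s][obs]))
        backpointers.append(prev)
        path_metric = new_metric
    # Traceback: start from the best final state and work backwards.
    state = max(states, key=lambda s: path_metric[s])
    path = [state]
    for prev in reversed(backpointers):
        state = prev[state]
        path.append(state)
    return list(reversed(path))

# Toy two-state HMM (hypothetical numbers, purely for illustration).
states = ("rain", "sun")
start_p = {"rain": 0.5, "sun": 0.5}
trans_p = {"rain": {"rain": 0.7, "sun": 0.3}, "sun": {"rain": 0.3, "sun": 0.7}}
emit_p = {"rain": {"umbrella": 0.9, "none": 0.1},
          "sun": {"umbrella": 0.2, "none": 0.8}}
print(viterbi(states, start_p, trans_p, emit_p, ["umbrella", "umbrella", "none"]))
```

In the convolutional decoding setting, the states correspond to the $2^{k-1}$ encoder states and the per-observation log terms play the role of (negated) branch metrics.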
On the other hand, despite its undeniable success, Viterbi decoding isn't the only way to decode convolutional codes. For one thing, its computational complexity is exponential in the constraint length, $k$, because it requires each of the $2^{k-1}$ states to be enumerated. When $k$ is large, one may use other decoding methods such as BCJR or Fano's sequential decoding scheme, for instance.
Convolutional codes themselves are very popular over both wired and wireless links.
They are often used as the inner code with an outer block error correcting code, but they
may also be used with just an outer error detection code.
We are now at the end of the four lectures on error detection and error correcting codes.
Figure 9-6 summarizes the different pieces of what we have learned so far in the course
and how they fit together. These pieces are part of the physical layer of a communication
network. We will get back to the different layers of a network in a few lectures, after
studying another physical layer topic: modulation (in the context of frequency division
multiplexing).
Figure 9-3: The Viterbi decoder in action. This picture shows 4 time steps. The bottom-most picture is the
same as the one just before it, but with only the survivor paths shown.
Figure 9-4: The Viterbi decoder in action (continued from Figure 9-3). The decoded message is shown. To produce this message, start from the final state with the smallest path metric and work backwards, and then reverse the bits. At each state during the forward pass, it is important to remember the arc that got us to this state, so that the backward pass can be done properly.
!"!#$"! $"!#$"!
&
'!
#&
'$
!"# %&'()* +,&-
&./&*'&0 /1()'2
3)'4 1(& 565
!"!#!"! $"!#!"!
&
(
'!
) &
(
'$
Figure 9-5: Branch metric for soft decision decoding.
Figure 9-6: How the pieces we have learned so far fit together. The figure shows the sender and receiver pipelines:

Sender: Packetize (split, add sequence # and checksum) → Block ECC (build ECC blocks = data + check bits) → Interleave (order as 1st bits of all ECC blocks, 2nd bits, ...) → 8b10b (ensure sufficient transitions, packet syncs) → Conv. Code (continuous coding) → Bits to Volts (choose samples/bit to avoid ISI issues).

Receiver: Volts to Bits (clock recovery, digitize middle sample) → Viterbi (find most likely transmission) → 10b8b (recover packets) → Deinterleave (spread burst errors into 1-bit errors) → Block ECC⁻¹ (error correction using check bits) → Validate (verify packet checksum).