Convolutional Codes With Sequential Decoding (Soft Decisions)
• BUT!!!
- this type of decoding can be ambiguous
- as the number of input bits increases, the number of
possible code sequences increases = longer
calculations!!
Sequential Decoding
Introduction
• One of the first methods proposed by
Wozencraft to decode a
convolutionally coded bit stream
• The purpose is to search all the
nodes of the diagram in an efficient
way in an attempt to find the
maximum likelihood path
• Maximum likelihood path depends on
the measure of “closeness” of a path
to the received sequence
Properties
• Allows both forward and backward
movement through the trellis
• The decoder keeps track of its
decisions; each time it makes an
ambiguous decision, it tallies it. If the
tally increases faster than some
threshold value, the decoder gives up
that path and retraces the path back
to the last fork where the tally was
below the threshold.
Steps to decoding:
• Choose the all-zero state to be the
initial node
• Compare the outputs of the two paths
leaving the initial node with the
received sequence
– If the output of one path matches the
received sequence, choose that path.
– If the output of neither path matches
the received sequence, tally
the error.
Steps to decoding (continued):
• Continue the process of comparing
and choosing paths until the end of
the received sequence
• NOTE: The error tally must not exceed the
threshold value!!!
- If a path’s tally of errors exceeds the
threshold value, GO BACK to where the
tally was less than the threshold value and
find another path.
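The steps above can be sketched as a depth-first trellis search that tallies errors and retraces whenever the tally passes the threshold. This is a hedged illustration, not the exact algorithm from the report: the encoder taps are inferred from the (2,1,4) state table shown later, the threshold rule is simplified to an absolute tally limit, and the threshold value of 2 is an assumption chosen for the example.

```python
# Sketch of hard-decision sequential decoding for the (2,1,4) code.
# Assumptions: taps inferred from the state table; threshold = 2.

def encode_branch(bit, state):
    """One trellis branch: output pair and next state for input `bit`."""
    s1, s2, s3 = state
    o1 = bit ^ s1 ^ s2 ^ s3          # first output bit
    o2 = bit ^ s1 ^ s3               # second output bit
    return (o1, o2), (bit, s1, s2)   # register shifts right

def sequential_decode(received, n_info, threshold=2):
    """Depth-first search; back up whenever the error tally would exceed threshold."""
    def search(state, depth, tally, bits):
        if depth == len(received):
            return bits[:n_info]                      # drop the flush bits
        # information bits may be 0 or 1; flush bits are forced to 0
        choices = (0, 1) if depth < n_info else (0,)
        scored = []
        for b in choices:
            out, nxt = encode_branch(b, state)
            errs = sum(o != r for o, r in zip(out, received[depth]))
            scored.append((errs, b, nxt))
        # try the branch that agrees best with the received pair first
        for errs, b, nxt in sorted(scored):
            if tally + errs <= threshold:             # else: give up, retrace
                found = search(nxt, depth + 1, tally + errs, bits + [b])
                if found is not None:
                    return found
        return None                                   # forces backtracking
    return search((0, 0, 0), 0, 0, [])

# received pairs 01 11 01 11 01 01 11 (first bit in error)
rx = [(0, 1), (1, 1), (0, 1), (1, 1), (0, 1), (0, 1), (1, 1)]
print(sequential_decode(rx, n_info=4))                # -> [1, 0, 1, 1]
```

With one bit error the search still recovers the message 1011, because every other complete path through the trellis disagrees with the received sequence in more places than the threshold allows.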
Example Problem
• The code is (2,1,4), for which the
encoder diagram was presented in the
previous report. Assume that the bit
sequence 1011 000 was sent.
(Remember the last 3 bits are the result
of the flush bits. Their output is called
tail bits.)
• If no errors occurred, we would
receive: 11 11 01 11 01 01 11
• But let’s say we received instead: 01
11 01 11 01 01 11. One error has
occurred: the first bit has been flipped.
Input   Input State   Output   Output State
 Bit     S1 S2 S3     O1 O2     S1 S2 S3
  0       0  0  0      0  0      0  0  0
  1       0  0  0      1  1      1  0  0
  0       0  0  1      1  1      0  0  0
  1       0  0  1      0  0      1  0  0
  0       0  1  0      1  0      0  0  1
  1       0  1  0      0  1      1  0  1
  0       0  1  1      0  1      0  0  1
  1       0  1  1      1  0      1  0  1
  0       1  0  0      1  1      0  1  0
  1       1  0  0      0  0      1  1  0
  0       1  0  1      0  0      0  1  0
  1       1  0  1      1  1      1  1  0
  0       1  1  0      0  1      0  1  1
  1       1  1  0      1  0      1  1  1
  0       1  1  1      1  0      0  1  1
  1       1  1  1      0  1      1  1  1
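The table above can be reproduced with a few lines of code. This is a sketch, not the report's own implementation: the tap positions (all four stages for O1; the input and the first and third stages for O2) are inferred from the table rows.

```python
# Minimal (2,1,4) encoder consistent with the state table above.
# Assumption: tap sets inferred from the table, not taken from the report.

def conv_encode(bits):
    s1 = s2 = s3 = 0                       # shift register, all-zero start
    out = []
    for b in bits:
        o1 = b ^ s1 ^ s2 ^ s3              # output bit O1
        o2 = b ^ s1 ^ s3                   # output bit O2
        out.append(f"{o1}{o2}")
        s1, s2, s3 = b, s1, s2             # shift the register
    return " ".join(out)

# message 1011 followed by the three flush (tail) bits
print(conv_encode([1, 0, 1, 1, 0, 0, 0]))  # -> 11 11 01 11 01 01 11
```

The output matches the error-free received sequence given in the example.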
[Trellis diagram: the branch outputs along the error-free path read 11 11 01 11 01 01 11.]
Comparing with the encoded trellis…
Soft Decision Decoding
Channel Model
y(t) = x(t) + n(t)
where:
y(t) = received sequence
x(t) = encoded sequence
n(t) = noise
[Block diagram: x(t) → channel → y(t)]
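The channel model y(t) = x(t) + n(t) can be simulated with additive Gaussian noise. A minimal sketch, assuming a noise standard deviation (sigma) chosen purely for illustration:

```python
import random

# Toy AWGN channel for y(t) = x(t) + n(t).
# Assumption: sigma = 0.5 is illustrative, not from the report.

def channel(x, sigma=0.5, seed=1):
    rng = random.Random(seed)               # fixed seed for repeatability
    return [xi + rng.gauss(0.0, sigma) for xi in x]

x = [1, -1, 1]          # polar-format codeword, e.g. 101
y = channel(x)
print(y)                # soft values scattered around +1 and -1
```

The soft decoder works directly on these real-valued samples instead of slicing them to bits first.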
Hard Decision vs. Soft Decision

Hard Decision:
• find a codeword that minimizes the Hamming distance between the symbol sequence and the codeword
• Hamming distance – the number of differing elements between two words

Soft Decision:
• find a codeword that minimizes the Euclidean distance between the received signal and the modulated signal corresponding to the codeword
• given 2 points P = (p1, p2, … pn) and Q = (q1, q2, … qn), the Euclidean distance is
  d(P, Q) = sqrt[(p1 - q1)² + (p2 - q2)² + … + (pn - qn)²]
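The two metrics can be compared on the example vectors used below. A short sketch, assuming the polar mapping 0 → -1, 1 → +1 and slicing at zero for the hard decisions:

```python
import math

# Hamming distance (hard decisions) vs. Euclidean distance (soft values).

def hamming(a, b):
    """Number of differing elements between two words."""
    return sum(x != y for x, y in zip(a, b))

def euclidean(p, q):
    """sqrt of the summed squared coordinate differences."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(p, q)))

codeword  = [1, 0, 1]
modulated = [1, -1, 1]                        # polar format of 101
received  = [-0.1072, -1.0504, 1.6875]        # noisy channel output

hard = [1 if r > 0 else 0 for r in received]  # slice each sample at 0
print(hamming(codeword, hard))                # -> 1 (first bit sliced wrong)
print(round(euclidean(modulated, received), 4))  # -> 1.3043
```

The hard decision already contains an error before decoding starts, while the soft values still carry the information that the first sample was only barely negative.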
Example of Soft Decision decoding
A transmitted codevector = (101) = (1 -1 1) in polar format
The received vector is r = (-0.1072, -1.0504, 1.6875)

For the branch from t2 to t3 between states Sa and Sa (branch output -1 -1):
d1² = d²[(-1.25, 1.4), (-1, -1)]
    = (-1.25 + 1)² + (1.4 + 1)²
    = 5.8225
d²(U=2) = d0² + d1² = 6.245 + 5.8225 = 12.0675
2. If 2 paths reach a certain node, the
path with the higher dU² is discarded
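The branch-metric arithmetic from the worked example can be checked in a few lines; the received pairs are the ones listed with the trellis, and labelling the first term the all-zero-path branch (-1, -1) is an assumption consistent with the stated sum 6.245 + 5.8225 = 12.0675:

```python
# Recompute the squared-Euclidean branch metrics from the example.

def branch_metric(rx_pair, branch_pair):
    """Squared Euclidean distance between a received pair and a branch output."""
    return sum((r - b) ** 2 for r, b in zip(rx_pair, branch_pair))

d0 = branch_metric((+1.35, -0.15), (-1, -1))  # first branch (assumed output -1 -1)
d1 = branch_metric((-1.25, +1.40), (-1, -1))  # t2 -> t3 branch, output -1 -1
print(round(d0, 4))       # -> 6.245
print(round(d1, 4))       # -> 5.8225
print(round(d0 + d1, 4))  # -> 12.0675
```

Each node in the trellis accumulates these branch metrics; when two paths merge, rule 2 above discards the one with the larger total.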
Received signal:
t1: (+1.35, -0.15)   t2: (-1.25, +1.40)   t3: (-0.85, -0.10)   t4: (-0.95, -1.75)   t5: (+0.50, +1.30)
[Trellis diagram: each node is labeled with its cumulative squared Euclidean distance dU² (values such as 6.245, 12.0675, 12.900, 14.404, …); at each merge, the path with the higher dU² is discarded.]