
Convolutional Codes With Sequential Decoding (Soft Decisions)

This document discusses convolutional codes with sequential decoding using soft decisions. It begins with an introduction to sequential decoding and soft-decision decoding. Sequential decoding allows efficient searching of the trellis diagram to find the maximum likelihood path. Soft-decision decoding finds the codeword that minimizes the Euclidean distance between the received signal and modulated codeword. The document provides an example of soft-decision decoding for a convolutional code using sequential decoding to calculate cumulative Euclidean distances and select the path with the smallest distance.
Copyright
© Attribution Non-Commercial (BY-NC)

Convolutional Codes with Sequential Decoding (Soft Decisions)

Melissa Frances Balmes
Chariz Sandra Ramos
• Introduction
• Sequential Decoding
  - Introduction to Sequential Decoding
  - Decoding Process
  - Example of Sequential Decoding
• Soft-decision Decoding
  - Introduction to Soft-decision Decoding
  - Hard-decision vs. Soft-decision
  - Decoding Process
  - Example of Soft-decision Decoding for Convolutional Codes
• Example of Convolutional Codes with Sequential Decoding, Soft Decision
Introduction
• Recall…
Convolutional code parameters (n, k, m):
  n = number of output bits
  k = number of input bits
  m = number of memory registers
  rate: r = k / n

• Idea behind decoding:
A message of L input bits results in 2^L possible output sequences; the decoder's task is to determine which sequence was sent.
Introduction…
• How do we decode a received sequence such as 111100?
By bit agreement: compare the received sequence with all 8 possible code sequences and choose the one with the smallest Hamming distance (fewest bit disagreements).

• BUT!!!
- this type of decoding can be ambiguous
- as the number of input bits increases, the number of possible code sequences grows exponentially = longer calculations!!
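A minimal sketch of this brute-force comparison, using a hypothetical 4-entry codebook for illustration (the actual 8 candidate sequences depend on the encoder):

```python
def hamming_distance(a, b):
    # Number of differing bits between two equal-length bit strings
    return sum(x != y for x, y in zip(a, b))

def brute_force_decode(received, codebook):
    # Compare the received sequence against every candidate code
    # sequence and pick the one with the fewest bit disagreements.
    return min(codebook, key=lambda c: hamming_distance(received, c))

# Hypothetical codebook, for illustration only
codebook = ["000000", "111011", "101100", "010111"]
print(brute_force_decode("111100", codebook))  # 101100 (distance 1)
```

The cost is the problem the slide points at: the codebook doubles with every added input bit, so the comparison loop quickly becomes infeasible.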
Sequential Decoding
Introduction
• One of the first methods for decoding a convolutionally coded bit stream, proposed by Wozencraft
• The purpose is to search the nodes of the trellis diagram efficiently in an attempt to find the maximum likelihood path
• The maximum likelihood path depends on a measure of "closeness" of a path to the received sequence
Properties
• Allows both forward and backward movement through the trellis
• The decoder keeps track of its decisions; each time it makes an ambiguous decision, it tallies it. If the tally increases faster than some threshold value, the decoder gives up that path and retraces it back to the last fork where the tally was below the threshold.
Steps to decoding:
• Choose the all-zero state as the initial node
• Compare the outputs of the two paths from the initial node with the received sequence
  – If one path produced the same output as the received sequence, choose that path.
  – If neither path's output matches the received sequence, tally the error.
Steps to decoding:
• Continue the process of comparing and choosing paths until the end of the received sequence
• NOTE: the error tally must not exceed the threshold value!!!
  - If a path's tally of errors exceeds the threshold value, GO BACK to where the tally was less than the threshold value and find another path.
Example Problem
• The code is (2,1,4), for which the encoder diagram was presented in the previous report. Assume that the bit sequence 1011 000 was sent. (Remember the last 3 bits are the result of the flush bits; their output is called tail bits.)
• If no errors occurred, we would receive: 11 11 01 11 01 01 11
• But let's say we received instead: 01 11 01 11 01 01 11. One error has occurred: the first bit has been flipped.
Input  Input State   Output   Output State
Bit    S1 S2 S3      O1 O2    S1 S2 S3
0      0  0  0       0  0     0  0  0
1      0  0  0       1  1     1  0  0
0      0  0  1       1  1     0  0  0
1      0  0  1       0  0     1  0  0
0      0  1  0       1  0     0  0  1
1      0  1  0       0  1     1  0  1
0      0  1  1       0  1     0  0  1
1      0  1  1       1  0     1  0  1
0      1  0  0       1  1     0  1  0
1      1  0  0       0  0     1  1  0
0      1  0  1       0  0     0  1  0
1      1  0  1       1  1     1  1  0
0      1  1  0       0  1     0  1  1
1      1  1  0       1  0     1  1  1
0      1  1  1       1  0     0  1  1
1      1  1  1       0  1     1  1  1
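Reading the tap equations off the state table (O1 = u ⊕ S1 ⊕ S2 ⊕ S3 and O2 = u ⊕ S1 ⊕ S3, i.e. generators 1111 and 1101), a short sketch reproduces the error-free output for 1011 000:

```python
def convolutional_encode(bits):
    # (2,1,4) encoder matching the state table above:
    # o1 = u^s1^s2^s3, o2 = u^s1^s3 (generators 1111 and 1101).
    s1 = s2 = s3 = 0
    out = []
    for u in bits:
        out.append((u ^ s1 ^ s2 ^ s3, u ^ s1 ^ s3))
        s1, s2, s3 = u, s1, s2  # shift the new input into the register
    return out

# Message 1011 followed by the three flush bits
pairs = convolutional_encode([1, 0, 1, 1, 0, 0, 0])
print(" ".join(f"{a}{b}" for a, b in pairs))  # 11 11 01 11 01 01 11
```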
[Trellis figure: tracing the decoded path, the branch outputs are 11, 11, 01, 11, 01, 01, 11]
Comparing with the encoded trellis…
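The decoding steps above can be sketched as a depth-first trellis search with an error threshold. The encoder taps are read off the state table; the threshold value of 1 is an assumption chosen for this single-error example:

```python
def encode_step(u, state):
    # One transition of the (2,1,4) encoder from the state table:
    # outputs (u^s1^s2^s3, u^s1^s3), new state shifts the input in.
    s1, s2, s3 = state
    return (u ^ s1 ^ s2 ^ s3, u ^ s1 ^ s3), (u, s1, s2)

def sequential_decode(received, state=(0, 0, 0), errors=0, threshold=1):
    # Follow a branch while the running error tally stays within the
    # threshold; a None return makes the caller back up and try the
    # other branch at the last fork, as in the slides.
    if not received:
        return []
    for u in (0, 1):
        out, nxt = encode_step(u, state)
        e = (out[0] ^ received[0][0]) + (out[1] ^ received[0][1])
        if errors + e <= threshold:
            rest = sequential_decode(received[1:], nxt, errors + e, threshold)
            if rest is not None:
                return [u] + rest
    return None  # every branch exceeded the threshold

# Received sequence 01 11 01 11 01 01 11 (first bit in error)
rx = [(0, 1), (1, 1), (0, 1), (1, 1), (0, 1), (0, 1), (1, 1)]
print(sequential_decode(rx))  # recovers 1011 plus the three flush bits
```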
Soft Decision Decoding
Channel Model
y(t) = x(t) + n(t)

where:
  y(t) = received sequence
  x(t) = encoded sequence
  n(t) = noise
Hard Decision vs. Soft Decision

Hard Decision:
• find a codeword that minimizes the Hamming distance between the symbol sequence and the codeword
• Hamming distance – the number of differing elements between two words

Soft Decision:
• find a codeword that minimizes the Euclidean distance between the received signal and the modulated signal corresponding to the codeword
• given 2 points P = (p1, p2, …, pn) and Q = (q1, q2, …, qn), the Euclidean distance is
  d(P, Q) = sqrt((p1 − q1)² + (p2 − q2)² + … + (pn − qn)²)
Example of Soft Decision decoding
A transmitted codevector = (101) = (1 –1 1) in polar format
The received vector is r = (–0.1072, –1.0504, 1.6875) after transmission over an Additive White Gaussian Noise channel
Example of Soft Decision decoding
Calculating the Euclidean distance with each of the possible codevectors, c0 = (–1 –1 –1), c1 = (–1 1 1), c2 = (1 –1 1) and c3 = (1 1 –1):

A soft decision decoder decides that the decoded vector is the codeword with the smallest Euclidean distance: d(r, c2) = 1.3043
Thus, the decoded word is c2 = 101!!
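The distance calculation is easy to check in a few lines, using the received vector and codevectors from the example above:

```python
import math

def euclidean(p, q):
    # Straight-line distance between two real-valued vectors
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

r = (-0.1072, -1.0504, 1.6875)
codevectors = {
    "c0": (-1, -1, -1), "c1": (-1, 1, 1),
    "c2": (1, -1, 1),   "c3": (1, 1, -1),
}
distances = {name: euclidean(r, c) for name, c in codevectors.items()}
best = min(distances, key=distances.get)
print(best, round(distances[best], 4))  # c2 1.3043
```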
Convolutional Codes with Sequential Decoding, Soft Decision
For a convolutional code with a rate of 1/n, there will be 2^n possible outputs for each transition of the trellis; each output vector per transition has n components:
  c = (c(1), c(2), …, c(n))
received sequence: sr = (sr(t1), sr(t2), …), where each received sample sr(ti) also has n components

the value for each transition ti−1 → ti is the squared Euclidean distance between the received sample at time instant ti and the possible output:
  di² = (sr(1)(ti) − c(1))² + (sr(2)(ti) − c(2))² + … + (sr(n)(ti) − c(n))²

the cumulative squared Euclidean distance for a path of U transitions:
  dU² = d0² + d1² + … (summed over the U transitions of the path)
Steps:
• Compute the cumulative squared Euclidean distance, dU², for each transition in the trellis diagram

• If 2 paths reach a certain node in the trellis, the path with the higher dU² is dropped

• At the last time instant, the correctly decoded sequence is the path that has the smallest dU²
Example
Message sequence: m = (10101)
Code sequence: c = (11, –11, 11, –1 –1, 11)
Code rate: 1/2
Received sequence:
sr = (1.35 –0.15, –1.25 1.4, –0.85 –0.1, –0.95 –1.75, 0.5 1.3)

1. Compute the cumulative squared Euclidean distance for each transition
For t1 → t2, Sa → Sa:
d0² = d²[(1.35, –0.15), (–1, –1)]
    = (1.35 + 1)² + (–0.15 + 1)²
d0² = 6.245 = d(U=1)²

For t2 → t3, Sa → Sa:
d1² = d²[(–1.25, 1.4), (–1, –1)]
    = (–1.25 + 1)² + (1.4 + 1)²
    = 5.8225
d(U=2)² = d0² + d1² = 12.0675
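These running sums are easy to verify in code; the branch output (–1, –1) for the Sa → Sa transition is taken from the example:

```python
def squared_euclidean(sample, output):
    # Squared Euclidean distance between a received pair and a branch output
    return sum((s - o) ** 2 for s, o in zip(sample, output))

d0 = squared_euclidean((1.35, -0.15), (-1, -1))
d1 = squared_euclidean((-1.25, 1.4), (-1, -1))
print(round(d0, 4), round(d1, 4), round(d0 + d1, 4))  # 6.245 5.8225 12.0675
```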
2. If 2 paths reach a certain node, the path with the higher dU² is discarded

[Trellis figure: received samples (+1.35, –0.15), (–1.25, +1.40), (–0.85, –0.10), (–0.95, –1.75), (+0.50, +1.30); the cumulative squared Euclidean distances (6.245, 12.0675, …) are written at each node, and at every merge the path with the larger metric is dropped]
Message = 10101
Encoded message = 11 01 11 00 11
References:
• D. G. Hoffman, D. A. Leonard, C. C. Lindner, K. T. Phelps, C. A. Rodger, and J. R. Wall. Coding Theory: The Essentials. Marcel Dekker, Inc., 1991.
• S. Lin and D. J. Costello, Jr. Error Control Coding: Fundamentals and Applications. New Jersey: Prentice Hall, Inc., 1983.
• J. G. Moreira and P. G. Farrell. Essentials of Error-Control Coding. England: John Wiley & Sons, Ltd, 2006.
• M. Purser. Introduction to Error-Correcting Codes. Boston: Artech House, Inc., 1995.
• Tutorial on Coding and Decoding with Convolutional Codes. http://www.complextoreal.com/chapters/convo.pdf. Date accessed: March 9, 2008.

• Soft-Decision Minimum-Distance Sequential Decoding Algorithm for Convolutional Codes (paper)