
Viet Nam National University Ho Chi Minh City

University of Science
Faculty of Electronics & Telecommunications

Chapter 8b: Error Correcting Codes:

Convolutional codes

Dang Le Khoa
[email protected]
Introduction

⚫ In block coding, the encoder accepts a k-bit message block and generates an n-bit codeword on a block-by-block basis
⚫ The encoder must buffer an entire message block before generating the codeword
⚫ When the message bits arrive serially rather than in large blocks, using a buffer is undesirable
⚫ In such cases, convolutional coding is preferred: the encoder operates on the incoming message sequence serially rather than block by block

Definitions

⚫ A convolutional encoder is a finite-state machine that consists of an M-stage shift register and n modulo-2 adders
⚫ An L-bit message sequence produces an output sequence of n(L + M) bits
⚫ Code rate:

  r = L / [ n(L + M) ]  (bits/symbol)

⚫ Usually L >> M, so

  r ≈ 1/n  (bits/symbol)
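⚫ For example (a numeric check, not on the original slide): for the (2,1,2) code used below, n = 2 and M = 2, so an L = 5 bit message gives r = 5 / (2·(5 + 2)) = 5/14 ≈ 0.36, which approaches 1/n = 1/2 as L grows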

Definitions

⚫ Constraint length (K): the number of shifts over which a single message bit influences the output
⚫ For an M-stage shift register, a message bit needs M + 1 shifts to enter the shift register and finally come out
⚫ Hence K = M + 1

Example

⚫ Convolutional code (2,1,2)


– n=2: 2 modulo-2 adders or 2 outputs
– k=1: 1 input
– M=2: 2 stages of shift register (K=M+1=2+1=3)

[Encoder diagram for the (2,1,2) code: the input bit enters a 2-stage shift register; two modulo-2 adders (Path 1 and Path 2) form the two output bits, which are multiplexed to the output.]
Example
⚫ Convolutional code (3,2,1)
– n=3: 3 modulo-2 adders or 3 outputs
– k=2: 2 inputs
– M=1: 1 stage in each shift register (K = M + 1 = 2 for each input)

[Encoder diagram for the (3,2,1) code: two parallel inputs, each feeding a 1-stage shift register; three modulo-2 adders form the three output bits.]
Generator polynomials

⚫ A convolutional code is a nonsystematic code
⚫ Each path connecting the output to the input can be characterized by an impulse response or, equivalently, a generator polynomial
⚫ (g_M^(i), ..., g_2^(i), g_1^(i), g_0^(i)) denotes the impulse response of the ith path
⚫ Generator polynomial of the ith path:

  g^(i)(D) = g_M^(i) D^M + ... + g_2^(i) D^2 + g_1^(i) D + g_0^(i)

⚫ D denotes the unit-delay variable (different from the X used for cyclic codes)
⚫ A complete convolutional code is described by the set of generator polynomials { g^(1)(D), g^(2)(D), ..., g^(n)(D) }

Example(1/8)

⚫ Consider the case of (2,1,2)


⚫ Impulse response of path 1 is (1,1,1)
⚫ The corresponding generator polynomial is

  g^(1)(D) = D^2 + D + 1

⚫ Impulse response of path 2 is (1,0,1)
⚫ The corresponding generator polynomial is

  g^(2)(D) = D^2 + 1

⚫ Message sequence (11001)
⚫ Polynomial representation: m(D) = D^4 + D^3 + 1

Example(2/8)
⚫ Output polynomial of path 1:

  c^(1)(D) = m(D) g^(1)(D)
           = (D^4 + D^3 + 1)(D^2 + D + 1)
           = D^6 + D^5 + D^4 + D^5 + D^4 + D^3 + D^2 + D + 1
           = D^6 + D^3 + D^2 + D + 1

⚫ Output sequence of path 1: (1001111)
⚫ Output polynomial of path 2:

  c^(2)(D) = m(D) g^(2)(D)
           = (D^4 + D^3 + 1)(D^2 + 1)
           = D^6 + D^5 + D^4 + D^3 + D^2 + 1

⚫ Output sequence of path 2: (1111101)


Example(3/8)

⚫ m= (11001)
⚫ c(1)=(1001111)
⚫ c(2)=(1111101)
⚫ Encoded sequence c=(11,01,01,11,11,10,11)
⚫ Message length L = 5 bits
⚫ Output length n(L + K − 1) = 14 bits
⚫ A terminating sequence of K-1=2 zeros is appended to
the last input bit for the shift register to be restored to
its zero initial state
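A minimal Python sketch (not part of the original slides) can verify these sequences by multiplying the message and generator polynomials over GF(2); the helper name gf2_poly_mul and the coefficient ordering (highest power of D first) are my own choices.

```python
# Sketch (not from the slides): verify c(1) and c(2) by GF(2) polynomial
# multiplication. Coefficient lists are ordered from the highest power of D
# down to D^0, so the message (11001) is m(D) = D^4 + D^3 + 1.

def gf2_poly_mul(a, b):
    """Multiply two GF(2) polynomials given as bit lists (highest power first)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj          # addition over GF(2) is XOR
    return out

m  = [1, 1, 0, 0, 1]                      # m(D)  = D^4 + D^3 + 1
g1 = [1, 1, 1]                            # g1(D) = D^2 + D + 1
g2 = [1, 0, 1]                            # g2(D) = D^2 + 1

c1 = gf2_poly_mul(m, g1)                  # -> [1, 0, 0, 1, 1, 1, 1]
c2 = gf2_poly_mul(m, g2)                  # -> [1, 1, 1, 1, 1, 0, 1]
c  = [f"{a}{b}" for a, b in zip(c1, c2)]  # -> ['11', '01', '01', '11', '11', '10', '11']
print(c1, c2, c)
```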

Example(4/8)

⚫ Another way to calculate the output:


⚫ Path 1: append K − 1 = 2 zeros to the message and slide the impulse response (1,1,1) across it; at each shift the output bit is the modulo-2 sum of the bits under the window

  Window (m_j, m_{j-1}, m_{j-2})   Path-1 output
  1 0 0                            1
  1 1 0                            0
  0 1 1                            0
  0 0 1                            1
  1 0 0                            1
  0 1 0                            1
  0 0 1                            1

  c(1) = (1001111)
Example(5/8)
⚫ Path 2: the same procedure with the impulse response (1,0,1), so the output bit is m_j ⊕ m_{j-2}

  Window (m_j, m_{j-1}, m_{j-2})   Path-2 output
  1 0 0                            1
  1 1 0                            1
  0 1 1                            1
  0 0 1                            1
  1 0 0                            1
  0 1 0                            0
  0 0 1                            1

  c(2) = (1111101)
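If NumPy is available, this sliding-window computation is just a discrete convolution reduced modulo 2; the short sketch below is only an illustrative check, not the course's simulation code.

```python
# Sketch (assumes NumPy is available): the same outputs via discrete
# convolution of the message with each impulse response, reduced modulo 2.
import numpy as np

m  = np.array([1, 1, 0, 0, 1])
g1 = np.array([1, 1, 1])          # impulse response of path 1
g2 = np.array([1, 0, 1])          # impulse response of path 2

print(np.convolve(m, g1) % 2)     # [1 0 0 1 1 1 1]  -> c(1)
print(np.convolve(m, g2) % 2)     # [1 1 1 1 1 0 1]  -> c(2)
```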
Example(6/8)

⚫ Consider the case of (3,2,1)

[Encoder diagram for the (3,2,1) code, as shown earlier: two inputs, three outputs]

⚫ g_i^(j) = (g_{i,M}^(j), g_{i,M-1}^(j), ..., g_{i,1}^(j), g_{i,0}^(j)) denotes the impulse response of the jth path corresponding to the ith input
Example(7/8)

[Encoder diagram for the (3,2,1) code]

  g_1^(1) = (11)  →  g_1^(1)(D) = D + 1
  g_2^(1) = (01)  →  g_2^(1)(D) = 1
  g_1^(2) = (01)  →  g_1^(2)(D) = 1
  g_2^(2) = (10)  →  g_2^(2)(D) = D
  g_1^(3) = (11)  →  g_1^(3)(D) = D + 1
  g_2^(3) = (10)  →  g_2^(3)(D) = D
Example(8/8)
⚫ Assume that:
  – m^(1) = (101)  →  m^(1)(D) = D^2 + 1
  – m^(2) = (011)  →  m^(2)(D) = D + 1
⚫ Outputs are:

  c^(1) = m^(1)(D) g_1^(1)(D) + m^(2)(D) g_2^(1)(D)
        = (D^2 + 1)(D + 1) + (D + 1)(1)
        = D^3 + D^2 + D + 1 + D + 1 = D^3 + D^2        →  c^(1) = (1100)

  c^(2) = m^(1)(D) g_1^(2)(D) + m^(2)(D) g_2^(2)(D)
        = (D^2 + 1)(1) + (D + 1)(D)
        = D^2 + 1 + D^2 + D = D + 1                    →  c^(2) = (0011)

  c^(3) = m^(1)(D) g_1^(3)(D) + m^(2)(D) g_2^(3)(D)
        = (D^2 + 1)(D + 1) + (D + 1)(D)
        = D^3 + D^2 + D + 1 + D^2 + D = D^3 + 1        →  c^(3) = (1001)

⚫ Multiplexed output c = (101, 100, 010, 011)
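For the two-input (3,2,1) code, each output stream is the GF(2) sum of the two filtered input sequences. The sketch below (helper names are my own, not from the slides) reproduces c(1), c(2) and c(3).

```python
# Sketch (helper names are my own): outputs of the (3,2,1) code as GF(2) sums
# of the two filtered input sequences. Bit lists are highest power of D first.

def gf2_poly_mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj
    return out

def gf2_poly_add(a, b):
    return [x ^ y for x, y in zip(a, b)]   # equal-length lists assumed

m1, m2 = [1, 0, 1], [0, 1, 1]              # m1(D) = D^2 + 1, m2(D) = D + 1
g = {                                      # generators as 2-bit impulse responses
    (1, 1): [1, 1], (2, 1): [0, 1],        # g1^(1) = D + 1, g2^(1) = 1
    (1, 2): [0, 1], (2, 2): [1, 0],        # g1^(2) = 1,     g2^(2) = D
    (1, 3): [1, 1], (2, 3): [1, 0],        # g1^(3) = D + 1, g2^(3) = D
}
for j in (1, 2, 3):
    cj = gf2_poly_add(gf2_poly_mul(m1, g[(1, j)]),
                      gf2_poly_mul(m2, g[(2, j)]))
    print(f"c({j}) =", cj)                 # (1100), (0011), (1001)
```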
State diagram

Consider the convolutional code (2,1,2)

State / binary description: a = 00, b = 10, c = 01, d = 11

⚫ 4 possible states
⚫ Each node has 2 incoming branches and 2 outgoing branches
⚫ A transition from one state to another on input 0 is represented by a solid line, and on input 1 by a dashed line
⚫ The output bits are labeled on the transition line (input/output)

[State diagram: the transitions and their input/output labels are
 a --0/00--> a,  a --1/11--> b,  b --0/10--> c,  b --1/01--> d,
 c --0/11--> a,  c --1/00--> b,  d --0/01--> c,  d --1/10--> d]
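One convenient way to work with the state diagram in software is a lookup table keyed by (current state, input bit); the sketch below is an assumed representation, with the transition labels taken from the diagram above.

```python
# Sketch (an assumed representation, not from the slides): the (2,1,2) state
# diagram as a lookup table keyed by (current state, input bit); each value is
# (next state, output bits), with the labels read off the diagram above.
TRANSITIONS = {
    ('a', 0): ('a', '00'), ('a', 1): ('b', '11'),
    ('b', 0): ('c', '10'), ('b', 1): ('d', '01'),
    ('c', 0): ('a', '11'), ('c', 1): ('b', '00'),
    ('d', 0): ('c', '01'), ('d', 1): ('d', '10'),
}

def walk_state_diagram(bits, state='a'):
    """Follow the state diagram for an input bit sequence, collecting outputs."""
    out = []
    for b in bits:
        state, symbol = TRANSITIONS[(state, b)]
        out.append(symbol)
    return out

# Message 11001 followed by two flush zeros, as in the example that follows:
print(walk_state_diagram([1, 1, 0, 0, 1, 0, 0]))
# -> ['11', '01', '01', '11', '11', '10', '11']
```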
Example

⚫ Message 11001
⚫ Start at state a
⚫ Walk through the state diagram in accordance with the message sequence (the two flush zeros are appended at the end)

Input   1   1   0   0   1   0   0
State  00  10  11  01  00  10  01  00
        a   b   d   c   a   b   c   a
Output 11  01  01  11  11  10  11
Trellis(1/2)

[Trellis diagram for the (2,1,2) code: the four states a = 00, b = 10, c = 01, d = 11 are drawn at every level j = 0, 1, 2, ..., L + 1, L + 2; branches are labeled with input/output, e.g. 0/00 for the a → a transition and 1/10 for the d → d transition.]
Trellis(2/2)
⚫ The trellis contains (L+K) levels
⚫ Labeled as j=0,1,…,L,…,L+K-1
⚫ The first (K-1) levels correspond to the encoder’s
departure from the initial state a
⚫ The last (K-1) levels correspond to the encoder’s
return to state a
⚫ For levels j in the range K-1 ≤ j ≤ L, all the
states are reachable

Example

⚫ Message 11001

Input   1   1   0   0   1   0   0
Output 11  01  01  11  11  10  11

[Trellis diagram for levels j = 0, 1, ..., 7, highlighting the path a → b → d → c → a → b → c → a taken by this message, with the output pairs above labeled along its branches.]
Simulation of Convolution Encoding

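The original slide only shows a simulation screenshot; the following Python sketch is one possible rate-1/n shift-register encoder, with the function name conv_encode and the tap ordering (current input first) being my own assumptions.

```python
# Sketch of a rate-1/n shift-register encoder (function name and tap ordering
# are my own choices; the slide itself only shows a simulation screenshot).
# Each generator is a tap tuple (g0, g1, ..., gM): g0 multiplies the current
# input bit, gM multiplies the oldest bit in the register.

def conv_encode(message, generators, flush=True):
    M = len(generators[0]) - 1              # number of shift-register stages
    bits = list(message) + [0] * (M if flush else 0)
    register = [0] * M                      # zero initial state
    encoded = []
    for b in bits:
        window = [b] + register             # (m_t, m_{t-1}, ..., m_{t-M})
        for g in generators:
            encoded.append(sum(x & tap for x, tap in zip(window, g)) % 2)
        register = [b] + register[:-1]      # shift the new bit in
    return encoded

# The (2,1,2) code of this lecture: impulse responses (1,1,1) and (1,0,1)
print(conv_encode([1, 1, 0, 0, 1], [(1, 1, 1), (1, 0, 1)]))
# -> [1,1, 0,1, 0,1, 1,1, 1,1, 1,0, 1,1]
```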
Maximum Likelihood Decoding of
Convolutional codes
⚫ m denotes a message vector
⚫ c denotes the corresponding code vector
⚫ r denotes the received vector
⚫ Given r, the decoder is required to make an estimate m̂ of the message vector, or equivalently to produce an estimate ĉ of the code vector
⚫ m̂ = m if and only if ĉ = c; otherwise, a decoding error occurs
⚫ The decoding rule is said to be optimum when the probability of decoding error is minimized
⚫ The maximum likelihood decoder or decision rule is described as follows:
  – Choose the estimate ĉ for which the log-likelihood function log p(r | c) is maximum
Maximum Likelihood Decoding of
Convolutional codes

⚫ Binary symmetric channel: both c and r are binary sequences of length N

  p(r | c) = Π_{i=1..N} p(r_i | c_i)

  log p(r | c) = Σ_{i=1..N} log p(r_i | c_i)

  with p(r_i | c_i) = p if r_i ≠ c_i, and 1 − p if r_i = c_i

⚫ Suppose r differs from c in d positions, i.e. d is the Hamming distance between r and c; then

  log p(r | c) = d log p + (N − d) log(1 − p)
               = d log[ p / (1 − p) ] + N log(1 − p)

⚫ Since log[ p / (1 − p) ] < 0 for p < 1/2 and N log(1 − p) does not depend on c, maximizing log p(r | c) amounts to minimizing d
Maximum Likelihood Decoding of
Convolutional codes

⚫ The decoding rule can therefore be restated as follows:
  – Choose the estimate ĉ that minimizes the Hamming distance between the received vector r and the transmitted code vector c
⚫ The received vector r is compared with each possible code vector c, and the one closest to r is chosen as the transmitted code vector
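As an illustration of this rule (and only practical for very short messages), a brute-force decoder can simply encode every possible message and keep the one whose codeword is closest to r; the sketch below is for the (2,1,2) code, with hypothetical helper names.

```python
# Sketch (illustration only, exponential in the message length): brute-force
# maximum-likelihood decoding of the (2,1,2) code by minimum Hamming distance.
# Helper names (encode, hamming, brute_force_ml) are hypothetical.
from itertools import product

G = [(1, 1, 1), (1, 0, 1)]                 # taps, current input bit first

def encode(m):
    bits = list(m) + [0, 0]                # append K - 1 = 2 flush zeros
    return [sum(bits[t - j] & g[j] for j in range(3) if t - j >= 0) % 2
            for t in range(len(bits)) for g in G]

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def brute_force_ml(received, L=5):
    return min(product([0, 1], repeat=L),
               key=lambda m: hamming(encode(m), received))

r = [1,1, 0,0, 0,1, 1,1, 1,0, 1,0, 1,1]    # received vector of the next slides
print(brute_force_ml(r))                   # -> (1, 1, 0, 0, 1)
```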
The Viterbi algorithm
⚫ Choose the path in the trellis whose coded sequence differs from the received sequence in the fewest positions
⚫ The algorithm operates by computing a metric for every possible path in the trellis
⚫ The metric is the Hamming distance between the coded sequence represented by that path and the received sequence
⚫ For each node, two paths enter the node; the path with the lower metric survives and the other is discarded
⚫ The computation is repeated at every level j in the range K − 1 ≤ j ≤ L
⚫ The number of survivors at each level is 2^(K−1) = 4
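A compact hard-decision Viterbi decoder for the (2,1,2) code might look like the sketch below; it is an illustrative implementation (assumed function name viterbi_decode, Hamming branch metric), not the decoder used in the course simulation.

```python
# Sketch of a hard-decision Viterbi decoder for the (2,1,2) code (assumed
# function name viterbi_decode; Hamming branch metric). State = (previous
# input bit, the bit before that); the all-zero state is the start state.

def viterbi_decode(received_pairs, n_message_bits):
    G = [(1, 1, 1), (1, 0, 1)]                         # taps, current bit first
    survivors = {(0, 0): (0, [])}                      # state -> (metric, decoded bits)
    for t, r in enumerate(received_pairs):
        new = {}
        inputs = [0, 1] if t < n_message_bits else [0] # tail bits are forced to 0
        for state, (metric, path) in survivors.items():
            for b in inputs:
                window = (b,) + state
                out = [sum(x & tap for x, tap in zip(window, g)) % 2 for g in G]
                branch = sum(o != ri for o, ri in zip(out, r))
                nxt = (b, state[0])
                cand = (metric + branch, path + [b])
                if nxt not in new or cand[0] < new[nxt][0]:
                    new[nxt] = cand                    # keep the surviving path
        survivors = new
    metric, path = min(survivors.values())
    return path[:n_message_bits]

r = [(1, 1), (0, 0), (0, 1), (1, 1), (1, 0), (1, 0), (1, 1)]
print(viterbi_decode(r, 5))   # -> [1, 1, 0, 0, 1]
```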
The Viterbi algorithm
⚫ Transmitted code c = (11,01,01,11,11,10,11); received r = (11,00,01,11,10,10,11)
  (r differs from c in two positions)

[Trellis diagram: the received pairs are compared level by level; the accumulated Hamming metric is written at each node, and at every node the lower-metric entering path survives while the other is discarded. The surviving minimum-metric path gives:]

Decoded code sequence  11  01  01  11  11  10  11
Decoded message         1   1   0   0   1   0   0


Simulation of Convolution Decoding

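This slide, too, only contains a simulation screenshot; as a usage example, the snippet below (reusing the conv_encode and viterbi_decode sketches from the notes above, and assuming a binary symmetric channel with crossover probability p) ties encoding, channel errors and Viterbi decoding together.

```python
# Usage sketch reusing the conv_encode and viterbi_decode functions defined in
# the earlier snippets (assumption), over a binary symmetric channel with an
# assumed crossover probability p.
import random

random.seed(1)
L, p = 1000, 0.02
msg = [random.randint(0, 1) for _ in range(L)]
code = conv_encode(msg, [(1, 1, 1), (1, 0, 1)])        # rate ~1/2 encoding
rx = [bit ^ (random.random() < p) for bit in code]     # BSC: flip with prob. p
pairs = list(zip(rx[0::2], rx[1::2]))                  # group into 2-bit symbols
decoded = viterbi_decode(pairs, L)
errors = sum(a != b for a, b in zip(msg, decoded))
print("message bit errors after decoding:", errors)
```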
