Convolutional Codes (n, k, m)

Convolutional codes encode input blocks into output blocks that depend on the current and previous input blocks. They can be represented using shift registers, generator matrices, trellises, and other methods. An example (3,2,1) convolutional code is described that encodes two input bits into three output bits based on the current and previous input bits. Convolutional codes can be systematic, meaning the input bits are directly part of the output, or non-systematic. Parity check matrices can be derived from the generator matrices.

Uploaded by

Kishore Krishnan

Convolutional codes

a) An (n,k,m) convolutional encoder encodes a k-bit input block into an n-bit output block, which depends on the current input block and the m preceding input blocks
b) History:
Elias (1955): Introduced the codes
Wozencraft (1961): Sequential decoding
Massey (1963): Majority logic decoding
Viterbi (1967): ML decoding
Berrou et al. (1993): Turbo codes

Descriptions and representations


a) Shift register representation
b) Scalar encoder matrix representation
c) Polynomial matrix representation
d) State diagram representation
e) Trellis representation
f) Tree representation
g) Parity check matrix / syndrome former representation

Example

a) Controller canonical form:
   k input sequences enter from the left;
   n output sequences are produced by external mod-2 adders
b) Input sequence u = (u0, u1, u2, ...)
c) Output sequence v = (v0(0), v0(1), v1(0), v1(1), v2(0), v2(1), ...)
d) Impulse response sequences: u = (1, 0, 0, 0, ...) produces outputs
   g(0) = (1, 0, 1, 1)
   g(1) = (1, 1, 1, 1)

Example (cont.)

Input sequence u = (u0, u1, u2, ...)

Output sequence v = (v0(0), v0(1), v1(0), v1(1), v2(0), v2(1), ...)

Impulse response sequences: u = (1, 0, 0, 0, ...) produces outputs
g(0) = (1, 0, 1, 1)
g(1) = (1, 1, 1, 1)

Encoding equations:
vl(0) = ul + ul-2 + ul-3
vl(1) = ul + ul-1 + ul-2 + ul-3

Or, simplified:
v(0) = u * g(0), v(1) = u * g(1)
where * is a discrete (mod-2) convolution

In general: vl(j) = (u * g(j))l = sum over i = 0..m of ul-i gi(j), where ul-i = 0 for l < i
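The convolution above can be checked directly in code; a minimal sketch for the (2,1,3) example (function and variable names are illustrative, not from the slides):

```python
# Mod-2 convolution encoder: v(j) = u * g(j), outputs interleaved as
# (v0(0), v0(1), v1(0), v1(1), ...). Input u and generators are bit lists.

def conv_encode(u, generators):
    m = max(len(g) for g in generators) - 1   # encoder memory order
    n_out = len(u) + m                        # terminated length of each stream
    streams = []
    for g in generators:
        v = []
        for l in range(n_out):
            # v_l = sum_i u_{l-i} g_i (mod 2), with u_{l-i} = 0 for l < i
            bit = 0
            for i, gi in enumerate(g):
                if 0 <= l - i < len(u):
                    bit ^= u[l - i] & gi
            v.append(bit)
        streams.append(v)
    # interleave the n output streams into one codeword
    return [streams[j][l] for l in range(n_out) for j in range(len(generators))]

g0 = [1, 0, 1, 1]    # g(0) from the slides
g1 = [1, 1, 1, 1]    # g(1) from the slides
v = conv_encode([1, 0, 1, 1, 1], [g0, g1])
# v == [1,1,0,1,0,0,0,1,0,1,0,1,0,0,1,1], matching v = uG on the later slide
```

An impulse input recovers the generator sequences: `conv_encode([1], [g0, g1])` interleaves g(0) and g(1).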

Scalar encoder matrix method

G =
  [ g0(0) g0(1)  g1(0) g1(1)  g2(0) g2(1)  ...  gm(0) gm(1)
                 g0(0) g0(1)  g1(0) g1(1)  g2(0) g2(1)  ...  gm(0) gm(1)
                              g0(0) g0(1)  g1(0) g1(1)  g2(0) g2(1)  ...  gm(0) gm(1)
                                                                          ...        ]
(blank entries are zero; each row is the previous row shifted right by n = 2 positions)

a) Encoding of u: v = uG
b) Example (continued): For u = (1 0 1 1 1),

v = uG = (1 0 1 1 1) .
  [ 1 1 0 1 1 1 1 1
        1 1 0 1 1 1 1 1
            1 1 0 1 1 1 1 1
                1 1 0 1 1 1 1 1
                    1 1 0 1 1 1 1 1 ]
= (1 1 0 1 0 0 0 1 0 1 0 1 0 0 1 1)
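The scalar matrix encoding can be reproduced programmatically; a sketch that builds the truncated generator matrix for the (2,1,3) example and computes v = uG mod 2 (helper names are illustrative):

```python
# Build the truncated scalar generator matrix: row l carries the blocks
# (g_i(0) ... g_i(n-1)) starting at column n*l, i = 0..m.

def scalar_G(generators, L):
    n = len(generators)                      # output bits per input bit (k = 1 here)
    m = max(len(g) for g in generators) - 1  # memory order
    cols = n * (L + m)
    rows = []
    for l in range(L):
        row = [0] * cols
        for i in range(m + 1):
            for j, g in enumerate(generators):
                if i < len(g):
                    row[n * (l + i) + j] = g[i]
        rows.append(row)
    return rows

def encode(u, G):
    """v = uG over GF(2)."""
    return [sum(u[r] & G[r][c] for r in range(len(u))) % 2 for c in range(len(G[0]))]

G = scalar_G([[1, 0, 1, 1], [1, 1, 1, 1]], L=5)
v = encode([1, 0, 1, 1, 1], G)
# v == [1,1,0,1,0,0,0,1,0,1,0,1,0,0,1,1], as on the slide
```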

Example: (3,2,1) convolutional code

a) Impulse response sequences: gi(j) corresponds to input i and output j

g1(0) = (1, 1), g1(1) = (0, 1), g1(2) = (1, 1)

g2(0) = (0, 1), g2(1) = (1, 0), g2(2) = (1, 0)

Example: (3,2,1) CC

g1(0) = (1, 1), g1(1) = (0, 1), g1(2) = (1, 1)

g2(0) = (0, 1), g2(1) = (1, 0), g2(2) = (1, 0)

a) Encoding equations
vl(0) = ul(1) + ul - 1(1) + ul - 1(2)

vl(1) = ul(2) + ul - 1(1)

vl(2) = ul(1) + ul(2) + ul - 1(1)

b) Or, simplified
v(0) = u(1) * g1(0) + u(2) * g2(0)

v(1) = u(1) * g1(1) + u(2) * g2(1)

v(2) = u(1) * g1(2) + u(2) * g2(2)


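The (3,2,1) encoding equations map directly to code; a sketch in which the state is the previous input block (function name is illustrative):

```python
# Direct implementation of the (3,2,1) encoding equations; m = 1, so the
# state is just the previous input block (u_{l-1}(1), u_{l-1}(2)).

def encode_321(u1, u2):
    """u1, u2: the two input bit streams; returns a list of (v(0), v(1), v(2)) blocks."""
    p1 = p2 = 0                          # previous input bits, initially zero
    out = []
    for a, b in zip(u1 + [0], u2 + [0]):  # one extra all-zero block flushes the memory
        v0 = a ^ p1 ^ p2                 # vl(0) = ul(1) + ul-1(1) + ul-1(2)
        v1 = b ^ p1                      # vl(1) = ul(2) + ul-1(1)
        v2 = a ^ b ^ p1                  # vl(2) = ul(1) + ul(2) + ul-1(1)
        out.append((v0, v1, v2))
        p1, p2 = a, b
    return out
```

Impulse inputs recover the impulse responses: u(1) = (1) alone yields the streams g1(0) = (1,1), g1(1) = (0,1), g1(2) = (1,1) across the output blocks, and similarly for u(2).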

(3,2,1) CC, generator matrix

G =
  [ 1 0 1  1 1 1
    0 1 1  1 0 0
             1 0 1  1 1 1
             0 1 1  1 0 0
                      ...   ]

In general for an (n,k,m) CC, the generator matrix is

G =
  [ G0 G1 G2 ... Gm
       G0 G1 G2 ... Gm
          G0 G1 G2 ... Gm
                        ... ]

where each block Gl is the k x n matrix

Gl =
  [ g1,l(0)  g1,l(1)  ...  g1,l(n-1)
      ...
    gk,l(0)  gk,l(1)  ...  gk,l(n-1) ]

Important definitions
a) Nominal code rate R = k/n
b) Length of the ith shift register = vi
c) Encoder memory order m = max over 1 <= i <= k of vi
d) The overall constraint length v = sum over 1 <= i <= k of vi
e) An (n,k,v) convolutional code is the set of all output sequences
produced by an (n,k,v) convolutional encoder
f) Effective code rate Reff: Input length kL, output length n(L+m),
and Reff = kL / (n(L+m)) = RL/(L+m)
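The rate loss from termination in definition f) can be checked with a one-liner (function name `r_eff` is illustrative):

```python
# Effective rate of a terminated convolutional code: kL input bits
# produce n(L+m) output bits, so Reff = kL / (n(L+m)) = RL/(L+m).

def r_eff(n, k, m, L):
    return (k * L) / (n * (L + m))

print(r_eff(2, 1, 3, 5))   # 0.3125: 5 input bits -> 16 output bits, below R = 1/2
```

As L grows, Reff approaches the nominal rate R = k/n.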

Polynomial matrix representation


a) A (2,1,m) CC can be described by a polynomial (transform-domain)
representation of the input, output, and generator sequences:
   u(D) = u0 + u1D + u2D2 + ...
   v(i)(D) = v0(i) + v1(i)D + v2(i)D2 + ...,  for i = 0,1
   g(i)(D) = g0(i) + g1(i)D + g2(i)D2 + ... + gm(i)Dm,  for i = 0,1
b) Then, the encoding can be written as
   v(i)(D) = u(D) g(i)(D),  for i = 0,1
c) Combining the output streams v = (v0(0), v0(1), v1(0), v1(1), v2(0), v2(1), ...):
   v(D) = v(0)(D2) + D v(1)(D2)
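The transform-domain view can be checked against the convolution view; a sketch with polynomials stored as GF(2) coefficient lists, index = power of D (helper names are illustrative):

```python
# Transform-domain encoding v(i)(D) = u(D) g(i)(D) over GF(2).

def gf2_mul(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj
    return out

u  = [1, 0, 1, 1, 1]   # u(D) = 1 + D2 + D3 + D4
g0 = [1, 0, 1, 1]      # g(0)(D) = 1 + D2 + D3
g1 = [1, 1, 1, 1]      # g(1)(D) = 1 + D + D2 + D3
v0 = gf2_mul(u, g0)    # v(0)(D)
v1 = gf2_mul(u, g1)    # v(1)(D)
# combining streams: v(D) = v(0)(D2) + D v(1)(D2), i.e. bit-interleaving
v = [b for pair in zip(v0, v1) for b in pair]
```

The interleaved `v` reproduces the codeword obtained earlier by direct convolution and by v = uG.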

Example: (2,1,3) code


g(0) = (1, 0, 1, 1), g(1) = (1, 1, 1, 1)
g(0)(D) = 1 + D2 + D3, g(1)(D) = 1 + D + D2 + D3
Encoding equations:
v(i)(D) = u(D) g(i)(D)
Polynomial matrix form (transform domain form):
(v(0)(D), v(1)(D)) = u(D) (g(0)(D), g(1)(D))
In general:

G(D) =
  [ g1(0)(D)  g1(1)(D)  ...  g1(n-1)(D)
       ...
    gk(0)(D)  gk(1)(D)  ...  gk(n-1)(D) ]

Transform domain:
Relation to constraint length
a) Length of the ith shift register: vi = max over 0 <= j <= n-1 of deg gi(j)(D)

Systematic encoders
a) An (n,k,m) convolutional encoder is systematic if the first k output
sequences are a copy of the k information sequences
b) All convolutional codes have systematic encoders, but there are codes
that do not have feedforward systematic encoders

G(D) =
  [ 1 0 ... 0   g1(k)(D)  ...  g1(n-1)(D)
    0 1 ... 0   g2(k)(D)  ...  g2(n-1)(D)
        ...
    0 0 ... 1   gk(k)(D)  ...  gk(n-1)(D) ]

In scalar form, with I the k x k identity and Pl the k x (n-k) parity blocks:

G =
  [ I P0  0 P1  0 P2  ...  0 Pm
          I P0  0 P1  ...       0 Pm
                I P0  ...            ... ]

Parity check matrices


a) Starting with a systematic feedforward generator matrix it is easy to
find a parity check matrix for the code:

Thus, for any codeword v,
vHT = 0

Transform domain
Starting with a systematic feedforward generator matrix it is easy to
find a parity check matrix for the code:

Thus, for any codeword V(D) = (v(0)(D), v(1)(D), ..., v(n-1)(D)),
V(D) HT(D) = 0

Parity check matrices exist for all convolutional codes (but they are less
straightforward to find)

Example: Systematic encoder

a) G(D) = [1, 1 + D + D3]

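For this systematic example, the general construction suggests the parity check matrix H(D) = [1 + D + D3, 1] (an assumption following the [PT I] pattern, not stated on the slide), so v(0)(D)(1 + D + D3) + v(1)(D) = 0 for every codeword. A quick numerical check over GF(2):

```python
# Zero-syndrome check for the systematic encoder G(D) = [1, 1 + D + D3],
# with the assumed H(D) = [1 + D + D3, 1]. Polynomials are coefficient lists.

def gf2_mul(a, b):
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj
    return out

def gf2_add(a, b):
    n = max(len(a), len(b))
    a, b = a + [0] * (n - len(a)), b + [0] * (n - len(b))
    return [x ^ y for x, y in zip(a, b)]

g = [1, 1, 0, 1]                # g(D) = 1 + D + D3
u = [1, 0, 1, 1, 0, 1]          # an arbitrary message polynomial u(D)
v0, v1 = u, gf2_mul(u, g)       # systematic encoding: v(0) = u, v(1) = u g
syndrome = gf2_add(gf2_mul(v0, g), v1)   # v(0) h0 + v(1) h1 with h0 = g, h1 = 1
print(all(s == 0 for s in syndrome))     # True
```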

Another example,
and another encoder form

G(D) =
  [ 1  0  1+D+D2
    0  1  1+D    ]

H(D) = [ D2+D+1   D+1   1 ]

a) Controller canonical form


b) Observer canonical form
One shift register per output sequence and mod-2 adders internal
to the shift registers
Highest degree term at the left
c) Note: Different memory requirements

Systematic feedback (recursive) encoders

G(D) = [1 + D + D2, 1 + D2, 1 + D]

G(D) = [D2 + D + 1, D2 + 1, D + 1]

G(D) = [1, (1 + D2) / (1 + D + D2) , (1 + D) / (1 + D + D2) ]

Infinite impulse response (not polynomial)


Easier to represent in transform domain
Generates the same set of words as G(D)
Different encoder mapping
Again easy to obtain parity check matrix

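The systematic feedback encoder above can be sketched with a single length-2 feedback shift register; the register arrangement below is an assumed standard realization, not taken from the slide:

```python
# Systematic feedback (recursive) encoder for
# G(D) = [1, (1+D2)/(1+D+D2), (1+D)/(1+D+D2)].
# The register input a(D) satisfies a(D) = u(D) / (1 + D + D2); the two
# parity outputs apply the numerators 1+D2 and 1+D to a(D).

def rsc_encode(u):
    s1 = s2 = 0                     # delay elements (a_{l-1}, a_{l-2})
    out = []
    for ub in u:
        a = ub ^ s1 ^ s2            # feedback from the denominator 1 + D + D2
        v0 = ub                     # systematic output: v(0) = u
        v1 = a ^ s2                 # numerator 1 + D2
        v2 = a ^ s1                 # numerator 1 + D
        out.append((v0, v1, v2))
        s1, s2 = a, s1
    return out
```

An impulse input produces parity streams that never die out, illustrating the infinite (non-polynomial) impulse response of the feedback encoder.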
