Lecture 5

This document provides an overview of channel coding techniques. It discusses block codes like the (7,4) Hamming code and introduces convolutional codes which have memory. Convolutional codes generate codewords where the output bits depend not only on the most recent input bit but also previous bits stored in a shift register. The document uses examples to demonstrate encoding, decoding, and error correction for channel coding. Its goal is to explain the basic concepts of channel coding to improve data transmission over noisy channels.


EIE579 Advanced Telecommunication Systems

Lecture 5: Channel Coding – Convolutional Code

LIU Liang

Assistant Professor
Department of Electronic and Information Engineering
The Hong Kong Polytechnic University

0
Lecture 2, Lecture 3, and Lecture 4 Review

q If the source signal is analog, then sampling and quantization (Lecture 2)


q If the source signal is digital, then source coding (Lecture 3)

1
An Example for A/D and D/A Conversion

q Sampling rate: 5 Hz
q Number of quantization bits per sample: 3
2
Sampling at Transmitter

q 5 samples per second


q This exceeds the Nyquist sampling rate (but why not 1 billion samples per second?)
3
Quantization at Transmitter

[Figure: sampled waveform with the eight 3-bit quantization levels (000 through 111) marked on the vertical axis]
q Quantization bits for 10 samples


101 100 110 000 001 101 100 110 000 001
4
De-Quantization at Receiver

[Figure: sampled waveform with the eight 3-bit quantization levels (000 through 111) marked on the vertical axis]
q Receiver recovers 10 samples based on quantization bits


101 100 110 000 001 101 100 110 000 001
5
De-Sampling at Receiver

estimated signal

q Receiver recovers signal based on 10 samples

6
Lecture 3: Source Coding
q Example:

[Figure: Huffman tree for a:0.42, b:0.2, f:0.12, e:0.11, d:0.1, c:0.05. The root splits into leaf a:0.42 and node 0.58; node 0.58 splits into node 0.23 (leaves e:0.11, f:0.12) and node 0.35 (node 0.15 with leaves c:0.05 and d:0.1, plus leaf b:0.2)]
7
Encoding
q Encoding rule: from root to each leaf; on each edge
Ø Turn left: 0
Ø Turn right: 1 Root
[Figure: the Huffman tree with each edge labeled: left edge = 0, right edge = 1]
8
Encoding
q Encoding rule: combining all bits on the edges from the root to a leaf

[Figure: Huffman tree with labeled edges (left = 0, right = 1); resulting codewords: a=0, e=100, f=101, c=1100, d=1101, b=111]
9
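Reading the codewords off the tree (left edge 0, right edge 1) gives a=0, e=100, f=101, c=1100, d=1101, b=111, so encoding is a table lookup. A minimal sketch in Python (the `codebook` dict and function name are illustrative, not from the lecture):

```python
# Codewords read off the Huffman tree (left edge = 0, right edge = 1)
codebook = {'a': '0', 'e': '100', 'f': '101',
            'c': '1100', 'd': '1101', 'b': '111'}

def huffman_encode(symbols):
    """Concatenate the codeword of each symbol."""
    return ''.join(codebook[s] for s in symbols)

print(huffman_encode('abc'))   # 0 + 111 + 1100 -> '01111100'
```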
Decoding
q Decoding rule: moving from the root until reaching a leaf

[Figure: Huffman tree with labeled edges (left = 0, right = 1); codewords: a=0, e=100, f=101, c=1100, d=1101, b=111]
10
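Because the code is prefix-free, the decoder can consume one bit per edge and emit a symbol whenever the accumulated bits form a full codeword, i.e. a leaf is reached. A sketch, assuming the same codeword table read from the tree:

```python
# Codewords read off the Huffman tree (left edge = 0, right edge = 1)
codebook = {'a': '0', 'e': '100', 'f': '101',
            'c': '1100', 'd': '1101', 'b': '111'}
leaves = {code: sym for sym, code in codebook.items()}

def huffman_decode(bits):
    """Move from the root until a leaf (full codeword) is reached, repeat."""
    out, cur = [], ''
    for b in bits:
        cur += b                  # take one edge down the tree
        if cur in leaves:         # reached a leaf: emit symbol, restart at root
            out.append(leaves[cur])
            cur = ''
    return ''.join(out)

print(huffman_decode('01111100'))   # 'abc'
```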
De-Quantization at Receiver

[Figure: sampled waveform with the eight 3-bit quantization levels (000 through 111) marked on the vertical axis]
Transmitted: 101 100 110 000 001 101 100 110 000 001
Received: 001 100 110 000 001 101 110 110 000 001
11
Decoding
q Decoding rule: moving from the root until reaching a leaf

[Figure: Huffman tree with labeled edges (left = 0, right = 1); codewords: a=0, e=100, f=101, c=1100, d=1101, b=111]
12
Lecture 4: Channel Coding

q When the transmitter sends a bit 1 (or 0), the receiver may decode it as 0 (or 1)


Ø Noise: random unwanted signals that interfere with (or corrupt) the captured / stored /
transmitted / manipulated signal

13
Example of Decoding Error

[Figure: Huffman tree with labeled edges (left = 0, right = 1); codewords: a=0, e=100, f=101, c=1100, d=1101, b=111]
q Transmitter sends 111 (symbol b), but the receiver receives 101 (symbol f)
14
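With the codewords read from the tree, the effect of the single bit error can be checked directly: 111 is symbol b, while the corrupted 101 decodes as symbol f. A quick check (codebook as read from the figure):

```python
# Codewords read off the Huffman tree (left edge = 0, right edge = 1)
codebook = {'a': '0', 'e': '100', 'f': '101',
            'c': '1100', 'd': '1101', 'b': '111'}
leaves = {code: sym for sym, code in codebook.items()}

print(leaves['111'], leaves['101'])   # b f : one flipped bit changes the symbol
```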
Channel Coding

q Channel coding: Block code and convolutional code


q Block code: Hamming code

15
(7,4) Hamming Code
q Encoding

16
(7,4) Hamming Code
q Decoding
Ø Received code: where
Ø Error syndrome: , where parity check matrix is

Ø Suppose that there is at most 1 error bit

17
Example
q Suppose that there is 1 error bit in
q Received code is
Ø Error syndrome:
Ø The second row of

Ø Codeword:
Ø Message:

18
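The encode/correct/decode cycle can be sketched in code. The lecture's exact generator and parity-check matrices are not reproduced above, so the sketch below uses one common systematic convention for the (7,4) Hamming code (codeword = [m1 m2 m3 m4 p1 p2 p3]); the lecture's matrices may differ, but the syndrome logic is the same:

```python
# One common (7,4) Hamming code convention (assumed, not the lecture's slides)
G = [[1, 0, 0, 0, 1, 1, 0],
     [0, 1, 0, 0, 1, 0, 1],
     [0, 0, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
H = [[1, 1, 0, 1, 1, 0, 0],
     [1, 0, 1, 1, 0, 1, 0],
     [0, 1, 1, 1, 0, 0, 1]]

def encode(m):
    """Codeword c = m * G (mod 2)."""
    return [sum(m[i] * G[i][j] for i in range(4)) % 2 for j in range(7)]

def decode(r):
    """Correct at most 1 error bit: the syndrome s = H * r (mod 2)
    equals the column of H at the error position."""
    r = list(r)
    s = [sum(H[i][j] * r[j] for j in range(7)) % 2 for i in range(3)]
    if any(s):
        pos = [[H[i][j] for i in range(3)] for j in range(7)].index(s)
        r[pos] ^= 1                      # flip the single error bit
    return r[:4]                         # message part of the corrected codeword

c = encode([1, 0, 1, 1])                 # -> [1, 0, 1, 1, 0, 1, 0]
c[1] ^= 1                                # introduce one bit error
print(decode(c))                         # [1, 0, 1, 1] : message recovered
```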
Lecture 5: Convolutional Code
q Coding techniques are classified as either block codes or convolutional codes, depending
on the presence or absence of memory
Ø Block code
Ø Convolutional code
q A convolutional code has memory
Ø The codeword bits depend not only on the recent bit that just entered the shift register,
but also on the previous bits

19
Convolutional Code: Applications

q Wi-Fi (802.11 standard) and cellular networks (3G, 4G, LTE standards)
q Deep space satellite communications
q Digital Video Broadcasting (Digital TV)

20
Outline

q Convolutional code: How to encode


q Convolutional code: How to use Viterbi algorithm for decoding (optional)
Ø Viterbi algorithm for convolutional code will not be in quiz
Ø But, there will be a bonus project about this algorithm (details shown next
Tuesday)

21
Encoding
q Unlike the (7,4) Hamming code: don't send the message bits, send only parity bits
q Use a sliding window to select which message bits participate in the parity calculations

Input bit

Message bits: 1 0 1 1 0 1 0 0 1 0 1

Constraint length K

22
Preliminary

q Modulo-2 addition

q Modulo-2 multiplication

q Message bits: m=[…m[-1],m[0],m[1],…]


Ø m[n]: the message bit at time slot n
q Codeword: c=[c1[0],c2[0],…cI[0],c1[1],c2[1],…,cI[1],…]
Ø Generate I codeword bits corresponding to one message bit
Ø ci[n]: the i-th bit of the codeword generated at time slot n

23
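In code, modulo-2 addition is the bitwise XOR operator and modulo-2 multiplication is bitwise AND; a quick check over all bit pairs:

```python
# Modulo-2 addition is XOR; modulo-2 multiplication is AND
for a in (0, 1):
    for b in (0, 1):
        assert (a + b) % 2 == a ^ b
        assert (a * b) % 2 == a & b
print("mod-2 add = XOR, mod-2 multiply = AND")
```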
Slicing Parity Bit Calculation: Single Parity Bit Case (I=1)

c1[n]=m[n]+m[n-1]+m[n-3]
K=4

Time:         -3 -2 -1  0  1  2  3  4  5  6  7  8  9 …
Message bits:  0  0  0  0  1  1  0  1  0  0  1  0  1 …

c1[0] = 0
• Output: 0
24
Slicing Parity Bit Calculation: Single Parity Bit Case (I=1)

c1[n]=m[n]+m[n-1]+m[n-3]
K=4

Time:         -3 -2 -1  0  1  2  3  4  5  6  7  8  9 …
Message bits:  0  0  0  0  1  1  0  1  0  0  1  0  1 …

c1[1] = 1
• Output: 01
25
Slicing Parity Bit Calculation: Single Parity Bit Case (I=1)

c1[n]=m[n]+m[n-1]+m[n-3]
K=4

Time:         -3 -2 -1  0  1  2  3  4  5  6  7  8  9 …
Message bits:  0  0  0  0  1  1  0  1  0  0  1  0  1 …

c1[2] = 0
• Output: 010
26
Slicing Parity Bit Calculation: Single Parity Bit Case (I=1)

c1[n]=m[n]+m[n-1]+m[n-3]
K=4

Time:         -3 -2 -1  0  1  2  3  4  5  6  7  8  9 …
Message bits:  0  0  0  0  1  1  0  1  0  0  1  0  1 …

c1[3] = 1
• Output: 0101
27
Slicing Parity Bit Calculation: Multiple Parity Bits Case (I=2)
c1[n]=m[n]+m[n-1]+m[n-3]
c2[n]=m[n-2]+m[n-3]

Time:         -3 -2 -1  0  1  2  3  4  5  6  7  8  9 …
Message bits:  0  0  0  0  1  1  0  1  0  0  1  0  1 …

c1[3] = 1, c2[3] = 1
• Output (c1 and c2 interleaved): ….11

28
Slicing Parity Bit Calculation: Multiple Parity Bits Case (I=2)
c1[n]=m[n]+m[n-1]+m[n-3]
c2[n]=m[n-2]+m[n-3]

Time:         -3 -2 -1  0  1  2  3  4  5  6  7  8  9 …
Message bits:  0  0  0  0  1  1  0  1  0  0  1  0  1 …

c1[4] = 0, c2[4] = 0
• Output (c1 and c2 interleaved): ….1100

29
Slicing Parity Bit Calculation: Multiple Parity Bits Case (I=2)
c1[n]=m[n]+m[n-1]+m[n-3]
c2[n]=m[n-2]+m[n-3]

Time:         -3 -2 -1  0  1  2  3  4  5  6  7  8  9 …
Message bits:  0  0  0  0  1  1  0  1  0  0  1  0  1 …

c1[5] = 0, c2[5] = 1
• Output (c1 and c2 interleaved): ….110001

30
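The sliding-window parity computation on these slides can be sketched directly; the function name is illustrative, the message bits are the slides' running example, and the three zeros before time 0 are treated as initial padding:

```python
def parity_streams(msg, taps_list):
    """Sliding-window parity bits: taps_list[i][k] selects m[n-k].

    Taps [1, 1, 0, 1] give c1[n] = m[n] + m[n-1] + m[n-3], and
    taps [0, 0, 1, 1] give c2[n] = m[n-2] + m[n-3], as on the slides (K = 4).
    """
    K = max(len(t) for t in taps_list)
    padded = [0] * (K - 1) + list(msg)             # the zeros before time 0
    out = []
    for n in range(len(msg)):
        for taps in taps_list:
            bit = 0
            for k, t in enumerate(taps):
                bit ^= t & padded[n + K - 1 - k]   # t * m[n-k], summed mod 2
            out.append(bit)
    return out

msg = [0, 1, 1, 0, 1, 0, 0, 1, 0, 1]               # m[0], m[1], ... from the slides
c1 = parity_streams(msg, [[1, 1, 0, 1]])
print(c1[:4])                                      # [0, 1, 0, 1]
both = parity_streams(msg, [[1, 1, 0, 1], [0, 0, 1, 1]])
print(both[6:12])                                  # c1[3] c2[3] c1[4] c2[4] c1[5] c2[5]
```

The interleaved output reproduces the slides' pairs 11, 00, 01 at time slots 3, 4, 5.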
Encoder State

q Constraint length K: at each time slot, input bit and K-1 bits of current state (stored in
register) determine state on next clock cycle
Ø At time slot n, the state is m[n-1], m[n-2], …, m[n-K+1]
Ø At time slot n+1, the state is m[n], m[n-1], …, m[n-K]
[Diagram: two successive sliding windows of width K over the message bits 0 0 0 0 1 1 0 1 0 0 1 0 1, each marking the K-1 state bits and the current input bit]
31
Transmitting Parity Bits

q Transmit the parity sequences, not the message itself


q At each time slot, the codeword bits are generated by one message bit and current state
consisting of K-1 bits
q If using multiple generators, interleave the bits of each generator

32
Code Rate

q Code rate is 1 / number of generators


Ø If there are 2 generators, code rate is ½
q Engineering tradeoff:
Ø More generators improve bit-error correction
Ø But decrease the code rate (the number of message bits per second that can be transmitted)

33
Shift Register View

q Let us assume that there are two generators


q At time slot n
Ø One message bit m[n] in, two parity bits out
Ø Message bits shifted right by one, the incoming bit moves into the left-most register

c1[n]

m[n] m[n-1] m[n-2]

c2[n]

34
Shift Register View

q Let us assume that there are two generators


q At time slot n+1
Ø One message bit m[n+1] in, two parity bits out
Ø Message bits shifted right by one, the incoming bit moves into the left-most register

c1[n+1]

m[n+1] m[n] m[n-1]

c2[n+1]

35
Generators

q Let us assume that there are two generators


q At time slot n g1=[1 0 1] c1[n]=1*m[n]+0*m[n-1]+1*m[n-2]
g2=[1 1 1] c2[n]=1*m[n]+1*m[n-1]+1*m[n-2]

c1[n]

m[n] m[n-1] m[n-2]

c2[n]

36
Generators

q In general, there are I generators:


g1=[g1[1],…,g1[K]]
g2=[g2[1],…,g2[K]]

gI=[gI[1],…,gI[K]]
q At time slot n, the codeword bits are based on current message bit, current state, and
generators
c1[n]=g1[1]*m[n]+g1[2]*m[n-1]+…+g1[K]*m[n-K+1]
c2[n]=g2[1]*m[n]+g2[2]*m[n-1]+…+g2[K]*m[n-K+1]

cI[n]=gI[1]*m[n]+gI[2]*m[n-1]+…+gI[K]*m[n-K+1]

37
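The general formula can be sketched as a small shift-register encoder; the generators and message below are the slides' running example, and the output matches the state-machine walk-through (11 10 00 01 01 11):

```python
def conv_encode(msg, generators):
    """ci[n] = gi[1]*m[n] + gi[2]*m[n-1] + ... + gi[K]*m[n-K+1] (mod 2)."""
    K = len(generators[0])
    state = [0] * (K - 1)                  # m[n-1], ..., m[n-K+1], initially all 0
    out = []
    for m_n in msg:
        window = [m_n] + state             # m[n], m[n-1], ..., m[n-K+1]
        for g in generators:
            out.append(sum(gk * mk for gk, mk in zip(g, window)) % 2)
        state = window[:-1]                # shift register moves right by one
    return out

g1, g2 = [1, 1, 1], [1, 0, 1]
bits = conv_encode([1, 0, 1, 1, 0, 0], [g1, g2])
print(bits)   # [1, 1, 1, 0, 0, 0, 0, 1, 0, 1, 1, 1] i.e. 11 10 00 01 01 11
```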
State Machine View: K=3 and I=2

q The constraint length is K=3


q Code rate is ½, and the generators are g1=[1 1 1], g2=[1 0 1]

c1[n]=1*m[n]+1*m[n-1]+1*m[n-2]
c2[n]=1*m[n]+0*m[n-1]+1*m[n-2]

q State machine (at time slot n); starting state is 00
Ø States labeled with (m[n-1], m[n-2])
Ø Arcs (connecting states) labeled with m[n]/c1[n]c2[n]

[State diagram transitions: from 00: 0/00 to 00, 1/11 to 10; from 01: 0/11 to 00, 1/00 to 10; from 10: 0/10 to 01, 1/01 to 11; from 11: 0/01 to 01, 1/10 to 11]
38
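The state machine can be built mechanically from the generators; a sketch for K=3 (states as (m[n-1], m[n-2]) tuples; the function name is illustrative):

```python
def build_state_machine(g1, g2):
    """Map (state, input bit) -> (output bits, next state) for K = 3."""
    table = {}
    for s1 in (0, 1):            # m[n-1]
        for s2 in (0, 1):        # m[n-2]
            for m in (0, 1):     # input bit m[n]
                window = (m, s1, s2)
                c1 = sum(g * w for g, w in zip(g1, window)) % 2
                c2 = sum(g * w for g, w in zip(g2, window)) % 2
                table[((s1, s2), m)] = ((c1, c2), (m, s1))
    return table

sm = build_state_machine([1, 1, 1], [1, 0, 1])
print(sm[((0, 0), 1)])   # ((1, 1), (1, 0)) : the arc 1/11 from state 00 to 10
```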
State Machine View

q The constraint length is K=3


q Code rate is ½, and the generators are g1=[1 1 1], g2=[1 0 1]

c1[n]=1*m[n]+1*m[n-1]+1*m[n-2]
c2[n]=1*m[n]+0*m[n-1]+1*m[n-2]

q Suppose the message bits are m=[1 0 1 1 0 0]; starting state is 00

[State diagram: states 00, 01, 10, 11 with arcs labeled m[n]/c1[n]c2[n]]
39
State Machine View

q The constraint length is K=3


q Code rate is ½, and the generators are g1=[1 1 1], g2=[1 0 1]

c1[n]=1*m[n]+1*m[n-1]+1*m[n-2]
c2[n]=1*m[n]+0*m[n-1]+1*m[n-2]
Register contents (m[n], m[n-1], m[n-2]): 1 0 0

q Suppose the message bits are m=[1 0 1 1 0 0]; starting state is 00
q Transmit: 11

[State diagram: arc taken 00 to 10 on 1/11]

40
State Machine View

q The constraint length is K=3


q Code rate is ½, and the generators are g1=[1 1 1], g2=[1 0 1]

c1[n]=1*m[n]+1*m[n-1]+1*m[n-2]
c2[n]=1*m[n]+0*m[n-1]+1*m[n-2]
Register contents (m[n], m[n-1], m[n-2]): 0 1 0

q Suppose the message bits are m=[1 0 1 1 0 0]; starting state is 00
q Transmit: 11 10

[State diagram: arc taken 10 to 01 on 0/10]

41
State Machine View

q The constraint length is K=3


q Code rate is ½, and the generators are g1=[1 1 1], g2=[1 0 1]

c1[n]=1*m[n]+1*m[n-1]+1*m[n-2]
c2[n]=1*m[n]+0*m[n-1]+1*m[n-2]
Register contents (m[n], m[n-1], m[n-2]): 1 0 1

q Suppose the message bits are m=[1 0 1 1 0 0]; starting state is 00
q Transmit: 11 10 00

[State diagram: arc taken 01 to 10 on 1/00]

42
State Machine View

q The constraint length is K=3


q Code rate is ½, and the generators are g1=[1 1 1], g2=[1 0 1]

c1[n]=1*m[n]+1*m[n-1]+1*m[n-2]
c2[n]=1*m[n]+0*m[n-1]+1*m[n-2]
Register contents (m[n], m[n-1], m[n-2]): 1 1 0

q Suppose the message bits are m=[1 0 1 1 0 0]; starting state is 00
q Transmit: 11 10 00 01

[State diagram: arc taken 10 to 11 on 1/01]

43
State Machine View

q The constraint length is K=3


q Code rate is ½, and the generators are g1=[1 1 1], g2=[1 0 1]

c1[n]=1*m[n]+1*m[n-1]+1*m[n-2]
c2[n]=1*m[n]+0*m[n-1]+1*m[n-2]
Register contents (m[n], m[n-1], m[n-2]): 0 1 1

q Suppose the message bits are m=[1 0 1 1 0 0]; starting state is 00
q Transmit: 11 10 00 01 01

[State diagram: arc taken 11 to 01 on 0/01]

44
State Machine View

q The constraint length is K=3


q Code rate is ½, and the generators are g1=[1 1 1], g2=[1 0 1]

c1[n]=1*m[n]+1*m[n-1]+1*m[n-2]
c2[n]=1*m[n]+0*m[n-1]+1*m[n-2]
Register contents (m[n], m[n-1], m[n-2]): 0 0 1

q Suppose the message bits are m=[1 0 1 1 0 0]; starting state is 00
q Transmit: 11 10 00 01 01 11

[State diagram: arc taken 01 to 00 on 0/11]

45
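The walk above can be replayed with a transition table: each step consumes one message bit and emits c1[n]c2[n]. A sketch, with the table entries read off the state diagram:

```python
# (state, input) -> (output, next state), read off the state diagram
arcs = {('00', 0): ('00', '00'), ('00', 1): ('11', '10'),
        ('01', 0): ('11', '00'), ('01', 1): ('00', '10'),
        ('10', 0): ('10', '01'), ('10', 1): ('01', '11'),
        ('11', 0): ('01', '01'), ('11', 1): ('10', '11')}

state, sent = '00', []                 # start in state 00
for m in [1, 0, 1, 1, 0, 0]:
    out, state = arcs[(state, m)]
    sent.append(out)
print(' '.join(sent))                  # 11 10 00 01 01 11
```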
Outline

q Convolutional code: How to encode


q Convolutional code: How to use Viterbi algorithm for decoding (optional)
Ø Viterbi algorithm for convolutional code will not be in quiz
Ø But, there will be a bonus project about this algorithm (details shown next
Tuesday)

46
Viterbi Decoding

q Suppose that the message bits are m=[0 0 0 0 0] (the last two bits are tail bits that return the encoder to state 00)


q The constraint length is K=3
q Code rate is ½, and the generators are g1=[1 1 1], g2=[1 0 1]

c1[n]=1*m[n]+1*m[n-1]+1*m[n-2]
c2[n]=1*m[n]+0*m[n-1]+1*m[n-2]

q Codeword is [00 00 00 00 00]
q Suppose the received bits are [01 01 00 00 00]
q Can we correctly decode the message bits even if there are 2 error bits in received bits?

47
Viterbi Decoding

If the state at time t=1 is 00, the first message bit that leads to the minimum
Hamming distance between its codeword and [01] is 0. The corresponding
Hamming distance is 1

If the state at time t=1 is 10, the first message bit that leads to the minimum
Hamming distance between its codeword and [01] is 1. The corresponding
Hamming distance is 1

48
Viterbi Decoding

If the state at time t=2 is 00, the first two message bits that lead to the minimum Hamming distance between their codeword and [01 01] are 00. The corresponding Hamming distance is 2

If the state at time t=2 is 01, the first two message bits that lead to the minimum Hamming distance between their codeword and [01 01] are 10. The corresponding Hamming distance is 3

If the state at time t=2 is 10, the first two message bits that lead to the minimum Hamming distance between their codeword and [01 01] are 01. The corresponding Hamming distance is 2

If the state at time t=2 is 11, the first two message bits that lead to the minimum Hamming distance between their codeword and [01 01] are 11. The corresponding Hamming distance is 1

49
Viterbi Decoding

If the state at time t=3 is 00, the first three message bits may be 000 or 100

The first three message bits that lead to the minimum Hamming distance between their codeword and [01 01 00] are 000. The corresponding Hamming distance is 2

50
Viterbi Decoding

If the state at time t=3 is 01, the first three message bits may be 010 or 110

The first three message bits that lead to the minimum Hamming distance between their codeword and [01 01 00] are 110. The corresponding Hamming distance is 2

51
Viterbi Decoding

If the state at time t=3 is 10, the first three message bits may be 001 or 101

The first three message bits that lead to the minimum Hamming distance between their codeword and [01 01 00] are 101. The corresponding Hamming distance is 3

52
Viterbi Decoding

If the state at time t=3 is 11, the first three message bits may be 011 or 111

The first three message bits that lead to the minimum Hamming distance between their codeword and [01 01 00] are 111. The corresponding Hamming distance is 2

53
Viterbi Decoding
The last two message bits (tail bits) are always 0

If the state at time t=4 is 00, the first four message bits that lead to the minimum Hamming distance between their codeword and [01 01 00 00] are 0000. The corresponding Hamming distance is 2

54
Viterbi Decoding
The last two message bits (tail bits) are always 0

There are only two possible states at time t=4 (00 and 01) because the fourth message bit is a tail bit equal to 0

If the state at time t=4 is 01, the first four message bits that lead to the minimum Hamming distance between their codeword and [01 01 00 00] are 1110. The corresponding Hamming distance is 3

55
Viterbi Decoding
The last two message bits (tail bits) are always 0

If the state at time t=5 is 00, the message bits that lead to the minimum Hamming distance between their codeword and [01 01 00 00 00] are 00000. The corresponding Hamming distance is 2

The last state must be 00 because the last two message bits are 00

56
Viterbi Decoding

57
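The trellis search on the preceding slides can be sketched as a hard-decision Viterbi decoder: keep one survivor path per state, extend each survivor by both possible input bits, and retain the lower Hamming-distance path into each next state. This is a sketch under the slides' parameters (K=3, g1=[1 1 1], g2=[1 0 1]), not necessarily the lecture's exact presentation:

```python
def viterbi_decode(received, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Hard-decision Viterbi decoding for K = 3; received is a list of
    (c1, c2) pairs. The tail bits force the final state back to 00."""
    def step(state, m):
        w = (m,) + state                     # m[n], m[n-1], m[n-2]
        c1 = sum(a * b for a, b in zip(g1, w)) % 2
        c2 = sum(a * b for a, b in zip(g2, w)) % 2
        return (c1, c2), (m, state[0])       # output pair, next state

    paths = {(0, 0): (0, [])}                # state -> (path metric, message bits)
    for r in received:
        new_paths = {}
        for state, (dist, bits) in paths.items():
            for m in (0, 1):
                out, nxt = step(state, m)
                d = dist + (out[0] != r[0]) + (out[1] != r[1])
                if nxt not in new_paths or d < new_paths[nxt][0]:
                    new_paths[nxt] = (d, bits + [m])   # keep the better survivor
        paths = new_paths
    return paths[(0, 0)][1]                  # survivor ending in state 00

rx = [(0, 1), (0, 1), (0, 0), (0, 0), (0, 0)]   # received pairs with 2 error bits
print(viterbi_decode(rx))                        # [0, 0, 0, 0, 0]
```

Both error bits are corrected: the minimum-distance survivor ending in state 00 is the all-zero message, matching the slides' hand computation (final distance 2).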
Exercise

Can we decode the message bits correctly based on the received bits? Please work it out

58
Outline

q Convolutional code: How to encode


q Convolutional code: How to use Viterbi algorithm for decoding (optional)
Ø Viterbi algorithm for convolutional code will not be in quiz
Ø But, there will be a bonus project about this algorithm (details shown next
Tuesday)

59
