Information Theory and Coding-Autonomous
Uncertainty, Information and Entropy – Source Coding Theorem – Huffman Coding – Shannon-Fano
Coding – Discrete Memoryless Channels – Channel Capacity – Channel Coding Theorem –
Channel Capacity Theorem.
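Since the unit lists Huffman coding, a minimal illustrative sketch may help; it is not part of the syllabus text, and the example alphabet and probabilities are assumptions chosen for demonstration:

```python
# Illustrative sketch: building a Huffman code with a min-heap of
# (probability, tie-breaker, partial-code) entries.
import heapq

def huffman_code(probs):
    """Return a prefix-free code {symbol: bitstring} for a
    symbol -> probability mapping."""
    heap = [(p, i, [(sym, "")]) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    tie = len(heap)
    while len(heap) > 1:
        p0, _, left = heapq.heappop(heap)    # two least-probable subtrees
        p1, _, right = heapq.heappop(heap)
        merged = ([(s, "0" + c) for s, c in left] +
                  [(s, "1" + c) for s, c in right])
        heapq.heappush(heap, (p0 + p1, tie, merged))
        tie += 1
    return dict(heap[0][2])

code = huffman_code({"a": 0.4, "b": 0.3, "c": 0.2, "d": 0.1})
```

For the probabilities above the average codeword length is 1.9 bits against a source entropy of about 1.85 bits, illustrating the bound stated by the source coding theorem.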
COMPRESSION TECHNIQUES
Principles – Arithmetic coding – Image Compression – Graphics Interchange format – Tagged
Image File Format – Introduction to JPEG standards.
References:
1. Simon Haykin, “Communication Systems”, 4th Edition, John Wiley and Sons, 2001.
2. Fred Halsall, “Multimedia Communications: Applications, Networks, Protocols and Standards”, Pearson Education Asia, 2002.
3. Mark Nelson, “The Data Compression Book”, BPB Publication, 1992.
4. Watkinson J, “Compression in Video and Audio”, Focal Press, London, 1995.
5. Andre Neubauer, Jurgen Freudenberger, Volker Kuhn, “Coding Theory: Algorithms, Architectures and Applications”, John Wiley & Sons Ltd, Reprint 2012.
6. R. Bose, “Information Theory, Coding and Cryptography”, TMH, 2007.
7. Robert H. Morelos-Zaragoza, “The Art of Error Correcting Coding”, Second Edition, John Wiley & Sons Ltd, Reprint 2013.
8. R. Avudaiammal, “Information Coding Techniques”, 2nd Edition, Tata McGraw Hill Education Pvt. Ltd., 2010.
S.No  Topic                                                      SREC  PSG  NIT(S)  AU  IIT Madras  Karunya  University of Illinois
1     Uncertainty, Information and Entropy                       √ √ √ √ √ √ √
2     Source Coding Theorem                                      √ √ √ √ √ √ √
3     Huffman Coding                                             √ √ √ √ √
4     Shannon-Fano Coding                                        √ √ √ √ √
5     Discrete Memoryless Channels                               √ √ √ √ √ √ √
6     Channel Capacity                                           √ √ √ √ √ √ √
7     Channel Coding Theorem                                     √ √ √ √ √ √ √
8     Channel Capacity Theorem                                   √ √ √ √ √ √ √
9     Differential Pulse Code Modulation                         √ √
10    Adaptive Differential Pulse Code Modulation                √ √
11    Adaptive Subband Coding                                    √ √
12    Delta Modulation                                           √ √
13    Coding of speech signal at low bit rates (Vocoders, LPC)   √ √
14    Linear Block Codes – Syndrome Decoding                     √ √ √ √
15    Minimum Distance Consideration                             √ √ √ √
16    Cyclic Codes                                               √ √ √ √ √
17    Generator Polynomial                                       √ √ √ √ √
18    Parity-Check Polynomial                                    √ √ √ √ √
19    Encoder for Cyclic Codes                                   √ √ √ √ √
20    Calculation of Syndrome                                    √ √ √ √ √
21    Convolutional Codes                                        √ √ √ √
22    Principles – Arithmetic Coding                             √ √
23    Image Compression                                          √ √
24    Graphics Interchange Format                                √ √
25    Tagged Image File Format                                   √ √
26    Introduction to JPEG Standards                             √ √
27    Linear Predictive Coding                                   √ √
28    Code-Excited LPC                                           √ √
29    Perceptual Coding, MPEG Audio Coders                       √ √
30    Dolby Audio Coders                                         √ √
31    Video Compression – Principles                             √ √
32    Introduction to H.261 & MPEG Video Standards               √ √
NIT Surathkal – EC382
INFORMATION THEORY AND CODING (3-0-0) 3
Communication systems and Information Theory, Measures of Information,
Coding for Discrete sources, Discrete memoryless channels and capacity, Noisy
channel coding theorem, Techniques for coding and decoding, Waveform
channels, Source coding with Fidelity criterion.
Course No : EE5142
Credits : 4
Syllabus :
Entropy, Joint Entropy and Conditional Entropy, Relative Entropy and Mutual Information,
Chain Rules, Data-Processing Inequality, Fano’s Inequality
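The quantities listed above can be checked numerically. A small sketch follows; the joint distribution is an assumption chosen purely for illustration:

```python
# Illustrative sketch: entropy and mutual information for a small joint pmf.
from math import log2

joint = {("0", "0"): 0.5, ("0", "1"): 0.25, ("1", "0"): 0.0, ("1", "1"): 0.25}

def H(probs):
    """Shannon entropy in bits, skipping zero-probability outcomes."""
    return -sum(p * log2(p) for p in probs if p > 0)

# marginal distributions of X and Y
px, py = {}, {}
for (x, y), p in joint.items():
    px[x] = px.get(x, 0) + p
    py[y] = py.get(y, 0) + p

Hxy = H(joint.values())              # joint entropy H(X,Y)
Hx, Hy = H(px.values()), H(py.values())
I = Hx + Hy - Hxy                    # mutual information I(X;Y)
```

For this pmf, I(X;Y) is strictly positive and bounded above by min(H(X), H(Y)), consistent with the identities listed above.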
4) Channel Capacity:
Symmetric Channels, Properties of Channel Capacity, Jointly Typical Sequences, Channel
Coding Theorem, Fano’s Inequality and the Converse to the Coding Theorem
Differential Entropy, AEP for Continuous Random Variables, Properties of Differential Entropy,
Relative Entropy, and Mutual Information, Coding Theorem for Gaussian Channels
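The coding theorem for Gaussian channels yields the familiar capacity formula C = B log2(1 + S/N). A quick worked sketch, with bandwidth and SNR values assumed for illustration:

```python
# Illustrative sketch: Shannon capacity of a band-limited AWGN channel.
from math import log2

def awgn_capacity(bandwidth_hz, snr_linear):
    """Channel capacity C = B * log2(1 + S/N), in bits per second."""
    return bandwidth_hz * log2(1 + snr_linear)

# A 3 kHz telephone channel at 30 dB SNR (linear SNR = 1000):
C = awgn_capacity(3000.0, 1000.0)   # about 29.9 kbit/s
```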
7) Convolutional Codes:
Encoder Realizations and Classifications, Minimal Encoders, Trellis representation, MLSD and
the Viterbi Algorithm, Bit-wise MAP Decoding and the BCJR Algorithm
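As an illustration of MLSD via the Viterbi algorithm, here is a hedged sketch for the standard rate-1/2, constraint-length-3 convolutional code with octal generators (7, 5); the encoder, message, and error position are assumptions chosen for demonstration:

```python
# Illustrative sketch: hard-decision Viterbi decoding with Hamming-distance
# branch metrics for the (7, 5) rate-1/2 convolutional code.
G = (0b111, 0b101)  # generator polynomials

def encode(bits):
    """Convolutional encoder; state = last two input bits, zero-initialised."""
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state
        out += [bin(reg & g).count("1") % 2 for g in G]
        state = reg >> 1
    return out

def viterbi(received):
    """ML sequence decoding over the 4-state trellis."""
    INF = float("inf")
    metric = {0: 0, 1: INF, 2: INF, 3: INF}   # path metric per state
    path = {s: [] for s in metric}            # survivor input sequence
    for i in range(0, len(received), 2):
        r = received[i:i + 2]
        new_metric = {s: INF for s in metric}
        new_path = {}
        for state in metric:
            if metric[state] == INF:
                continue
            for b in (0, 1):                  # hypothesised input bit
                reg = (b << 2) | state
                out = [bin(reg & g).count("1") % 2 for g in G]
                nxt = reg >> 1
                m = metric[state] + sum(o != x for o, x in zip(out, r))
                if m < new_metric[nxt]:
                    new_metric[nxt] = m
                    new_path[nxt] = path[state] + [b]
        metric, path = new_metric, new_path
    best = min(metric, key=metric.get)        # survivor with best metric
    return path[best]

msg = [1, 0, 1, 1, 0, 0]     # two tail zeros flush the encoder
rx = encode(msg)
rx[3] ^= 1                   # inject a single channel bit error
```

Since this code has free distance 5, the single injected error is corrected and viterbi(rx) recovers msg.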
Text Books :
References :
Karunya University
Course Outcomes
The students understand the basics of information theory.
The students gain knowledge to calculate channel capacity and other information measures.
The students understand the source coding techniques for text, audio and speech.
The students understand the formats of image compression.
The students understand the concepts of error control techniques.
The students analyze and apply specific coding methods to calculate the rate and error probabilities.
Unit II - Source Coding: Text, Audio and Speech: Introduction to text coding, Adaptive Huffman
coding, Arithmetic coding, LZW algorithm - Introduction to Audio coding, Perceptual coding,
Masking techniques, Psychoacoustic model, MPEG audio layers I, II, III – Dolby AC3
– Introduction to Speech Coding, Channel vocoder, Linear Predictive Coding.
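The LZW algorithm named in Unit II can be sketched compactly; the input string below is the classic demonstration text, not from the syllabus:

```python
# Illustrative sketch: LZW compression of a byte string.  The dictionary is
# seeded with all single bytes; each emitted code indexes a stored phrase.
def lzw_compress(data):
    table = {bytes([i]): i for i in range(256)}
    phrase, out = b"", []
    for byte in data:
        candidate = phrase + bytes([byte])
        if candidate in table:
            phrase = candidate            # grow the current phrase
        else:
            out.append(table[phrase])     # emit code for longest match
            table[candidate] = len(table) # learn the new phrase
            phrase = bytes([byte])
    if phrase:
        out.append(table[phrase])
    return out

codes = lzw_compress(b"TOBEORNOTTOBEORTOBEORNOT")
```

The 24-byte input compresses to 16 codes because repeated phrases such as TO, BE, and OR are replaced by dictionary indices as they recur.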
Unit III - Source Coding: Image and Video: Image and video formats – GIF, TIFF, SIF, CIF and
QCIF – Image compression – READ, JPEG – Video compression – Principles, I, B, P frames –
Motion estimation – Motion compensation – H.261 standard, MPEG standard
Unit IV - Error Control Coding: Block Codes: Introduction to error control coding –
Definition and principles – Hamming weight, Hamming distance, minimum distance decoding
– Single parity codes, Hamming codes, repetition codes – Linear block codes – Cyclic codes
– Syndrome calculation – Encoder and decoder – CRC
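The syndrome calculation and single-error correction listed in Unit IV can be illustrated with the (7,4) Hamming code; the parity-check matrix ordering and the test word are assumptions chosen for demonstration:

```python
# Illustrative sketch: syndrome decoding for the (7,4) Hamming code.
# Columns of H are the binary numbers 1..7, so a nonzero syndrome,
# read as a binary number, directly names the error position.
H = [
    [0, 0, 0, 1, 1, 1, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [1, 0, 1, 0, 1, 0, 1],
]

def syndrome(word):
    """Compute s = H * word^T over GF(2)."""
    return [sum(h * w for h, w in zip(row, word)) % 2 for row in H]

def correct(word):
    """Flip the single bit position indicated by a nonzero syndrome."""
    s = syndrome(word)
    pos = s[0] * 4 + s[1] * 2 + s[2]
    if pos:
        word = word[:]
        word[pos - 1] ^= 1      # positions are numbered from 1
    return word

codeword = [1, 1, 1, 1, 1, 1, 1]   # the all-ones word is a valid codeword
received = codeword[:]
received[4] ^= 1                   # single-bit channel error
```

Here correct(received) recovers the transmitted codeword, since every single-bit error produces a distinct nonzero syndrome.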
Text Book:
1. Andre Neubauer, Jurgen Freudenberger, Volker Kuhn, “Coding Theory: Algorithms, Architectures and Applications”, John Wiley & Sons Ltd, Reprint 2012.
Reference Books:
1. Robert H. Morelos-Zaragoza, “The Art of Error Correcting Coding”, Second Edition, John Wiley & Sons Ltd, Reprint 2013.
2. R. Avudaiammal, “Information Coding Techniques”, 2nd Edition, Tata McGraw Hill Education Pvt. Ltd., 2010.
3. R. Bose, “Information Theory, Coding and Cryptography”, TMH, 2007.
4. Fred Halsall, “Multimedia Communications: Applications, Networks, Protocols and Standards”, Pearson Education Asia, 2002.
5. K. Sayood, “Introduction to Data Compression”, 3/e, Elsevier, 2006.
6. S. Gravano, “Introduction to Error Control Codes”, Oxford University Press, 2007.
7. Amitabha Bhattacharya, “Digital Communication”, TMH, 2006.
University of Illinois
Department of Computer science
Topics: Entropy and differential entropy, mutual information; data compression; channel capacity, the
Gaussian channel; rate-distortion; universal source coding; network information theory. Contemporary
examples and research topics.