II-II Ece Acs-Unit 5
COURSE MATERIAL
ANALOG COMMUNICATIONS
SUBJECT (19A04403T)
UNIT 5
COURSE B.TECH
SEMESTER II-II
Version V-1
TABLE OF CONTENTS – UNIT 5
S. NO CONTENTS
1 COURSE OBJECTIVES
2 PREREQUISITES
3 SYLLABUS
4 COURSE OUTCOMES
5 CO-PO/PSO MAPPING
6 LESSON PLAN
7 ACTIVITY BASED LEARNING
8 LECTURE NOTES
5.1 INTRODUCTION
5.2 INFORMATION
5.3 INFORMATION AND UNCERTAINTY
5.4 ENTROPY
5.5 INFORMATION RATE
5.6 JOINT AND CONDITIONAL ENTROPY
5.7 MUTUAL INFORMATION
5.8 DISCRETE MEMORYLESS CHANNEL
5.9 TYPES OF CHANNELS
5.10 CHANNEL CAPACITY
5.11 CHANNEL CODING THEOREM
5.12 CHANNEL CAPACITY THEOREM
5.13 SOURCE CODING THEOREM
9 PRACTICE QUIZ
10 ASSIGNMENTS
11 PART A QUESTIONS & ANSWERS (2 MARKS QUESTIONS)
12 PART B QUESTIONS
13 SUPPORTIVE ONLINE CERTIFICATION COURSES
14 REAL TIME APPLICATIONS
15 CONTENTS BEYOND THE SYLLABUS
16 PRESCRIBED TEXT BOOKS & REFERENCE BOOKS
17 MINI PROJECT SUGGESTION
1. Course Objectives
The objectives of this course are to
1. Introduce various modulation and demodulation techniques of analog communication systems.
2. Analyze different parameters of analog communication techniques.
3. Know the noise figure of AM and FM receiver systems.
4. Understand the function of the various stages of AM and FM transmitters and know the characteristics of AM and FM receivers.
5. Understand the concepts of information theory.
2. Prerequisites
Students should have knowledge of
1. Probability
2. Random processes
3. Syllabus
UNIT V
Information Theory: Introduction, Information and Entropy and its properties, Source Coding Theorem, Data Compaction – Prefix Coding, Huffman Coding, Discrete Memoryless Channels, Mutual Information and its properties, Channel Capacity, Channel Coding Theorem, Application to Binary Symmetric Channels, Differential Entropy and Mutual Information, Information Capacity Theorem, Implications of the Information Capacity Theorem, Rate Distortion, Illustrative Problems.
4. Course Outcomes
1. Understand the concepts of various amplitude, angle and pulse modulation schemes, and the concepts of information theory with random processes. (L1)
2. Apply the concepts to solve problems in analog and pulse modulation schemes. (L2)
3. Analyze an analog communication system in the presence of noise. (L3)
4. Compare and contrast the design issues, advantages, disadvantages and limitations of various modulation schemes in analog communication systems. (L4)
5. Solve basic communication problems and calculate the information rate and channel capacity of a discrete communication channel. (L5)
5. CO-PO/PSO Mapping
CO2 3 3 2 2
CO3 3 3 2 2
CO4 3 3 2 2
CO5 3 3 2 2
6. Lesson Plan
8. Lecture Notes
5.1 INTRODUCTION
In the context of communication, information theory deals with the mathematical modeling and analysis of a communication system rather than with physical sources and physical channels.
In particular, it provides answers to two fundamental questions,
i. What is the irreducible complexity below which a signal cannot be
compressed?
ii. What is the ultimate transmission rate for reliable communication over a
noisy channel?
The answers to these questions lie in the entropy of the source and the capacity of the channel, respectively.
• Entropy is defined in terms of the probabilistic behavior of a source of information.
5.2 INFORMATION
Information is what the source of a communication system produces, whether the system is analog or digital. The information associated with an event is related to its probability of occurrence. The unit of information is the bit.
Based on memory, information sources can be classified as follows:
i. A source with memory is one for which each current symbol depends on the previous symbols.
ii. A memoryless source is one for which each symbol produced is independent of the previous symbols.
iii. A Discrete Memoryless Source (DMS) can be characterized by the list of symbols, the probability assignment to these symbols, and the specification of the rate at which the source generates these symbols.
Consider a probabilistic experiment that involves the observation of the output emitted by a discrete source during every unit of time (signaling interval). The source output is modeled as a discrete random variable S, which takes on symbols from a fixed finite alphabet
S = {s0, s1, s2, …, sK−1}
with probabilities
P(S = sk) = pk, k = 0, 1, 2, …, K−1.
We assume that the symbols emitted by the source during successive signaling
intervals are statistically independent.
A source having the above-described properties is called a Discrete Memoryless Source (DMS).
5.3 INFORMATION AND UNCERTAINTY
The amount of information gained by observing the event S = sk, which occurs with probability pk, is defined by the logarithmic measure
I(sk) = log2(1/pk) bits.
i. Sun rises in the East: here the uncertainty is zero and there is no surprise in the statement, because the probability of occurrence is 1 (pk = 1), so no information is conveyed.
ii. Sun does not rise in the East: here the uncertainty is maximum, because the probability of occurrence is practically zero; such an event, if it occurred, would convey the maximum possible surprise (information).
5.3.1. Properties
I. If we are absolutely certain of the outcome of an event, even before it occurs, there is no information gained: I(sk) = 0 for pk = 1.
II. The occurrence of an event either provides some information or none, but never causes a loss of information: I(sk) ≥ 0 for 0 ≤ pk ≤ 1.
III. The less probable an event is, the more information we gain when it occurs: I(sk) > I(si) for pk < pi.
IV. I(sk si) = I(sk) + I(si) if sk and si are statistically independent; this additive property follows from the logarithmic definition of I(sk).
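To make the logarithmic measure concrete, the short Python sketch below evaluates I(sk) = log2(1/pk) for a few probability values; the probabilities are chosen only for illustration and are not taken from the notes.

```python
import math

def self_information(p):
    """Amount of information I(s_k) = log2(1/p_k), in bits, for an event of probability p."""
    return math.log2(1.0 / p)

# Illustrative probabilities: a certain event, a likely event and a rare event.
for p in (1.0, 0.5, 0.125):
    print(f"p = {p:5.3f}  ->  I = {self_information(p):.3f} bits")
# p = 1 gives 0 bits (no surprise); p = 0.125 gives 3 bits (the most surprise of the three).
```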
5.4 ENTROPY
The entropy of a discrete memoryless source is the mean value of I(sk) over the source alphabet, i.e., the average information content per source symbol:
H(S) = Σk pk log2(1/pk), k = 0, 1, …, K−1,
where K is the number of symbols in the alphabet S. The entropy is bounded as 0 ≤ H(S) ≤ log2 K.
Property 1: H(S) = 0 if, and only if, the probability pk = 1 for some k, and the remaining probabilities in the set are all zero; this lower bound on entropy corresponds to no uncertainty.
Property 2: H(S) = log2 K if, and only if, pk = 1/K for all k (i.e., all the symbols in the source alphabet are equiprobable); this upper bound on entropy corresponds to maximum uncertainty.
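A minimal Python sketch that evaluates the entropy formula and checks the two bounds stated above; the probability vectors are illustrative only.

```python
import math

def entropy(probs):
    """Entropy H(S) = sum of p_k*log2(1/p_k), in bits/symbol (terms with p_k = 0 contribute nothing)."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

K = 4
print(entropy([1.0, 0.0, 0.0, 0.0]))         # Property 1: a certain symbol -> H(S) = 0
print(entropy([1 / K] * K), math.log2(K))    # Property 2: equiprobable symbols -> H(S) = log2 K = 2
print(entropy([0.5, 0.25, 0.125, 0.125]))    # a general case: 0 <= H(S) <= log2 K (here 1.75 bits)
```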
5.6 JOINT AND CONDITIONAL ENTROPY
For a channel with input X and output Y, the following notation is used:
H(X) – average uncertainty of the channel input
H(Y) – average uncertainty of the channel output
H(X/Y) and H(Y/X) – conditional entropies
H(X, Y) – joint entropy
P(xi) – input probability
P(yj) – output probability
P(xi/yj), P(yj/xi) – conditional probabilities
P(xi, yj) – joint probability
Relationship between Conditional and Joint Entropy
1. H(X, Y) = H(X/Y) + H(Y)
2. H(X, Y) = H(Y/X) + H(X)
Proof: these relations follow from the chain rule of probability, P(x, y) = P(x/y)P(y) = P(y/x)P(x), substituted into the definition of the joint entropy.
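The relation H(X, Y) = H(X/Y) + H(Y) can also be checked numerically. The sketch below builds a small joint probability table P(x, y) (the values are illustrative, not taken from the notes) and compares the two sides.

```python
import math

def H(probs):
    """Entropy of a probability list, in bits."""
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Illustrative joint probability table P(x, y) for a channel with two inputs and two outputs.
P = [[0.3, 0.1],
     [0.2, 0.4]]

p_y = [sum(P[i][j] for i in range(2)) for j in range(2)]        # marginal P(y)
H_XY = H([P[i][j] for i in range(2) for j in range(2)])         # joint entropy H(X, Y)
# Conditional entropy H(X/Y) = sum over y of P(y) * H(X given Y = y)
H_X_given_Y = sum(p_y[j] * H([P[i][j] / p_y[j] for i in range(2)]) for j in range(2))

print(H_XY, H_X_given_Y + H(p_y))   # both values agree: H(X, Y) = H(X/Y) + H(Y)
```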
5.7 MUTUAL INFORMATION
B) Expansion of the mutual information property
The mutual information of a channel is related to the joint entropy of the channel input and channel output by
I(X; Y) = H(X) + H(Y) − H(X, Y).
5.8 DISCRETE MEMORYLESS CHANNEL
Given the a priori probabilities p(xi) with which the channel input X = xi occurs, and the channel matrix of transition probabilities p(yj/xi), we may calculate the probabilities of the output symbols as
p(yj) = Σi p(yj/xi) p(xi).
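A small sketch of this computation; the channel matrix and the input probabilities below are illustrative values, not taken from the notes.

```python
# Rows correspond to inputs x_i, columns to outputs y_j; each row of the channel matrix sums to 1.
P_y_given_x = [[0.9, 0.1],
               [0.2, 0.8]]
p_x = [0.6, 0.4]   # a priori input probabilities (illustrative)

# p(y_j) = sum over i of p(y_j / x_i) * p(x_i)
p_y = [sum(p_x[i] * P_y_given_x[i][j] for i in range(len(p_x)))
       for j in range(len(P_y_given_x[0]))]
print(p_y)         # about [0.62, 0.38] -> probabilities of the output symbols
```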
5.9. TYPES OF CHANNELS
LOSSLESS CHANNELS
A channel described by a channel matrix with only one non-zero element in each column is called a lossless channel; in a lossless channel no source information is lost in transmission.
DETERMINISTIC CHANNEL
A channel described by a channel matrix with one non-zero element in each row is called a deterministic channel.
NOISELESS CHANNEL
A channel is called a noiseless channel if it is both lossless and deterministic. The channel matrix of a noiseless channel is an identity matrix.
5.10 CHANNEL CAPACITY
For a binary symmetric channel, the entropy H(X) is maximized when the channel input probabilities are p(x0) = p(x1) = 1/2, where x0 and x1 are the symbols 0 and 1.
The mutual information I(X; Y) is similarly maximized with respect to the input probabilities, so that the capacity is attained for equiprobable inputs.
The capacity of a Binary Symmetric Channel (BSC) with transition (crossover) probability p is
C = 1 + p log2 p + (1 − p) log2(1 − p) = 1 − H(p) bits per channel use,
where H(p) is the binary entropy function.
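A short sketch of this formula; the crossover probabilities used below are illustrative values, not taken from the notes.

```python
import math

def bsc_capacity(p):
    """Capacity of a binary symmetric channel, C = 1 - H(p) bits per channel use."""
    if p in (0.0, 1.0):
        return 1.0                                          # a deterministic channel is noiseless
    Hp = -p * math.log2(p) - (1 - p) * math.log2(1 - p)     # binary entropy function H(p)
    return 1.0 - Hp

print(bsc_capacity(0.1))   # about 0.531 bits per channel use
print(bsc_capacity(0.5))   # 0.0 -- a BSC with p = 0.5 conveys no information
```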
The channel capacity C is a function only of the transition probabilities P(yj/xi), which define the channel. The calculation of the channel capacity C involves maximization of the average mutual information I(X; Y) over the K input probabilities, subject to
P(xi) ≥ 0 for all i, and Σi P(xi) = 1.
The transmission efficiency (or channel efficiency) is expressed as
η = I(X; Y) / max I(X; Y) = I(X; Y) / C.
The redundancy of the channel is expressed as
R = 1 − η = 1 − I(X; Y) / C.
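The sketch below ties these definitions together: it computes I(X; Y) = H(Y) − H(Y/X) for an illustrative symmetric channel, takes C as the mutual information obtained with equiprobable inputs (which maximizes I(X; Y) for a symmetric channel), and then evaluates the efficiency η and redundancy R. The channel matrix and input distribution are assumptions chosen only for illustration.

```python
import math

def H(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

def mutual_information(p_x, P):
    """I(X;Y) = H(Y) - H(Y/X) for input distribution p_x and channel matrix P[x][y] = p(y/x)."""
    p_y = [sum(p_x[i] * P[i][j] for i in range(len(p_x))) for j in range(len(P[0]))]
    H_Y_given_X = sum(p_x[i] * H(P[i]) for i in range(len(p_x)))
    return H(p_y) - H_Y_given_X

P = [[0.9, 0.1],    # illustrative binary symmetric channel with crossover probability 0.1
     [0.1, 0.9]]
C   = mutual_information([0.5, 0.5], P)   # capacity: symmetric channel, equiprobable inputs
I   = mutual_information([0.7, 0.3], P)   # actual mutual information for a skewed input distribution
eta = I / C                               # transmission (channel) efficiency
print(C, I, eta, 1 - eta)                 # the last value is the redundancy R = 1 - eta
```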
5.11. CHANNEL CODING THEOREM
The design goal of channel coding is to increase the resistance of a digital
communication system to channel noise. Specifically, channel coding consists of
mapping the incoming data sequence into a channel input sequence and
inverse mapping the channel output sequence into an output data sequence in
such a way that the overall effect of channel noise on the system is minimized.
The mapping operation is performed in the transmitter by a channel encoder, whereas the inverse mapping operation is performed in the receiver by a channel decoder, as shown in the block diagram of the figure. For simplicity, the source encoding (before channel encoding) and source decoding (after channel decoding) processes are not included in this figure.
The channel coding theorem for a discrete memoryless channel is stated as follows. Let a discrete memoryless source with entropy H(S) produce symbols once every Ts seconds, and let a discrete memoryless channel with capacity C be used once every Tc seconds. Then:
I) If H(S)/Ts ≤ C/Tc, there exists a coding scheme for which the source output can be transmitted over the channel and be reconstructed with an arbitrarily small probability of error. The parameter C/Tc is called the critical rate; when H(S)/Ts = C/Tc, the system is said to be signalling at the critical rate.
II) Conversely, if H(S)/Ts > C/Tc, it is not possible to transmit information over the channel and reconstruct it with an arbitrarily small probability of error.
For a binary symmetric channel, the ratio Tc/Ts equals the code rate of the encoder, denoted by r, so the condition above reduces to r ≤ C. If r ≤ C, there exists a code capable of achieving an arbitrarily low probability of error; conversely, it is not possible to find such a code if the code rate r is greater than the channel capacity C.
Limitations
i. It does not show us how to construct a good code.
ii. The theorem does not give a precise value for the probability of symbol error after decoding the channel output.
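As a sketch of how the condition r ≤ C from the theorem is applied to a binary symmetric channel, the snippet below compares a few candidate code rates against the capacity; the crossover probability and the code rates are illustrative values only.

```python
import math

def bsc_capacity(p):
    """C = 1 - H(p) for a binary symmetric channel with crossover probability p."""
    Hp = -p * math.log2(p) - (1 - p) * math.log2(1 - p)
    return 1.0 - Hp

p = 0.01                        # illustrative crossover probability
C = bsc_capacity(p)             # about 0.919 bits per channel use
for r in (1 / 3, 1 / 2, 0.95):  # candidate code rates (e.g. a rate-1/3 repetition code)
    verdict = "possible" if r <= C else "not guaranteed"
    print(f"r = {r:.3f}: arbitrarily reliable coding {verdict} (C = {C:.3f})")
```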
MAXIMUM ENTROPY FOR A GAUSSIAN CHANNEL
The probability density function of a zero-mean Gaussian random variable with variance σ² is
p(x) = (1/√(2πσ²)) exp(−x²/(2σ²)).
The maximum differential entropy is
h(X) = (1/2) log2(2πeσ²) bits per sample.
From the above results we observe that, for a given variance, the entropy is maximum when p(x) is a zero-mean Gaussian probability density function.
5.12. CHANNEL (INFORMATION) CAPACITY THEOREM OR SHANNON’S THEOREM
The information capacity C of a continuous channel of bandwidth B hertz, perturbed by AWGN of total noise power N within the channel bandwidth, is given by the formula
C = B log2(1 + S/N) bits per second,
where S is the average transmitted signal power.
PROOF
Let us consider the channel (information) capacity C as the maximum of the mutual information between the channel input and output, which can be written as the difference between the differential entropy of the channel output and the differential entropy of the noise within the channel bandwidth.
If the signal is band-limited to B hertz, it is sampled at the Nyquist rate of 2B samples per second; each sample carries at most (1/2) log2(1 + S/N) bits, so the capacity is C = 2B × (1/2) log2(1 + S/N) = B log2(1 + S/N) bits per second.
If we substitute the noise power N = N0B, where N0 is the noise power spectral density, the channel (information) capacity becomes
C = B log2(1 + S/(N0B)) bits per second.
TRADE OFF
I. A noiseless channel has infinite capacity. If there is no noise in the channel, then N = 0 and S/N → ∞; hence the capacity C → ∞ even for a finite bandwidth.
II. Infinite bandwidth gives only a limited capacity. As B increases, the noise power N = N0B also increases, so the S/N ratio decreases. Hence, even as B approaches infinity, the capacity does not approach infinity but tends to the finite limit C∞ = (S/N0) log2 e ≈ 1.44 S/N0.
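A brief sketch of the Shannon–Hartley formula and of the bandwidth trade-off described above; the bandwidth, signal power and noise density values are assumptions chosen only for illustration.

```python
import math

def shannon_capacity(B, S, N0):
    """C = B*log2(1 + S/(N0*B)) bits/s for bandwidth B (Hz), signal power S and noise PSD N0."""
    return B * math.log2(1 + S / (N0 * B))

S, N0 = 1e-3, 1e-9                       # illustrative: 1 mW signal power, 1 nW/Hz noise density
for B in (1e3, 1e4, 1e5, 1e6, 1e7):
    print(f"B = {B:10.0f} Hz  ->  C = {shannon_capacity(B, S, N0):12.1f} bit/s")
# As B grows, C does not grow without bound; it approaches the finite limit (S/N0)*log2(e).
print("limit:", (S / N0) * math.log2(math.e))
```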
5.13. SOURCE CODING THEOREM
Efficient representation of data generated by a discrete source is accomplished
by some encoding process. The device that performs the representation is called
a source encoder.
An efficient source encoder must satisfy two functional requirements:
I) The codewords produced by the encoder are in binary form.
II) The source code is uniquely decodable, so that the original source sequence can be reconstructed perfectly from the encoded binary sequence.
According to the source coding theorem, the average codeword length is bounded below by the source entropy, L ≥ H(S), where the average codeword length is
L = Σk pk lk
with
L – average number of bits per source symbol
pk – probability of the k-th symbol
lk – number of bits assigned to the k-th symbol (codeword length)
Coding Efficiency (η)
The coding efficiency is defined as η = Lmin/L = H(S)/L. Its value is always less than or equal to 1; the source encoder is said to be efficient when η approaches unity.
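A minimal sketch that evaluates L, H(S) and η for one code; the symbol probabilities and codeword lengths are illustrative (they correspond to the prefix code 0, 10, 110, 111, an assumption for this example rather than a code worked out in the notes).

```python
import math

probs   = [0.5, 0.25, 0.125, 0.125]   # illustrative symbol probabilities
lengths = [1, 2, 3, 3]                # codeword lengths of an illustrative prefix code (0, 10, 110, 111)

L   = sum(p * l for p, l in zip(probs, lengths))             # average number of bits per symbol
H_S = sum(p * math.log2(1.0 / p) for p in probs if p > 0)    # source entropy
print(L, H_S, H_S / L)   # here L = H(S) = 1.75 bits/symbol, so the coding efficiency is 1
```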
The two popular coding techniques used in information theory are (i) Shannon–Fano coding and (ii) Huffman coding. The coding procedures are explained in the following sections.
SHANNON–FANO CODING
1. List the source symbols in order of decreasing probability.
2. Partition the set into two sets that are as close to equiprobable as possible, and assign 0 to the upper set and 1 to the lower set.
3. Repeat step 2 for each part, until all the symbols are split into individual subgroups.
4. Assign a code to each symbol by reading off the binary digits obtained in each stage.
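A compact recursive sketch of the procedure described above; the source symbols and probabilities are illustrative, and ties in the partitioning step are resolved by the simple scan used here, so a hand construction may give a slightly different (but equally efficient) code.

```python
def shannon_fano(symbols):
    """symbols: list of (symbol, probability) pairs sorted in decreasing order of probability.
    Returns a dict mapping each symbol to its binary codeword."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total, running, split, best = sum(p for _, p in symbols), 0.0, 1, float("inf")
    for i in range(1, len(symbols)):            # find the most nearly equiprobable partition point
        running += symbols[i - 1][1]
        if abs(2 * running - total) < best:
            best, split = abs(2 * running - total), i
    codes = {}
    for prefix, part in (("0", symbols[:split]), ("1", symbols[split:])):
        for sym, code in shannon_fano(part).items():
            codes[sym] = prefix + code
    return codes

src = [("A", 0.4), ("B", 0.3), ("C", 0.15), ("D", 0.15)]   # illustrative source
print(shannon_fano(src))   # {'A': '0', 'B': '10', 'C': '110', 'D': '111'}
```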
HUFFMAN CODING
1. List the source symbols (messages) in order of decreasing probability.
2. The two messages of lowest probability are assigned 0 and 1; i.e., combine the probabilities of the two symbols having the lowest probabilities, and reorder the resultant probabilities. This step is called reduction 1.
3. The same procedure is repeated until only two ordered probabilities remain.
4. Start encoding with the last reduction, which consists of exactly two ordered probabilities: (i) assign 0 as the first digit in the codewords for all the source symbols associated with the first probability; (ii) assign 1 to the second probability. Then work backwards through the reductions, appending 0 and 1 to the codewords of the two probabilities that were combined at each step.
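A short sketch of the Huffman procedure using a priority queue; the source probabilities are the ones used in Part B question 2 later in this unit, and with ties broken differently the individual codewords may differ while the average length stays the same.

```python
import heapq

def huffman(probs):
    """probs: dict {symbol: probability}. Returns {symbol: codeword} for a binary Huffman code."""
    # Heap items are (probability, tie-breaker, {symbol: partial codeword}).
    heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)                 # the two lowest probabilities...
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}    # ...are assigned 0 and 1 and combined
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, counter, merged))
        counter += 1
    return heap[0][2]

src = {"x1": 0.45, "x2": 0.35, "x3": 0.20}              # source of Part B question 2
codes = huffman(src)
print(codes, sum(src[s] * len(codes[s]) for s in src))  # e.g. {'x1': '0', 'x3': '10', 'x2': '11'}, 1.55
```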
9. Practice Quiz
1. The capacity of Gaussian channel is?
a) C = 2B log2 (1+S/N) bits/s
b) C = 2B log2 (1-S/N) bits/s
c)C = B log2 (1+S/N) bits/s
d) C = 2B log2 (1-S/2N) bits/s
2. For M equally likely messages, the average amount of information H is?
a)H = log10M
b) H = log2M
c)H = log10M2
d)H = 2log10M
3. The capacity of a binary symmetric channel, given H(P) is binary entropy function is?
a)1 - H(P)
b) H(P) - 1
c)1 - H(P)2
d) H(P)2 - 1
4. For M equally likely messages, M>>1, if the rate of information R ≤ C, the Probability of
error is?
a) Small
b)Close to unity
c) Not predictable
d) unknown
5. According to the Shannon–Hartley theorem:
a)The channel capacity becomes infinite with infinite bandwidth
b)The channel capacity does not become infinite with infinite bandwidth
c)Has a tradeoff between bandwidth and Signal to noise ratio
d)Both b and c are correct
10. Assignments
S.No Question BL CO
1 Give the significance of Shannon's channel capacity theorem. 2 1
2 Consider the BSC with P(x1) = as shown in the figure below. Show that the mutual information is I(X; Y) = H(Y) + p log p + (1 − p) log(1 − p). 2 1
[Figure: BSC transition diagram with crossover probability p and correct-transmission probability (1 − p)]
11. Part A Questions & Answers (2 Marks Questions)
i.e., Entropy, H = (Total information)/(Number of messages).
2. What is channel redundancy? (BL 1, CO 1)
Ans. Redundancy (γ) = 1 – code efficiency. Redundancy should be as low as possible.
3. Name the source coding techniques. (BL 1, CO 1)
Ans. The source coding techniques are a) prefix coding, b) Shannon–Fano coding and c) Huffman coding.
4. State any four properties of entropy. (BL 1, CO 1)
Ans. i) For a sure event or an impossible event the entropy is zero.
ii) For M equally likely symbols, the entropy is log2 M.
iii) The upper bound on entropy is Hmax = log2 M.
iv) Entropy is a lower bound on the average number of bits per symbol.
5. State the channel coding theorem for a discrete memoryless channel. (BL 1, CO 1)
Ans. Given a source of M equally likely messages, with M >> 1, generating information at a rate R, and a channel with capacity C: if R ≤ C, there exists a coding technique such that the output of the source may be transmitted over the channel with a probability of error in the received message which may be made arbitrarily small.
6. What is a memoryless source? Give an example. (BL 1, CO 1)
Ans. The alphabets emitted by a memoryless source do not depend upon the previous alphabets; every alphabet is independent. For example, a character generated by a keyboard represents a memoryless source.
7. Explain the significance of the entropy H(X/Y) of a communication system, where X is the transmitter and Y is the receiver. (BL 2, CO 1)
Ans. a) H(X/Y) is called the conditional entropy. It represents the uncertainty of X, on average, when Y is known.
b) In other words, H(X/Y) is an average measure of the uncertainty in X after Y is received.
c) H(X/Y) represents the information lost in the noisy channel.
8. What is a prefix code? (BL 2, CO 1)
Ans. In a prefix code, no codeword is the prefix of any other codeword. It is a variable-length code. The binary digits (codewords) are assigned to the messages as per their probabilities of occurrence.
9. What is information theory? (BL 2, CO 1)
Ans. Information theory deals with the mathematical modelling and analysis of a communication system rather than with physical sources and physical channels.
10. Define bandwidth efficiency. (BL 2, CO 1)
Ans. The ratio of channel capacity to bandwidth is called bandwidth efficiency, i.e., bandwidth efficiency = channel capacity (C)/bandwidth (B).
12. Part B Questions
S.No Question BL CO
1 Derive Shannon's channel capacity theorem. 1 1
2 A zero-memory source emits messages x1, x2, x3 with probabilities 0.45, 0.35 and 0.20 respectively. Find the optimum (Huffman) binary code for this source as well as for its 2nd-order extension (i.e., N = 2). Determine the code efficiency in each case. 2 1
3 Write a short note on the measure of information and entropy. 2 1
4 A discrete memoryless source has an alphabet of seven symbols with probabilities 0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625 respectively. Compute the Shannon–Fano code for this source; calculate the entropy, average codeword length and variance of this code. 3 1
5 Consider the binary input–output channel shown below. Find H(X), H(Y), H(X/Y), H(Y/X) and H(X, Y). 3 1
[Figure: channel transition diagram between inputs X1, X2, X3 and outputs Y1, Y2, Y3 with transition probabilities 0.8, 0.2, 1, 0.3 and 0.7]
14. Real Time Applications
S.No Application CO
1 Intelligence uses and secrecy applications 5
Information theoretic concepts apply to cryptography and cryptanalysis.
Turing's information unit was used in the Ultra project, breaking the
German Enigma machine code and hastening the end of World War II in
Europe. Shannon himself defined an important concept now called the
unicity distance.
2 Miscellaneous applications 5
Information theory also has applications in Gambling and information
theory, black holes and bioinformatics.
3 Semiotics 5
Semioticians Doede Nauta and Winfried Nöth both considered Charles
Sanders Peirce as having created a theory of information in his works on
semiotics. Nauta defined semiotic information theory as the study of "the
internal processes of coding, filtering, and information processing."
4 Seismic Exploration 5
One early commercial application of information theory was in the field
of seismic oil exploration. Work in this field made it possible to strip off and
separate the unwanted noise from the desired seismic signal. Information
theory and digital signal processing offer a major improvement of
resolution and image clarity over previous analog methods.
5 Pseudorandom Number Generation 5
Pseudorandom number generators are widely available in computer
language libraries and application programs. They are, almost
universally, unsuited to cryptographic use as they do not evade the
deterministic nature of modern computer equipment and software. A
class of improved random number generators is termed
cryptographically secure pseudorandom number generators, but even
they require random seeds external to the software to work as intended.
These can be obtained via extractors, if done carefully.
References:
1. Simon Haykin, “Communication Systems”, Wiley-India edition, 3rd edition, 2010.
2. Herbert Taub & Donald L. Schilling, "Principles of Communication Systems", Tata
McGraw-Hill, 3rd Edition, 2009.