Channel Capacity
Figure 3.16 Effect of Noise on a Digital Signal
[Figure: data transmitted 1 0 1 0 0 1 1 0 0 1 1 0 1 0 1; waveforms for the signal, the noise, and the signal plus noise; sampling times; data received 1 0 1 0 0 1 0 0 0 1 1 0 1 1 1, with the bits in error marked]
communication. For example, a sharp spike of energy of 0.01 s duration would not
destroy any voice data but would wash out about 560 bits of digital data being trans-
mitted at 56 kbps. Figure 3.16 is an example of the effect of noise on a digital signal.
Here the noise consists of a relatively modest level of thermal noise plus occasional
spikes of impulse noise. The digital data can be recovered from the signal by sam-
pling the received waveform once per bit time. As can be seen, the noise is occasion-
ally sufficient to change a 1 to a 0 or a 0 to a 1.
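The comparison in Figure 3.16 can be reproduced in a few lines. The sketch below uses the transmitted and received bit streams from the figure and reports the positions where the sampled value no longer matches the original (the bit-stream literals are taken from the figure; the variable names are illustrative):

```python
# Compare the transmitted and received bit streams of Figure 3.16 to
# locate the bit positions flipped by noise.
transmitted = "101001100110101"
received    = "101001000110111"

# 1-based positions where the sampled value differs from the original.
errors = [i + 1 for i, (t, r) in enumerate(zip(transmitted, received)) if t != r]
print(errors)  # -> [7, 14]
```

Two of the fifteen sampled bits are in error, matching the figure.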
We have seen that there are a variety of impairments that distort or corrupt a signal.
For digital data, the question that then arises is to what extent these impairments
limit the data rate that can be achieved. The maximum rate at which data can be
transmitted over a given communication path, or channel, under given conditions, is
referred to as the channel capacity.
There are four concepts here that we are trying to relate to one another.
• Data rate: The rate, in bits per second (bps), at which data can be communicated
• Bandwidth: The bandwidth of the transmitted signal as constrained by the transmitter and the nature of the transmission medium, expressed in cycles per second, or Hertz
• Noise: The average level of noise over the communications path
• Error rate: The rate at which errors occur, where an error is the reception of a 1 when a 0 was transmitted or the reception of a 0 when a 1 was transmitted
Nyquist Bandwidth
To begin, let us consider the case of a channel that is noise free. In this environ-
ment, the limitation on data rate is simply the bandwidth of the signal. A formu-
lation of this limitation, due to Nyquist, states that if the rate of signal
transmission is 2B, then a signal with frequencies no greater than B is sufficient
to carry the signal rate. The converse is also true: Given a bandwidth of B, the
highest signal rate that can be carried is 2B. This limitation is due to the effect of
intersymbol interference, such as is produced by delay distortion. The result is
useful in the development of digital-to-analog encoding schemes and is, in
essence, based on the same derivation as that of the sampling theorem, described
in Appendix F.
Note that in the preceding paragraph, we referred to signal rate. If the signals
to be transmitted are binary (two voltage levels), then the data rate that can be sup-
ported by B Hz is 2B bps. However, as we shall see in Chapter 5, signals with more
than two levels can be used; that is, each signal element can represent more than one
bit. For example, if four possible voltage levels are used as signals, then each signal
element can represent two bits. With multilevel signaling, the Nyquist formulation
becomes
C = 2B log2 M
EXAMPLE 3.3 Consider a voice channel being used, via modem, to transmit
digital data. Assume a bandwidth of 3100 Hz. Then the Nyquist capacity, C, of the
channel is 2B = 6200 bps. For M = 8, a value used with some modems, C
becomes 18,600 bps for a bandwidth of 3100 Hz.
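The Nyquist calculation in Example 3.3 is easy to check numerically. A minimal sketch (the function name is illustrative):

```python
import math

def nyquist_capacity(bandwidth_hz: float, levels: int) -> float:
    """Nyquist capacity C = 2B log2(M) for a noise-free channel."""
    return 2 * bandwidth_hz * math.log2(levels)

# Voice channel from Example 3.3: B = 3100 Hz.
print(nyquist_capacity(3100, 2))  # binary signaling -> 6200.0 bps
print(nyquist_capacity(3100, 8))  # M = 8 levels     -> 18600.0 bps
```

Doubling the number of levels from 8 to 16 would add only one more bit per signal element, so capacity grows logarithmically in M.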
SNR_dB = 10 log10 (signal power / noise power)
This expresses the amount, in decibels, that the intended signal exceeds the noise
level. A high SNR will mean a high-quality signal and a low number of required
intermediate repeaters.
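The decibel conversion is a one-liner; this sketch (function name is illustrative) shows that a signal 1000 times stronger than the noise sits 30 dB above it:

```python
import math

def snr_db(signal_power: float, noise_power: float) -> float:
    """SNR in decibels: 10 log10(signal power / noise power)."""
    return 10 * math.log10(signal_power / noise_power)

print(snr_db(1000.0, 1.0))  # -> 30.0
```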
The signal-to-noise ratio is important in the transmission of digital data
because it sets the upper bound on the achievable data rate. Shannon’s result is that
the maximum channel capacity, in bits per second, obeys the equation
C = B log2(1 + SNR) (3.1)
where C is the capacity of the channel in bits per second and B is the bandwidth of
the channel in Hertz. The Shannon formula represents the theoretical maximum
that can be achieved. In practice, however, only much lower rates are achieved. One
reason for this is that the formula assumes white noise (thermal noise). Impulse
noise is not accounted for, nor are attenuation distortion or delay distortion. Even in
10 Some of the literature uses SNR; others use S/N. Also, in some cases the dimensionless quantity is referred to as SNR or S/N and the quantity in decibels is referred to as SNR_dB or (S/N)_dB. Others use just SNR or S/N to mean the dB quantity. This text uses SNR and SNR_dB.
an ideal white noise environment, present technology still cannot achieve Shannon
capacity due to encoding issues, such as coding length and complexity.
The capacity indicated in the preceding equation is referred to as the error-free
capacity. Shannon proved that if the actual information rate on a channel is less than
the error-free capacity, then it is theoretically possible to use a suitable signal code to
achieve error-free transmission through the channel. Shannon’s theorem unfortu-
nately does not suggest a means for finding such codes, but it does provide a yardstick
by which the performance of practical communication schemes may be measured.
Several other observations concerning the preceding equation may be instructive.
For a given level of noise, it would appear that the data rate could be increased by
increasing either signal strength or bandwidth. However, as the signal strength increases,
so do the effects of nonlinearities in the system, leading to an increase in intermodula-
tion noise. Note also that, because noise is assumed to be white, the wider the band-
width, the more noise is admitted to the system. Thus, as B increases, SNR decreases.
EXAMPLE 3.4 Let us consider an example that relates the Nyquist and Shannon formulations. Suppose that the spectrum of a channel is between 3 MHz and 4 MHz and SNR_dB = 24 dB. Then

B = 4 MHz - 3 MHz = 1 MHz
SNR_dB = 24 dB = 10 log10(SNR)
SNR = 251

Using Shannon's formula,

C = 10^6 × log2(1 + 251) ≈ 10^6 × 8 = 8 Mbps

This is a theoretical limit and, as we have said, is unlikely to be reached. But assume we can achieve the limit. Based on Nyquist's formula, how many signaling levels are required? We have

C = 2B log2 M
8 × 10^6 = 2 × (10^6) × log2 M
4 = log2 M
M = 16
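As a numeric check of the Shannon formula for this channel (B = 1 MHz, SNR_dB = 24 dB), the following sketch converts the dB figure back to a power ratio before applying Equation (3.1); the function name is illustrative:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_db: float) -> float:
    """Shannon capacity C = B log2(1 + SNR), with SNR given in dB."""
    snr = 10 ** (snr_db / 10)          # convert dB to a power ratio
    return bandwidth_hz * math.log2(1 + snr)

# Channel occupying 3-4 MHz (B = 1 MHz) with an SNR of 24 dB.
c = shannon_capacity(1e6, 24.0)
print(c)  # roughly 8 Mbps
```

The exact value is slightly below 8 Mbps because 24 dB corresponds to a ratio of about 251, not exactly 255.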
EXAMPLE 3.5 For binary phase-shift keying (defined in Chapter 5), Eb/N0 =
8.4 dB is required for a bit error rate of 10^-4 (one bit error out of every 10,000). If
the effective noise temperature is 290 K (room temperature) and the data rate is
2400 bps, what received signal level is required?
We have

8.4 = S(dBW) - 10 log 2400 + 228.6 dBW - 10 log 290
    = S(dBW) - (10)(3.38) + 228.6 - (10)(2.46)
S = -161.8 dBW
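The link-budget arithmetic of Example 3.5 can be verified by solving the dB relation Eb/N0 = S(dBW) - 10 log10 R + 228.6 - 10 log10 T for S. A sketch (function name is illustrative):

```python
import math

def required_signal_dbw(ebn0_db: float, rate_bps: float, temp_k: float) -> float:
    """Solve Eb/N0 = S(dBW) - 10 log10 R + 228.6 - 10 log10 T for S."""
    return ebn0_db + 10 * math.log10(rate_bps) - 228.6 + 10 * math.log10(temp_k)

# Example 3.5: Eb/N0 = 8.4 dB, R = 2400 bps, T = 290 K.
s = required_signal_dbw(8.4, 2400, 290)
print(round(s, 1))  # -> -161.8 (dBW)
```

Note that raising the data rate R raises the required received signal level dB for dB, all else being equal.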
S/N = 2^(C/B) - 1

Using Equation (3.2), and equating R with C, we have

Eb/N0 = (B/C)(2^(C/B) - 1)
This is a useful formula that relates the achievable spectral efficiency C/B to Eb/N0 .
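This relation between spectral efficiency and Eb/N0 can be explored numerically. The sketch below (function name is illustrative) evaluates Eb/N0 = (B/C)(2^(C/B) - 1) for a few values of C/B; as C/B approaches zero the ratio approaches ln 2 ≈ 0.693, the so-called Shannon limit of about -1.59 dB:

```python
import math

def ebn0_for_spectral_efficiency(eta: float) -> float:
    """Minimum Eb/N0 (as a power ratio) for spectral efficiency eta = C/B:
    Eb/N0 = (1/eta) * (2**eta - 1)."""
    return (2 ** eta - 1) / eta

for eta in (0.01, 1.0, 2.0):
    ratio = ebn0_for_spectral_efficiency(eta)
    print(eta, round(10 * math.log10(ratio), 2))  # required Eb/N0 in dB
```

Higher spectral efficiency demands a higher Eb/N0: packing more bits into each hertz of bandwidth costs energy per bit.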
There are many books that cover the fundamentals of analog and digital transmission.
[COUC01] is quite thorough. Other good reference works are [FREE05], which includes
some of the examples used in this chapter, and [HAYK01].
COUC01 Couch, L. Digital and Analog Communication Systems. Upper Saddle River, NJ:
Prentice Hall, 2001.
FREE05 Freeman, R. Fundamentals of Telecommunications. New York: Wiley, 2005.
HAYK01 Haykin, S. Communication Systems. New York: Wiley, 2001.
Key Terms