
Example: A binary system produces Marks with a probability of 0.7 and Spaces with a probability of 0.3; 2/7 of the Marks are received in error and 1/3 of the Spaces. Find the information transfer.

p(M, M) = p(M) × p(M|M) = 0.7 × 5/7 = 0.5
p(S, S) = p(S) × p(S|S) = 0.3 × 2/3 = 0.2
p(M, S) = p(M) × p(S|M) = 0.7 × 2/7 = 0.2
p(S, M) = p(S) × p(M|S) = 0.3 × 1/3 = 0.1

p(y = M) = p(M, M) + p(S, M) = 0.5 + 0.1 = 0.6
p(y = S) = p(S, S) + p(M, S) = 0.2 + 0.2 = 0.4

Use

I(xi; yj) = log2 [ p(xi, yj) / ( p(xi) p(yj) ) ]

to calculate the mutual information for X = {M, S}, Y = {M, S}.
I(X = M; Y = S) = log2( 0.2 / (0.7 × 0.4) ) = −0.4854
I(X = M; Y = M) = log2( 0.5 / (0.7 × 0.6) ) = 0.2515
I(X = S; Y = S) = log2( 0.2 / (0.3 × 0.4) ) = 0.7370
I(X = S; Y = M) = log2( 0.1 / (0.3 × 0.6) ) = −0.8480

Use

I(X; Y) = Σ_X Σ_Y p(xi, yj) I(xi; yj)

to calculate the average mutual information:

I(X; Y) = 0.2 × (−0.4854) + 0.5 × 0.2515 + 0.2 × 0.7370 + 0.1 × (−0.8480)
        = 0.091 bits/symbol

To calculate p(x|y), the relation p(x|y) = p(y|x) p(x) / p(y) was applied.
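A minimal Python sketch of this calculation, using only the probabilities given above (the variable names are illustrative, not part of the original example):

from math import log2

# Source probabilities: Marks and Spaces
p_x = {'M': 0.7, 'S': 0.3}

# Channel p(y|x): 2/7 of Marks and 1/3 of Spaces are received in error
p_y_given_x = {
    'M': {'M': 5/7, 'S': 2/7},
    'S': {'M': 1/3, 'S': 2/3},
}

symbols = ('M', 'S')

# Joint probabilities p(x, y) = p(x) p(y|x)
p_xy = {(x, y): p_x[x] * p_y_given_x[x][y] for x in symbols for y in symbols}

# Receiver probabilities p(y) = sum over x of p(x, y)
p_y = {y: sum(p_xy[(x, y)] for x in symbols) for y in symbols}

# Average mutual information I(X;Y) = sum of p(x,y) log2[ p(x,y) / (p(x) p(y)) ]
I = sum(p * log2(p / (p_x[x] * p_y[y])) for (x, y), p in p_xy.items())

print(p_xy)          # joint probabilities: about 0.5, 0.2, 0.1, 0.2
print(p_y)           # receiver probabilities: about 0.6 and 0.4
print(round(I, 3))   # about 0.091 bits/symbol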

•Equivocation

Equivocation represents the destructive effect of noise, that is, the additional information needed to make the reception correct. In the arrangement sketched below, an observer looks at the transmitted and received digits; if they are the same, he reports a 1, if different, a 0.

[Diagram: an observer comparing each transmitted digit with the corresponding received digit.]

The information sent by the observer is easily evaluated as −[p(0) log2 p(0) + p(1) log2 p(1)], applied to the binary string generated by the observer. The probability of 0 is just the channel error probability.

Example: A binary system produces Marks and Spaces with equal probabilities, 5% of all pulses being received in error. Find the information sent by the observer.

The information sent by the observer is

−[0.95 log2(0.95) + 0.05 log2(0.05)] = 0.2864 bits

Since the input information is 1 bit/symbol, the net information is 1 − 0.2864 = 0.7136 bits, agreeing with previous results. This means that the noise in the system has destroyed 0.2864 bits of information.
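A quick check of this figure, as a Python sketch (the helper function name is my own):

from math import log2

def binary_entropy(p):
    # H(p) = -[p log2 p + (1 - p) log2(1 - p)]
    return -(p * log2(p) + (1 - p) * log2(1 - p))

observer_info = binary_entropy(0.05)   # about 0.2864 bits
net_info = 1 - observer_info           # about 0.7136 bits
print(round(observer_info, 4), round(net_info, 4))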

The general expression of the information transfer using equivocation (information loss) is therefore

I(X; Y) = H(X) − H(X|Y) = H(Y) − H(Y|X)

where

H(Y|X) = − Σ_X Σ_Y p(xi, yj) log2 p(yj|xi)
H(X|Y) = − Σ_X Σ_Y p(xi, yj) log2 p(xi|yj)
H(X) = − Σ_X p(xi) log2 p(xi) is the source entropy, or the information transfer for a noiseless channel.

H(Y) = − Σ_Y p(yj) log2 p(yj) is the receiver entropy.
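These formulae can be checked against the first worked example; the Python sketch below (joint probabilities taken from that example) recovers the same 0.091 bits/symbol from either side of the identity.

from math import log2

# Joint, source and receiver probabilities from the first worked example
p_xy = {('M', 'M'): 0.5, ('M', 'S'): 0.2, ('S', 'M'): 0.1, ('S', 'S'): 0.2}
p_x = {'M': 0.7, 'S': 0.3}
p_y = {'M': 0.6, 'S': 0.4}

# Source and receiver entropies
H_X = -sum(p * log2(p) for p in p_x.values())
H_Y = -sum(p * log2(p) for p in p_y.values())

# Equivocation H(X|Y), using p(x|y) = p(x, y) / p(y)
H_X_given_Y = -sum(p * log2(p / p_y[y]) for (x, y), p in p_xy.items())

# Noise entropy H(Y|X), using p(y|x) = p(x, y) / p(x)
H_Y_given_X = -sum(p * log2(p / p_x[x]) for (x, y), p in p_xy.items())

print(round(H_X - H_X_given_Y, 3))   # about 0.091 bits/symbol
print(round(H_Y - H_Y_given_X, 3))   # about 0.091 bits/symbol, the same value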
Venn diagram of the formulae

A summary of basic formulae

•Channel capacity

The channel capacity is the maximum information transfer of the channel:

C = max I(X; Y)

For a Binary Symmetric Channel (BSC), which is characterized by a single value p of binary error probability (that is, the probability of error is the same for 0s and 1s), the information transfer is

I(X; Y)
= H(X) − H(X|Y)
= H(Y) − H(Y|X)
= H(Y) + Σ_X Σ_Y p(xi, yj) log2 p(yj|xi)
= H(Y) + Σ_X Σ_Y p(xi) p(yj|xi) log2 p(yj|xi)

Continuing,

I(X; Y)
= H(Y) + Σ_X Σ_Y p(xi) p(yj|xi) log2 p(yj|xi)
= H(Y) + Σ_X p(xi) Σ_Y p(yj|xi) log2 p(yj|xi)
= H(Y) + Σ_X p(xi) [ p log2 p + (1 − p) log2(1 − p) ]
= H(Y) + [ p log2 p + (1 − p) log2(1 − p) ] Σ_X p(xi)
= H(Y) + [ p log2 p + (1 − p) log2(1 − p) ]
= H(Y) − H(p)
where H(p) = −[p log2 p + (1 − p) log2(1 − p)].

p is fixed, so I(X; Y) is maximum when H(Y) is maximum. This occurs when p(0) = p(1) at the receiver (output), giving H(Y) = 1.

The channel capacity

C = 1 − H(p)

For example, if p = 0.125, then

H(p) = −[p log2 p + (1 − p) log2(1 − p)]
     = −[0.125 log2 0.125 + 0.875 log2 0.875]
     = 0.54 bits

C = 1 − H(p) = 0.46 bits
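The same numbers fall out of a two-line Python sketch (helper function name is my own):

from math import log2

def binary_entropy(p):
    # H(p) = -[p log2 p + (1 - p) log2(1 - p)]
    return -(p * log2(p) + (1 - p) * log2(1 - p))

p = 0.125
print(round(binary_entropy(p), 2))      # about 0.54 bits
print(round(1 - binary_entropy(p), 2))  # C = about 0.46 bits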


[Figure: mutual information I(X;Y) plotted against the input probability p(0), with the value 1 − H(p) marked.]

Variation of information transfer with input probability. Mutual information increases as error rate decreases.
[Figure: channel capacity C plotted against the error probability p.]

Variation of capacity with error probability. For fixed p, mutual information decreases as receiver entropy H(Y) decreases.
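A short matplotlib sketch that reproduces a curve of this kind (axis labels follow the figure; the styling is an arbitrary choice):

import numpy as np
import matplotlib.pyplot as plt

def binary_entropy(p):
    # H(p) = -[p log2 p + (1 - p) log2(1 - p)], with H(0) = H(1) = 0
    p = np.asarray(p, dtype=float)
    h = np.zeros_like(p)
    mask = (p > 0) & (p < 1)
    h[mask] = -(p[mask] * np.log2(p[mask]) + (1 - p[mask]) * np.log2(1 - p[mask]))
    return h

p = np.linspace(0, 1, 201)
C = 1 - binary_entropy(p)   # BSC capacity C = 1 - H(p)

plt.plot(p, C)
plt.xlabel('error probability p')
plt.ylabel('C')
plt.title('Variation of capacity with error probability')
plt.show()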
