Shannon Encoding and Fano Encoding, Theorem, Problems on Entropy
Example 1.43 : For the Markov source shown in the state diagram, (i) find the entropy H of the source and (ii) find G1 and G2 and compare them with H.

[Figure : State diagram of the Markov source for example 1.43]

Solution : The state equations of the given Markov source are given by

for state 1 :  P(1) = (3/4) P(1) + (1/3) P(2)
for state 2 :  P(2) = (1/4) P(1) + (2/3) P(2)
and            P(1) + P(2) = 1

From equation (1.77),  P(1) = (4/3) P(2). Using this in P(1) + P(2) = 1 gives (7/3) P(2) = 1, so that

P(2) = 3/7   and   P(1) = 4/7

(i) From equation (1.40), the entropy of the i-th state is H_i = Σ_j p_ij log (1/p_ij). For state 1,

H_1 = (3/4) log (4/3) + (1/4) log 4 = 0.8113 bits/message-symbol

and for state 2,

H_2 = (1/3) log 3 + (2/3) log (3/2) = 0.9183 bits/message-symbol

From equation (1.41), the entropy H of the source is given by

H = Σ_i P(i) H_i = P(1) H_1 + P(2) H_2 = (4/7)(0.8113) + (3/7)(0.9183)

H = 0.8572 bits/message-symbol

(ii) Figure 1.16 shows the tree corresponding to the initial state and the states at the beginning of the 2nd symbol interval.

[Fig. 1.16 : Tree diagram for the Markov source of example 1.43 - initial state; states at the end of the 1st and 2nd symbol intervals]

From equation (1.43) with N = 1 (i.e., at the start of the 1st symbol interval), G1 is evaluated from the probabilities of the messages available at the end of the first interval of the tree. Comparing the H and G1 values, we conclude that G1 > H.

Again from equation (1.43) with N = 2 (i.e., at the start of the 2nd symbol interval), G2 is evaluated from the probabilities of the two-symbol message sequences of the tree. Comparing the values of H, G1 and G2, we conclude that

G1 > G2 > H

Example 1.35 : For the first order Markoff model shown in figure 1.13, find the state probabilities, the entropy of each state and the entropy of the source.

[Fig. 1.13 : Markoff model for example 1.35]

Solution : From the Markoff model of figure 1.13, the state equations are given by

for state A :  P(A) = 0.2 P(A) + 0.2 P(C)
for state B :  P(B) = 0.2 P(B) + 0.3 P(C)
for state C :  P(C) = 0.8 P(A) + 0.8 P(B) + 0.5 P(C)

From equation (1.64),  0.8 P(A) = 0.2 P(C),  so  P(A) = (2/8) P(C)
From equation (1.65),  0.8 P(B) = 0.3 P(C),  so  P(B) = (3/8) P(C)

We also have P(A) + P(B) + P(C) = 1, i.e.,

(2/8) P(C) + (3/8) P(C) + P(C) = 1,   or   [(2 + 3 + 8)/8] P(C) = 1,   giving   P(C) = 8/13

From equation (1.67),  P(A) = (2/8)(8/13) = 2/13
From equation (1.68),  P(B) = (3/8)(8/13) = 3/13

Check : the R.H.S. of equation (1.66) is
0.8 P(A) + 0.8 P(B) + 0.5 P(C) = (0.8)(2/13) + (0.8)(3/13) + (0.5)(8/13) = 8/13 = P(C) = L.H.S. of equation (1.66) --> Verified.

The state probabilities are therefore

P(A) = 2/13,   P(B) = 3/13   and   P(C) = 8/13

From equation (1.40), the entropy of each state is H_i = Σ_j p_ij log (1/p_ij).

For state A :  H_A = 0.2 log (1/0.2) + 0.8 log (1/0.8) = 0.722 bits/message-symbol
For state B :  H_B = 0.2 log (1/0.2) + 0.8 log (1/0.8) = 0.722 bits/message-symbol
For state C :  H_C = 0.2 log (1/0.2) + 0.3 log (1/0.3) + 0.5 log (1/0.5) = 1.485 bits/message-symbol

From equation (1.41), the entropy of the source is given by

H = Σ_i P(i) H_i = P(A) H_A + P(B) H_B + P(C) H_C
  = (2/13)(0.722) + (3/13)(0.722) + (8/13)(1.485)
H = 1.192 bits/message-symbol
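The stationary-state calculation and the two entropy formulas used above are easy to check by machine. The following Python sketch (an illustration, not part of the text) solves the balance equations for the model of example 1.35 and evaluates the state entropies and the source entropy; the transition matrix is taken from figure 1.13 as reconstructed above.

```python
import numpy as np

# Transition matrix of the Markoff model of example 1.35 (rows: from-state A, B, C).
# Entry T[i][j] = P(next state j | current state i), read off figure 1.13 as reconstructed above.
T = np.array([
    [0.2, 0.0, 0.8],   # from A
    [0.0, 0.2, 0.8],   # from B
    [0.2, 0.3, 0.5],   # from C
])

# Stationary state probabilities: solve p = p.T together with P(A)+P(B)+P(C) = 1
# by replacing one (redundant) balance equation with the normalisation condition.
A = T.T - np.eye(3)
A[-1, :] = 1.0
p = np.linalg.solve(A, np.array([0.0, 0.0, 1.0]))
print("state probabilities:", p)              # ~ [2/13, 3/13, 8/13]

# Entropy of each state and of the source, as in equations (1.40) and (1.41).
def state_entropy(row):
    row = row[row > 0]
    return float(np.sum(row * np.log2(1.0 / row)))

H_states = np.array([state_entropy(r) for r in T])
print("state entropies:", H_states)           # ~ [0.722, 0.722, 1.485]
print("source entropy H =", p @ H_states)     # ~ 1.192 bits/message-symbol
```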
SHANNON'S FIRST THEOREM (NOISELESS CODING THEOREM)

Shannon suggested that the code-word length 'l' can be obtained from the formula

l = log (1/p)        ..... (2.23)

Equation (2.23) signifies that the larger the value of the symbol probability, the smaller will be the value of l, and fewer encoding digits will be sufficient. Much before C.E. Shannon, Morse had this idea in mind while formulating the famous "dot-dash" code. He knew that the letter 'E' occurs more frequently in the English alphabet than the letter 'Q', and so he encoded E as a single dot "." and Q as "--.-", the longest code-word in the code. This same principle was applied in equation (2.23) by Shannon.

But if 'l' as calculated from equation (2.23) happens to be a fraction, then it is rounded off to the next higher integer, so that the code-word length l_i satisfies

log (1/p_i)  <=  l_i  <  log (1/p_i) + 1        ..... (2.24)

Applying the same rule to the symbols of the n-th extended source S^n (coded with an r-ary alphabet), whose probabilities are P_i,

log_r (1/P_i)  <=  l_i  <  log_r (1/P_i) + 1

where L_n denotes the average length of the code-words for the n-th extended source symbols. Multiplying throughout by P_i, taking the summation for all i varying from 1 to q^n and using the property of logarithms, we get

H(S^n)/log_2 r  <=  L_n  <  H(S^n)/log_2 r + 1        ..... (2.30)

But from equation (1.35) we have H(S^n) = n H(S), the entropy of the n-th extension. Substituting in equation (2.30), we get

n H(S)/log_2 r  <=  L_n  <  n H(S)/log_2 r + 1        ..... (2.31)

Dividing throughout by 'n' and using equation (2.19) again, equation (2.31) reduces to

H_r(S)  <=  L_n/n  <  H_r(S) + 1/n        ..... (2.32)

(For a binary code, r = 2, log_2 r = 1 and H_r(S) = H(S), so that equation (2.32) reads H(S) <= L_n/n < H(S) + 1/n.)

It follows from the first inequality of equation (2.32) that the average number of code digits per source symbol, L_n/n, can never be less than the entropy of the source. Taking limits as n --> infinity on all three members of equation (2.32),

lim (n --> infinity) L_n/n = H_r(S)        ..... (2.33)

From equation (2.16), Σ p_i log (1/p_i) = H(S). When L is the average length of the code-words for the basic source S, then for all values of the extension 'n' it is true that

L_n/n  <=  L        ..... (2.34)
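A quick numerical illustration of the bound in equation (2.32): the sketch below (not from the text; the two-symbol source with probabilities 3/4 and 1/4 is only an example) gives every block of n symbols a binary code-word of length ceil(log2(1/P)), a choice that always satisfies equation (2.24), and prints L_n/n against H(S) for a few values of n.

```python
import math
from itertools import product

def shannon_length_per_symbol(probs, n):
    """L_n / n when every block of n symbols gets a code-word of ceil(log2(1/P)) binits,
    a length choice that always satisfies equation (2.24)."""
    L_n = 0.0
    for block in product(probs, repeat=n):
        P = math.prod(block)                         # probability of this n-symbol block
        L_n += P * math.ceil(math.log2(1.0 / P))
    return L_n / n

probs = [3/4, 1/4]                                   # illustrative two-symbol source
H = sum(p * math.log2(1 / p) for p in probs)         # H(S) ~ 0.8113 bits/message-symbol

for n in (1, 2, 4, 8):
    print(f"n = {n}:  L_n/n = {shannon_length_per_symbol(probs, n):.4f}   (H(S) = {H:.4f})")
```

As n grows, L_n/n falls from 1.25 towards H(S), staying inside the window [H(S), H(S) + 1/n) predicted by equation (2.32).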
2^(l_i) >= 1/p_i    for all i = 1, 2, ..., q

Step No. 4 :
Expand the decimal number α_i in binary form up to l_i places, neglecting the expansion beyond l_i places.

Step No. 5 :
Remove the binary point to get the desired code.

Example 2.12 : Apply Shannon's encoding (binary) algorithm to the following set of messages and obtain the code efficiency and redundancy.

m1     m2     m3     m4     m5
1/8    1/16   3/16   1/4    3/8

Solution :

Step No. 1 : The symbols are arranged according to non-increasing probabilities as below:

m5     m4     m3     m1     m2
6/16   4/16   3/16   2/16   1/16
p1     p2     p3     p4     p5

Step No. 2 : The following sequence of α's is computed:

α1 = 0
α2 = 6/16 = 0.375
α3 = 6/16 + 4/16 = 10/16 = 0.625
α4 = 10/16 + 3/16 = 13/16 = 0.8125
α5 = 13/16 + 2/16 = 15/16 = 0.9375

Step No. 3 : The smallest integer value of l_i is found using 2^(l_i) >= 1/p_i for all i = 1, 2, ..., 5.

For i = 1 :  2^(l_1) >= 16/6 = 2.66.  The smallest value of l_1 which satisfies the inequality is 2.
For i = 2 :  2^(l_2) >= 16/4 = 4,     so the smallest value of l_2 = 2.
For i = 3 :  2^(l_3) >= 16/3 = 5.33,  so the smallest value of l_3 = 3.
For i = 4 :  2^(l_4) >= 16/2 = 8,     so the smallest value of l_4 = 3.
For i = 5 :  2^(l_5) >= 16,           so the smallest value of l_5 = 4.

Step No. 4 : The decimal numbers α_i are expanded in binary form up to l_i places as given below:

α1 = 0
α2 = (0.375)_10 :
  0.375 x 2 = 0.75 with carry 0
  0.75  x 2 = 0.50 with carry 1
  0.50  x 2 = 0.00 with carry 1
  α2 = (0.375)_10 = (0.011)_2
α3 = (0.625)_10 :
  0.625 x 2 = 0.25 with carry 1
  0.25  x 2 = 0.50 with carry 0
  0.50  x 2 = 0.00 with carry 1
  α3 = (0.625)_10 = (0.101)_2
α4 = (0.8125)_10 :
  0.8125 x 2 = 0.625 with carry 1
  0.625  x 2 = 0.25  with carry 1
  0.25   x 2 = 0.50  with carry 0
  0.50   x 2 = 0.00  with carry 1
  α4 = (0.8125)_10 = (0.1101)_2
α5 = (0.9375)_10 :
  0.9375 x 2 = 0.875 with carry 1
  0.875  x 2 = 0.75  with carry 1
  0.75   x 2 = 0.50  with carry 1
  0.50   x 2 = 0.00  with carry 1
  α5 = (0.9375)_10 = (0.1111)_2

Step No. 5 :
α1 = 0        and l_1 = 2,  so the code for s1 --> 00
α2 = (011)_2  and l_2 = 2,  so the code for s2 --> 01
α3 = (101)_2  and l_3 = 3,  so the code for s3 --> 101
α4 = (1101)_2 and l_4 = 3,  so the code for s4 --> 110
α5 = (1111)_2 and l_5 = 4,  so the code for s5 --> 1111

The code table can now be constructed as shown in table 2.17 below:

Source symbol    p_i     Code    l_i in binits
m5               3/8     00      2
m4               1/4     01      2
m3               3/16    101     3
m1               1/8     110     3
m2               1/16    1111    4

Table 2.17 : Code table for example 2.12

By inspecting the code in table 2.17 we can come to the conclusion that no code-word is a prefix of another, and hence this is an instantaneous code satisfying the prefix property. The average length L is computed using equation (2.15) as

L = Σ p_i l_i = (3/8)(2) + (1/4)(2) + (3/16)(3) + (1/8)(3) + (1/16)(4) = 2.4375 binits/message-symbol

From equation (2.16), the entropy is

H(S) = Σ p_i log (1/p_i)
     = (3/8) log (8/3) + (1/4) log 4 + (3/16) log (16/3) + (1/8) log 8 + (1/16) log 16
     = 2.1085 bits/message-symbol

Code efficiency is calculated using equation (2.20) as

η_c = H(S)/L = 2.1085/2.4375 = 0.865,  i.e., percentage code efficiency = 86.5%

Code redundancy is calculated using equation (2.22) as

R_c = 1 - η_c = 1 - 0.865 = 0.135,  i.e., percentage code redundancy = 13.5%
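The five steps of the algorithm translate almost directly into code. The following Python sketch (an illustration, not the book's program) implements Shannon's binary encoding algorithm and reproduces the code of table 2.17 for the message set of example 2.12.

```python
import math
from fractions import Fraction

def shannon_encode(probs):
    """Shannon (binary) encoding: returns a list of (probability, code) pairs,
    with symbols taken in non-increasing order of probability (Step 1)."""
    p = sorted(probs, reverse=True)                      # Step 1
    alphas = [Fraction(0)]
    for pi in p[:-1]:                                    # Step 2: cumulative probabilities
        alphas.append(alphas[-1] + Fraction(pi))
    codes = []
    for pi, alpha in zip(p, alphas):
        l = math.ceil(math.log2(1 / Fraction(pi)))       # Step 3: smallest l with 2^l >= 1/p
        bits, frac = "", alpha
        for _ in range(l):                               # Step 4: binary expansion to l places
            frac *= 2
            bits += "1" if frac >= 1 else "0"
            frac -= int(frac)
        codes.append((pi, bits))                         # Step 5: drop the binary point
    return codes

# Example 2.12: probabilities 3/8, 1/4, 3/16, 1/8, 1/16
for pi, code in shannon_encode([Fraction(3, 8), Fraction(1, 4), Fraction(3, 16),
                                Fraction(1, 8), Fraction(1, 16)]):
    print(pi, code)
```

Running the sketch prints the pairs (3/8, 00), (1/4, 01), (3/16, 101), (1/8, 110) and (1/16, 1111), matching table 2.17.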
Example 2.13 : Apply Shannon's encoding (binary) algorithm to the following messages and find the code efficiency and redundancy. If the same technique is applied to the 2nd order extension of this source, how much will the efficiency be improved?

s1     s2     s3
0.5    0.3    0.2

Solution :

(i) For the basic source :

Step No. 1 : The symbols are already in non-increasing order of probability:

s1     s2     s3
0.5    0.3    0.2
p1     p2     p3

Step No. 2 :
α1 = 0
α2 = 0.5
α3 = 0.5 + 0.3 = 0.8

Step No. 3 :
For i = 1 :  2^(l_1) >= 1/0.5 = 2,     so l_1 = 1 binit
For i = 2 :  2^(l_2) >= 1/0.3 = 3.33,  so l_2 = 2 binits
For i = 3 :  2^(l_3) >= 1/0.2 = 5,     so l_3 = 3 binits

Step No. 4 :
α2 = (0.5)_10 = (0.1)_2
α3 = (0.8)_10 :
  0.8 x 2 = 0.6 with carry 1
  0.6 x 2 = 0.2 with carry 1
  0.2 x 2 = 0.4 with carry 0
  α3 = (0.8)_10 = (0.110...)_2

Step No. 5 :
code for s1 --> 0
code for s2 --> 10
code for s3 --> 110

The code table for the basic source can now be written as shown in table 2.18:

Source symbol    p_i     Code    l_i in binits
s1               0.5     0       1
s2               0.3     10      2
s3               0.2     110     3

Table 2.18 : Code table of example 2.13

Average length,  L = Σ p_i l_i = (0.5)(1) + (0.3)(2) + (0.2)(3) = 1.7 binits/message-symbol

Entropy,  H(S) = Σ p_i log (1/p_i) = 0.5 log (1/0.5) + 0.3 log (1/0.3) + 0.2 log (1/0.2)
              = 1.4855 bits/message-symbol

Code efficiency,  η_c = H(S)/L = 1.4855/1.7 = 0.8738,  i.e., 87.38%

Code redundancy,  R_c = 1 - η_c = 1 - 0.8738 = 0.1262,  i.e., 12.62%
(ii) 2nd extension : The 2nd extension of the basic source will have 3^2 = 9 symbols.

Step No. 1 : For the 2nd extended source, the non-increasing symbol order is

s1s1   s1s2   s2s1   s1s3   s3s1   s2s2   s2s3   s3s2   s3s3
0.25   0.15   0.15   0.10   0.10   0.09   0.06   0.06   0.04
p1     p2     p3     p4     p5     p6     p7     p8     p9

Step No. 2 : The α's are calculated as

α1 = 0,  α2 = 0.25,  α3 = 0.40,  α4 = 0.55,  α5 = 0.65,  α6 = 0.75,  α7 = 0.84,  α8 = 0.90,  α9 = 0.96

Step No. 3 : Using 2^(l_i) >= 1/p_i,

For i = 1 :  2^(l_1) >= 1/0.25 = 4,      so l_1 = 2 binits
For i = 2 :  2^(l_2) >= 1/0.15 = 6.67,   so l_2 = 3 binits
For i = 3 :  2^(l_3) >= 6.67,            so l_3 = 3 binits
For i = 4 :  2^(l_4) >= 1/0.10 = 10,     so l_4 = 4 binits
For i = 5 :  2^(l_5) >= 10,              so l_5 = 4 binits
For i = 6 :  2^(l_6) >= 1/0.09 = 11.1,   so l_6 = 4 binits
For i = 7 :  2^(l_7) >= 1/0.06 = 16.67,  so l_7 = 5 binits
For i = 8 :  2^(l_8) >= 16.67,           so l_8 = 5 binits
For i = 9 :  2^(l_9) >= 1/0.04 = 25,     so l_9 = 5 binits

Steps No. 4 and 5 : Expanding each α_i in binary up to l_i places and removing the binary point,

code for s1s1 --> 00        code for s3s1 --> 1010      code for s3s2 --> 11100
code for s1s2 --> 010       code for s2s2 --> 1100      code for s3s3 --> 11110
code for s2s1 --> 011       code for s2s3 --> 11010
code for s1s3 --> 1000

The code table can now be written as shown in table 2.19:

Source symbol    p_i     Code     l_i in binits
s1s1             0.25    00       2
s1s2             0.15    010      3
s2s1             0.15    011      3
s1s3             0.10    1000     4
s3s1             0.10    1010     4
s2s2             0.09    1100     4
s2s3             0.06    11010    5
s3s2             0.06    11100    5
s3s3             0.04    11110    5

Table 2.19 : Code table for the 2nd extension of example 2.13

The average length L_2 is calculated as

L_2 = Σ p_i l_i = (0.25)(2) + (0.15)(3) + (0.15)(3) + (0.1)(4) + (0.1)(4) + (0.09)(4) + (0.06)(5) + (0.06)(5) + (0.04)(5)
L_2 = 3.36 binits/message-symbol

The entropy of the 2nd extended source is calculated as

H(S^2) = 2 H(S)    [from equation (1.33)]
       = 2 x 1.4855 = 2.971 bits/message-symbol

Code efficiency  η_c(2) = H(S^2)/L_2 = 2.971/3.36 = 88.42%
Code redundancy  R_c(2) = 11.58%

Code efficiency therefore improves by (88.42 - 87.38) = 1.04%.

Note : For the 2nd extension we have L_2 = 3.36 binits per message-symbol. Consider L_2/2 = 3.36/2 = 1.68 binits/message-symbol, whereas L = 1.7 binits/message-symbol for the basic source; L_2/2 < L, which satisfies equation (2.34). When we go in for the 3rd extension and calculate L_3, we find that L_3/3 < L_2/2, and the efficiency goes on improving. As the order of extension 'n' becomes infinitely large, we approach a code efficiency of 100%.
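The improvement claimed above is easy to confirm numerically. The sketch below (illustrative) forms the 2nd extension of the source of example 2.13, assigns each extended symbol the Shannon length ceil(log2(1/p)) of Step No. 3, and compares the efficiencies of the basic and extended codes.

```python
import math
from itertools import product

def shannon_efficiency(probs):
    """Efficiency H(S)/L when each symbol gets a code-word of length ceil(log2(1/p))."""
    L = sum(p * math.ceil(math.log2(1 / p)) for p in probs)
    H = sum(p * math.log2(1 / p) for p in probs)
    return H / L

basic = [0.5, 0.3, 0.2]                                   # source of example 2.13
second = [p * q for p, q in product(basic, repeat=2)]     # 9 symbols of the 2nd extension

print(f"basic source  : {shannon_efficiency(basic):.2%}")     # ~ 87.38%
print(f"2nd extension : {shannon_efficiency(second):.2%}")    # ~ 88.42%
```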
(Example 2.14, continued)

Step No. 3 :
For i = 1 :  2^(l_1) >= 1/0.4 = 2.5.    The smallest value of l_1 which satisfies the inequality is 2.
For i = 2 :  2^(l_2) >= 1/0.25 = 4,     so the smallest value of l_2 = 2.
For i = 3 :  2^(l_3) >= 1/0.15 = 6.67,  so the smallest value of l_3 = 3.
For i = 4 :  2^(l_4) >= 1/0.12 = 8.33,  so the smallest value of l_4 = 4.
For i = 5 :  2^(l_5) >= 1/0.08 = 12.5,  so the smallest value of l_5 = 4.

Step No. 4 : The decimal numbers α_i are expanded in binary form up to l_i places as given below:

α1 = 0
α2 = (0.4)_10 :
  0.4 x 2 = 0.8 with carry 0
  0.8 x 2 = 0.6 with carry 1
  0.6 x 2 = 0.2 with carry 1
  α2 = (0.4)_10 = (0.011...)_2
α3 = (0.65)_10 :
  0.65 x 2 = 0.3 with carry 1
  0.3  x 2 = 0.6 with carry 0
  0.6  x 2 = 0.2 with carry 1
  α3 = (0.65)_10 = (0.101...)_2
α4 = (0.8)_10 :
  0.8 x 2 = 0.6 with carry 1
  0.6 x 2 = 0.2 with carry 1
  0.2 x 2 = 0.4 with carry 0
  0.4 x 2 = 0.8 with carry 0
  α4 = (0.8)_10 = (0.1100...)_2

Step No. 5 :
code for s1 --> 00
code for s2 --> 01
code for s3 --> 101
code for s4 --> 1100
code for s5 --> 1110

Table 2.20 : Code table for example 2.14

Example 2.15 : Apply Shannon's encoding (binary) algorithm to a source emitting symbols from the alphabet {A, B, C, D, E} with probabilities 1/4, 1/8, 1/8, 3/16 and 5/16 respectively, and find the efficiency of the coding scheme.

Solution :

Step No. 1 : For the basic source, arranged in non-increasing order of probability:

E      A      D      B      C
5/16   1/4    3/16   1/8    1/8
p1     p2     p3     p4     p5

Step No. 2 : The following sequence of α's is computed:

α1 = 0
α2 = 5/16 = 0.3125
α3 = 5/16 + 4/16 = 9/16 = 0.5625
α4 = 9/16 + 3/16 = 12/16 = 0.75
α5 = 12/16 + 2/16 = 14/16 = 0.875

Step No. 3 : The smallest value of l_i is found using 2^(l_i) >= 1/p_i for all i = 1, 2, 3, 4, 5.

For i = 1 :  2^(l_1) >= 16/5 = 3.2,   so l_1 = 2
For i = 2 :  2^(l_2) >= 4,            so l_2 = 2
For i = 3 :  2^(l_3) >= 16/3 = 5.33,  so l_3 = 3
For i = 4 :  2^(l_4) >= 8,            so l_4 = 3
Similarly, the smallest value of l_5 = 3.

Step No. 4 : The decimal numbers α_i are expanded in binary form up to l_i places as given below:

α1 = 0
α2 = (0.3125)_10 :
  0.3125 x 2 = 0.625 with carry 0
  0.625  x 2 = 0.25  with carry 1
  0.25   x 2 = 0.5   with carry 0
  α2 = (0.3125)_10 = (0.010...)_2
α3 = (0.5625)_10 :
  0.5625 x 2 = 0.125 with carry 1
  0.125  x 2 = 0.25  with carry 0
  0.25   x 2 = 0.5   with carry 0
  α3 = (0.5625)_10 = (0.100...)_2
α4 = (0.75)_10 :
  0.75 x 2 = 0.5 with carry 1
  0.5  x 2 = 0   with carry 1
  α4 = (0.75)_10 = (0.110)_2
α5 = (0.875)_10 = (0.111)_2

Step No. 5 :
α1 = 0        and l_1 = 2,  so the code for E --> 00
α2 = (010)_2  and l_2 = 2,  so the code for A --> 01
α3 = (100)_2  and l_3 = 3,  so the code for D --> 100
α4 = (110)_2  and l_4 = 3,  so the code for B --> 110
α5 = (111)_2  and l_5 = 3,  so the code for C --> 111

The code table can now be constructed as shown in table 2.21 below:

Source symbol    p_i     Code    l_i in binits    p_i l_i
E                5/16    00      2                5/8
A                1/4     01      2                1/2
D                3/16    100     3                9/16
B                1/8     110     3                3/8
C                1/8     111     3                3/8

Table 2.21 : Code table for example 2.15

From equation (2.15), the average length of the code is given by

L = Σ p_i l_i = 5/8 + 1/2 + 9/16 + 3/8 + 3/8 = 2.4375 binits/message-symbol

From equation (2.16), the entropy is given by

H(S) = Σ p_i log (1/p_i)
     = (5/16) log (16/5) + (1/4) log 4 + (3/16) log (16/3) + (1/8) log 8 + (1/8) log 8
     = 2.2272 bits/message-symbol

Efficiency of the coding scheme = η_c = [H(S)/L] x 100% = (2.2272/2.4375) x 100% = 91.37%
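Step No. 4 of the algorithm is carried out by hand in every example above; the small Python helper below (an illustration, not part of the text) performs the same multiply-by-2 expansion of a decimal fraction to l binary places.

```python
def fraction_to_binary(alpha, places):
    """Expand a decimal fraction 0 <= alpha < 1 in binary up to 'places' places
    (Step No. 4 of Shannon's encoding algorithm), returning the digit string."""
    bits = []
    for _ in range(places):
        alpha *= 2                  # multiply by 2; the integer part is the next binary digit
        carry = int(alpha)
        bits.append(str(carry))
        alpha -= carry              # keep only the fractional part for the next step
    return "".join(bits)

# Worked values from examples 2.12 and 2.15:
print(fraction_to_binary(0.8125, 4))     # 1101
print(fraction_to_binary(0.5625, 3))     # 100
print(fraction_to_binary(0.75, 3))       # 110
```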
Example 2.16 : A source emits messages consisting of two symbols each, and their probabilities are given in table 2.22. Design the source encoder using Shannon's encoding algorithm and also find the encoder efficiency.   [VI sem EC/TE, July/August]

Message M_i     Probability p_i
AA              9/32
AC              3/32
CC              1/16
CB              3/32
CA              3/32
BC              3/32
BB              9/32

Table 2.22 : Message probability table for example 2.16

Solution :

Step No. 1 : For the basic source, arranged in non-increasing order of probability:

AA     BB     AC     CB     CA     BC     CC
9/32   9/32   3/32   3/32   3/32   3/32   1/16
p1     p2     p3     p4     p5     p6     p7

Step No. 2 : The following sequence of α's is computed:

α1 = 0
α2 = 9/32 = 0.28125
α3 = 18/32 = 0.5625
α4 = 21/32 = 0.65625
α5 = 24/32 = 0.75
α6 = 27/32 = 0.84375
α7 = 30/32 = 0.9375

Step No. 3 : The smallest integer value of l_i is found using 2^(l_i) >= 1/p_i.

For i = 1, 2 :        2^(l_i) >= 32/9 = 3.56,   so the smallest integer l_1 = l_2 = 2.
For i = 3, 4, 5, 6 :  2^(l_i) >= 32/3 = 10.67,  so the smallest integer l_i = 4.
For i = 7 :           2^(l_7) >= 16,            so the smallest integer l_7 = 4.

Step No. 4 : The decimal numbers α_i are expanded in binary form up to l_i places:

α1 = 0
α2 = (0.28125)_10 = (0.01...)_2
α3 = (0.5625)_10 = (0.1001)_2
α4 = (0.65625)_10 = (0.1010...)_2
α5 = (0.75)_10 = (0.1100)_2
α6 = (0.84375)_10 = (0.1101...)_2
α7 = (0.9375)_10 = (0.1111)_2

Step No. 5 :
code for AA --> 00       code for CB --> 1010     code for CC --> 1111
code for BB --> 01       code for CA --> 1100
code for AC --> 1001     code for BC --> 1101

The code table can now be constructed as shown in table 2.23 below:

Source symbol    p_i     Code    l_i in binits    p_i l_i
AA               9/32    00      2                9/16
BB               9/32    01      2                9/16
AC               3/32    1001    4                3/8
CB               3/32    1010    4                3/8
CA               3/32    1100    4                3/8
BC               3/32    1101    4                3/8
CC               1/16    1111    4                1/4

Table 2.23 : Code table for example 2.16

L = Σ p_i l_i = 2(9/16) + 4(3/8) + 1/4 = 2.875 binits/message-symbol

H(S) = Σ p_i log (1/p_i) = 2(9/32) log (32/9) + 4(3/32) log (32/3) + (1/16) log 16 = 2.56 bits/message-symbol

Encoder efficiency = η_c = [H(S)/L] x 100% = (2.56/2.875) x 100% = 89.04%

2.3 SHANNON-FANO ENCODING ALGORITHM

The encoding procedure for getting a compact code with minimum redundancy is as follows:

1. The symbols are arranged according to non-increasing probabilities.
2. The symbols are divided into two groups so that the sum of probabilities in each group is approximately equal.
3. The symbols of the 1st group are designated by a "1" and those of the 2nd group by a "0".
4. Each group is subdivided into two more sub-groups such that the sum of probabilities in each sub-group is again approximately the same.
5. The symbols of the 1st sub-group are designated by a "1" and those of the 2nd sub-group by a "0".
6. Each sub-group is again divided into two more sub-groups and step No. 5 is repeated.
7. This process is continued till further sub-division is impossible.

Example 2.17 : Given the messages x1, x2, x3, x4, x5 and x6 with respective probabilities 0.4, 0.2, 0.2, 0.1, 0.07 and 0.03, construct a binary code by applying the Shannon-Fano encoding procedure and determine the code efficiency.

Solution : There are two ways in which step No. 2 of the procedure can be applied; let us discuss both the ways.

1st way : the first division is made between {x1} (probability 0.4) and {x2, ..., x6} (probability 0.6). Repeating the procedure on each group gives the code of table 2.24:

Message    p_i     Code     l_i in binits
x1         0.4     1        1
x2         0.2     01       2
x3         0.2     001      3
x4         0.1     0001     4
x5         0.07    00001    5
x6         0.03    00000    5

Table 2.24 : Code-table for example 2.17 (1st way)

2nd way : the first division is made between {x1, x2} (probability 0.6) and {x3, ..., x6} (probability 0.4), giving the code of table 2.25:

Message    p_i     Code    l_i in binits
x1         0.4     11      2
x2         0.2     10      2
x3         0.2     01      2
x4         0.1     001     3
x5         0.07    0001    4
x6         0.03    0000    4

Table 2.25 : Code-table for example 2.17 (2nd way)

An observation of tables 2.24 and 2.25 reveals that both are instantaneous codes.

Average length,  L = Σ p_i l_i = 2.3 binits/message-symbol (the same for either way)

Entropy,  H(S) = 0.4 log (1/0.4) + 2 x 0.2 log (1/0.2) + 0.1 log (1/0.1) + 0.07 log (1/0.07) + 0.03 log (1/0.03)
              = 2.209 bits/message-symbol

Code efficiency,  η_c = H(S)/L = 2.209/2.3 = 96.04%,  which remains the same for both ways.

The code-trees corresponding to both the ways can be drawn as shown in figures 2.5 (a) and (b).

[Fig. 2.5 : Code-trees for example 2.17 - (a) 1st way, (b) 2nd way]

Example 2.18 : You are given 4 messages x1, x2, x3 and x4 with respective probabilities 0.4, 0.3, 0.2 and 0.1.
(i) Construct a binary code by applying the Shannon-Fano encoding procedure and determine the redundancy of the code.
(ii) Find the probabilities of '0's and '1's in the code.

Solution : (i) Applying the Shannon-Fano procedure gives the code of table 2.26:

Message    p_i     Code    l_i in binits
x1         0.4     1       1
x2         0.3     01      2
x3         0.2     001     3
x4         0.1     000     3

Table 2.26 : Code-table for example 2.18

Average length,  L = (0.4)(1) + (0.3)(2) + (0.2)(3) + (0.1)(3) = 1.9 binits/message-symbol

Entropy,  H(S) = 0.4 log (1/0.4) + 0.3 log (1/0.3) + 0.2 log (1/0.2) + 0.1 log (1/0.1) = 1.8465 bits/message-symbol

Code efficiency,  η_c = H(S)/L = 1.8465/1.9 = 97.18%,  so the redundancy is  R_c = 1 - η_c = 2.82%.

The corresponding code-tree can be drawn as shown in figure 2.6.

[Fig. 2.6 : Code-tree for example 2.18]

(ii) The probabilities of '0's and '1's in the code are found using the formulas

P(0) = (1/L) Σ_i [number of '0's in the code for x_i] p_i        ..... (2.35)
P(1) = (1/L) Σ_i [number of '1's in the code for x_i] p_i        ..... (2.36)

From the code-table 2.26 we have

P(0) = [ (0)(0.4) + (1)(0.3) + (2)(0.2) + (3)(0.1) ] / 1.9 = 0.5263
P(1) = [ (1)(0.4) + (1)(0.3) + (1)(0.2) + (0)(0.1) ] / 1.9 = 0.4737
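The grouping rule of Step No. 2 is simple to automate: at each stage, split the ordered list at the point where the two group probabilities are as nearly equal as possible, then recurse on each group. The Python sketch below is an illustrative implementation; the split rule and the '1'-first labelling follow the procedure as reconstructed above, so individual bits may differ from the book's tables while the code-word lengths do not. It reproduces the code of example 2.18.

```python
def shannon_fano(symbols):
    """symbols: list of (name, probability), already in non-increasing order.
    Returns {name: code} for a binary Shannon-Fano code."""
    codes = {name: "" for name, _ in symbols}

    def split(group):
        if len(group) < 2:
            return
        total = sum(p for _, p in group)
        running, best_k, best_diff = 0.0, 1, float("inf")
        for k in range(1, len(group)):             # Step 2: most nearly equal split
            running += group[k - 1][1]
            diff = abs(2 * running - total)
            if diff < best_diff:
                best_k, best_diff = k, diff
        first, second = group[:best_k], group[best_k:]
        for name, _ in first:                       # Step 3: designate the two groups
            codes[name] += "1"
        for name, _ in second:
            codes[name] += "0"
        split(first)                                # Steps 4-7: repeat on each sub-group
        split(second)

    split(list(symbols))
    return codes

# Example 2.18: four messages with probabilities 0.4, 0.3, 0.2, 0.1
msgs = [("x1", 0.4), ("x2", 0.3), ("x3", 0.2), ("x4", 0.1)]
print(shannon_fano(msgs))
```

The printed dictionary is {'x1': '1', 'x2': '01', 'x3': '001', 'x4': '000'}, matching table 2.26.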
Example 2.19 : Consider a source S = {s1, s2} with probabilities 3/4 and 1/4 respectively. Obtain the Shannon-Fano code for the source S and for its 2nd and 3rd extensions, and calculate the efficiency for each case.

Solution : For the basic source:

Symbol    p_i    Code    l_i in binits
s1        3/4    1       1
s2        1/4    0       1

Table 2.27 : Code table for the basic source of example 2.19

Average length,  L = Σ p_i l_i = (3/4)(1) + (1/4)(1) = 1 binit/message-symbol

Entropy,  H(S) = (3/4) log (4/3) + (1/4) log 4 = 0.8113 bits/message-symbol

Code efficiency,  η_c = H(S)/L = 0.8113/1 = 81.13%

The 2nd extension will have 2^2 = 4 symbols s1s1, s1s2, s2s1 and s2s2 with probabilities 9/16, 3/16, 3/16 and 1/16 respectively.

Symbol    p_i     Code    l_i in binits
s1s1      9/16    1       1
s1s2      3/16    01      2
s2s1      3/16    001     3
s2s2      1/16    000     3

Table 2.28 : Code table for the 2nd extension of example 2.19

The average length L_2 of the 2nd extension is given by

L_2 = Σ p_i l_i = (9/16)(1) + (3/16)(2) + (3/16)(3) + (1/16)(3) = 1.6875 binits/message-symbol

The entropy of the 2nd extended source is given by equation (1.33) as

H(S^2) = 2 H(S) = 2 (0.8113) = 1.6226 bits/message-symbol

Code efficiency of the 2nd extended source:  η_c(2) = H(S^2)/L_2 = 1.6226/1.6875 = 96.15%

The 3rd extension will have 2^3 = 8 symbols with probabilities 27/64, 9/64, 9/64, 9/64, 3/64, 3/64, 3/64 and 1/64.

Symbol      p_i      Code     l_i in binits
s1s1s1      27/64    11       2
s1s1s2      9/64     10       2
s1s2s1      9/64     011      3
s2s1s1      9/64     010      3
s1s2s2      3/64     0011     4
s2s1s2      3/64     0010     4
s2s2s1      3/64     0001     4
s2s2s2      1/64     0000     4

Table 2.29 : Code table for the 3rd extension of example 2.19

The average length L_3 of the 3rd extended source is

L_3 = Σ p_i l_i = (27/64)(2) + (9/64)(2) + (9/64)(3) + (9/64)(3) + (3/64)(4) + (3/64)(4) + (3/64)(4) + (1/64)(4)
    = 2.59375 binits/message-symbol

The entropy of the 3rd extended source is given by equation (1.34) as

H(S^3) = 3 H(S) = 3 (0.8113) = 2.4339 bits/message-symbol

Code efficiency of the 3rd extended source:  η_c(3) = H(S^3)/L_3 = 2.4339/2.59375 = 93.84%

which is less than the efficiency of the 2nd extended source! The following reasoning appears to be relevant for the above case. While following step No. 2 of the Shannon-Fano procedure for the 3rd extension, we grouped (s1s1s1) and (s1s1s2), with a total probability of 36/64, in one group and the remaining symbols, with a total probability of 28/64, in the other group. We could instead have put (s1s1s1) alone, with a probability of 27/64, in one group and the rest of the symbols, with a total probability of 37/64, in the other group. With the former grouping the difference in total probability between the two groups is 8/64, which is less than the difference of 10/64 for the latter case, and according to the rule formulated by Fano the former grouping is the one to be used. With the latter grouping, however, the average length works out to

L_3 = 2.484375 binits/message-symbol

and the code efficiency of the 3rd extended source becomes

η_c(3) = H(S^3)/L_3 = 2.4339/2.484375 = 97.97%

which is greater than the second-extension efficiency of 96.15%.

Note : From the above we can conclude that the symbol with the higher probability should, wherever possible, be made to correspond to a shorter code-word.

Example 2.20 : Consider a discrete memoryless source whose alphabet consists of K equiprobable symbols. (a) Explain why the use of a fixed-length code for the representation of such a source is about as efficient as any code can be. (b) What condition has to be satisfied by K and the code-word length for the coding efficiency to be 100%?

Solution : (a) For a source alphabet S with K equiprobable symbols, the probability of any symbol is given by equation (1.18) as

P(s_k) = p_k = 1/K   for all k = 1, 2, 3, ..., K

The average code-word length is then given by equation (2.15) as L = Σ p_k l_k. If the code-word length remains the same for all code-words (fixed-length code), then

L = l_k = l (say)

Thus we can conclude that, with K symbols in the code, any other choice of code-word lengths gives a value for L no less than 'l'.

(b) The entropy of the source S is given by

H(S) = Σ p_k log (1/p_k) = log_2 K

since the K symbols are equiprobable (refer equation 1.19).

Coding efficiency = η_c = H(S)/L.  Given η_c = 1,  L = H(S) = log_2 K.

The condition to be satisfied is therefore that l = log_2 K must be an integer, i.e., K must be an integer power of 2.
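A short sketch (illustrative, not from the text) makes the conclusion of example 2.20 concrete: with K equiprobable symbols and a fixed-length binary code of l = ceil(log2 K) binits, the efficiency log2(K)/l reaches 100% exactly when K is a power of 2.

```python
import math

def fixed_length_efficiency(K):
    """Coding efficiency H(S)/L for K equiprobable symbols and a fixed-length binary code."""
    H = math.log2(K)               # entropy of K equiprobable symbols, bits/message-symbol
    l = math.ceil(math.log2(K))    # binits needed per fixed-length code-word
    return H / l

for K in (2, 4, 5, 8, 12, 16):
    print(f"K = {K:2d}:  l = {math.ceil(math.log2(K))},  efficiency = {fixed_length_efficiency(K):.2%}")
```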
Example 2.21 : A discrete memoryless source has an alphabet of seven symbols with probabilities for its output as described below:

Symbol        s0      s1      s2      s3      s4      s5       s6
Probability   0.25    0.25    0.125   0.125   0.125   0.0625   0.0625

Compute the Shannon-Fano code for this source and find the coding efficiency. Explain why the source code has an efficiency of 100%.

Solution :

Symbol    p_i       Code    l_i in binits
s0        0.25      11      2
s1        0.25      10      2
s2        0.125     011     3
s3        0.125     010     3
s4        0.125     001     3
s5        0.0625    0001    4
s6        0.0625    0000    4

Table 2.31 : Code table for example 2.21

The average length L is

L = Σ p_i l_i = (0.25)(2) + (0.25)(2) + (0.125)(3) + (0.125)(3) + (0.125)(3) + (0.0625)(4) + (0.0625)(4)
  = 2.625 binits/message-symbol

The entropy H(S) is given by

H(S) = Σ p_i log (1/p_i) = 2 (0.25) log 4 + 3 (0.125) log 8 + 2 (0.0625) log 16 = 2.625 bits/message-symbol

Coding efficiency = η_c = [H(S)/L] x 100% = (2.625/2.625) x 100% = 100%

Thus the coding efficiency is 100%. The reason for the coding efficiency being 100% is as given below:

for i = 0,  p_i = 0.25,    log (1/p_i) = 2 = l_i
for i = 1,  p_i = 0.25,    log (1/p_i) = 2 = l_i
for i = 2,  p_i = 0.125,   log (1/p_i) = 3 = l_i
for i = 3,  p_i = 0.125,   log (1/p_i) = 3 = l_i
for i = 4,  p_i = 0.125,   log (1/p_i) = 3 = l_i
for i = 5,  p_i = 0.0625,  log (1/p_i) = 4 = l_i
for i = 6,  p_i = 0.0625,  log (1/p_i) = 4 = l_i

Thus we observe that l_i = log (1/p_i) for all i = 0, 1, 2, ..., 6: every symbol probability is a negative integer power of 2, so L equals H(S) exactly and hence the coding efficiency is 100%.
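The condition l_i = log2(1/p_i) identified above is easy to check mechanically. The sketch below (illustrative) recomputes L, H(S) and the efficiency for the source of example 2.21 from its probabilities and code-word lengths.

```python
import math

# Source of example 2.21: probabilities and Shannon-Fano code-word lengths (binits).
probs   = [0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625]
lengths = [2, 2, 3, 3, 3, 4, 4]

L = sum(p * l for p, l in zip(probs, lengths))            # average code-word length
H = sum(p * math.log2(1 / p) for p in probs)              # source entropy, bits/message-symbol

print(f"L = {L} binits/message-symbol, H(S) = {H} bits/message-symbol")
print(f"efficiency = {H / L:.2%}")
# Each probability is an exact negative power of 2, so log2(1/p) is an integer equal to l_i:
print(all(math.log2(1 / p) == l for p, l in zip(probs, lengths)))    # True
```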
Example 2.22 : A source produces two symbols S1 and S2 with probabilities 7/8 and 1/8 respectively. Construct a coding scheme using the Shannon-Fano encoding procedure so as to get a code efficiency of at least 75%.

Solution : For the basic source the Shannon-Fano procedure gives a one-binit code-word to each symbol:

Symbol    p_i    Code    l_i in binits
S1        7/8    1       1
S2        1/8    0       1

Table 2.32 : Code table for example 2.22

Average length,  L = (7/8)(1) + (1/8)(1) = 1 binit/message-symbol
Entropy,  H(S) = (7/8) log (8/7) + (1/8) log 8 = 0.54356 bits/message-symbol
Coding efficiency,  η_c = H(S)/L = 54.36%

which is less than the required 75%. Now let us consider the 2nd extension of the source. Its four symbols S1S1, S1S2, S2S1 and S2S2 have probabilities 49/64, 7/64, 7/64 and 1/64, and the Shannon-Fano procedure gives them code-words of 1, 2, 3 and 3 binits respectively. Hence

L_2 = (49/64)(1) + (7/64)(2) + (7/64)(3) + (1/64)(3) = 1.3594 binits/message-symbol
H(S^2) = 2 H(S) = 1.0871 bits/message-symbol
η_c(2) = H(S^2)/L_2 = 1.0871/1.3594 = 79.97%

which is greater than the required value of 75%. Thus the 2nd extension code is the required coding scheme.

Example 2.23 : A source produces two symbols 'A' and 'B' with probabilities 0.05 and 0.95 respectively. Find a suitable binary code such that the efficiency of coding is at least 65%.   [VI sem ECE, July/August]

Solution : For the basic source, each of the two symbols receives a one-binit code-word, so L = 1 binit/message-symbol.

Entropy,  H(S) = 0.95 log (1/0.95) + 0.05 log (1/0.05) = 0.2864 bits/message-symbol

Coding efficiency,  η_c = H(S)/L = 0.2864/1 = 28.64%

This efficiency is less than the given efficiency of 65%. Now let us consider the 2nd extension of the given source. Its symbols AA, AB, BA and BB have probabilities 0.9025, 0.0475, 0.0475 and 0.0025, and the Shannon-Fano code-word lengths are 1, 2, 3 and 3 binits. Hence

L_2 = (0.9025)(1) + (0.0475)(2) + (0.0475)(3) + (0.0025)(3) = 1.1475 binits/message-symbol

The entropy of the 2nd extended source is

H(S^2) = 2 H(S) = 2 (0.2864) = 0.5728 bits/message-symbol

and the code efficiency of the 2nd extended source is

η_c(2) = H(S^2)/L_2 = 0.5728/1.1475 = 49.92%

This efficiency is also less than the given efficiency of 65%. Now let us consider the 3rd extension of the given source:

Symbol    Probability    Code      l_i in binits
AAA       0.857375       1         1
AAB       0.045125       011       3
ABA       0.045125       010       3
BAA       0.045125       001       3
ABB       0.002375       00011     5
BAB       0.002375       00010     5
BBA       0.002375       00001     5
BBB       0.000125       00000     5

Table 2.36 : Code table for the 3rd extension of the source of example 2.23

Average length,  L_3 = (0.857375)(1) + 3 (0.045125)(3) + 3 (0.002375)(5) + (0.000125)(5)
                     = 1.29975 binits/message-symbol

The entropy of the 3rd extended source is

H(S^3) = 3 H(S) = 3 (0.2864) = 0.8592 bits/message-symbol

and the code efficiency of the 3rd extended source is

η_c(3) = H(S^3)/L_3 = 0.8592/1.29975 = 66.11%

This efficiency is greater than the required value of 65%. Thus the 3rd extension code is the required coding scheme.

SHANNON-FANO TERNARY CODE

The seven steps observed for getting a binary code are slightly modified for a ternary code as given below:

1. The symbols are arranged according to non-increasing probabilities.
2. The symbols are divided into three groups so that the sum of probabilities in each group is approximately equal.
3. The symbols of the 1st group are designated by a "2", those of the 2nd group by a "1" and those of the 3rd group by a "0".
4. Each group is subdivided into three more sub-groups such that the sum of probabilities in each sub-group is approximately the same.
5. The symbols of the 1st sub-group are designated by a "2", those of the 2nd sub-group by a "1" and those of the 3rd sub-group by a "0".
6. Each sub-group is again divided into three more sub-groups each and step No. 5 is repeated.
7. This process is continued till further sub-division is impossible.

Example 2.24 : Construct a ternary Shannon-Fano code for the source S = {s1, s2, ..., s7} with P = {0.3, 0.3, 0.12, 0.12, 0.06, 0.06, 0.04} and find its code efficiency and redundancy.

Solution :

Symbol    p_i     Code    l_i in trinits
s1        0.3     2       1
s2        0.3     1       1
s3        0.12    02      2
s4        0.12    01      2
s5        0.06    002     3
s6        0.06    001     3
s7        0.04    000     3

Table 2.37 : Code table for example 2.24

The average length L of the ternary code is given by

L = Σ p_i l_i = (0.3)(1) + (0.3)(1) + (0.12)(2) + (0.12)(2) + (0.06)(3) + (0.06)(3) + (0.04)(3)
  = 1.56 trinits/message-symbol

The entropy H(S) in bits/message-symbol is given by

H(S) = Σ p_i log (1/p_i)
     = 0.3 log (1/0.3) + 0.3 log (1/0.3) + 0.12 log (1/0.12) + 0.12 log (1/0.12) + 0.06 log (1/0.06) + 0.06 log (1/0.06) + 0.04 log (1/0.04)
     = 2.4491 bits/message-symbol

From equation (2.19), the entropy in r-ary units per message-symbol is H_r(S) = H(S)/log_2 r. For r = 3,

H_3(S) = 2.4491/log_2 3 = 1.5452 ternary units/message-symbol

The ternary coding efficiency is therefore

η_c = H_3(S)/L = 1.5452/1.56 = 99.05%

and the code redundancy is R_c = 1 - η_c = 0.95%.

The corresponding code-tree can be drawn as shown in figure 2.7.

[Fig. 2.7 : Code-tree of example 2.24]
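The conversion to ternary units in the last step generalises to any code alphabet of size r: H_r(S) = H(S)/log2(r) and η_c = H_r(S)/L. The short sketch below (illustrative) applies this to the code of table 2.37.

```python
import math

def rary_efficiency(probs, lengths, r):
    """Coding efficiency H_r(S)/L for an r-ary code with the given code-word lengths."""
    H_bits = sum(p * math.log2(1 / p) for p in probs)   # H(S) in bits/message-symbol
    H_r = H_bits / math.log2(r)                         # eq. (2.19): entropy in r-ary units
    L = sum(p * l for p, l in zip(probs, lengths))      # average length in r-ary digits
    return H_r / L

# Example 2.24: ternary Shannon-Fano code of table 2.37
probs   = [0.3, 0.3, 0.12, 0.12, 0.06, 0.06, 0.04]
lengths = [1, 1, 2, 2, 3, 3, 3]
print(f"efficiency = {rary_efficiency(probs, lengths, 3):.2%}")   # ~ 99.05%
```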