DC Sessional I
SECTION A
SECTION B
Attempt any three parts of the following: (3x2=6)
Q1. What do you mean by a uniquely decodable code? Determine whether the following codes are
uniquely decodable or not:
(i) {0, 01, 11, 111}
(ii) {1, 10, 110, 111}
(iii) {0, 01, 110, 111}
(iv) {0, 01, 10}
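Unique decodability can be tested mechanically with the Sardinas–Patterson algorithm: repeatedly generate the set of dangling suffixes and declare the code ambiguous if any suffix is itself a codeword. A minimal Python sketch of this test (function names are illustrative, and codewords are assumed to be given as strings of 0s and 1s):

    def dangling_suffixes(A, B):
        # Suffixes w such that some a in A is a proper prefix of some b in B (b = a + w, w non-empty)
        return {b[len(a):] for a in A for b in B if b != a and b.startswith(a)}

    def is_uniquely_decodable(codewords):
        # Sardinas-Patterson test: the code is uniquely decodable iff no dangling
        # suffix generated from the codewords is itself a codeword.
        code = set(codewords)
        suffixes = dangling_suffixes(code, code)
        seen = set()
        while suffixes:
            if suffixes & code:          # a codeword reappears as a suffix -> ambiguity
                return False
            seen |= suffixes
            new = dangling_suffixes(code, suffixes) | dangling_suffixes(suffixes, code)
            suffixes = new - seen
        return True

    for c in [{"0", "01", "11", "111"}, {"1", "10", "110", "111"},
              {"0", "01", "110", "111"}, {"0", "01", "10"}]:
        print(sorted(c), is_uniquely_decodable(c))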
Q2. Find the entropy of the given sequence {1, 2, 3, 2, 3, 4, 5, 4, 5, 6, 7, 8, 9, 10, 8, 9}.
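For a sequence like the one in Q2, the first-order entropy is H = -Σ p_i log2 p_i, with each p_i estimated as the relative frequency of symbol i in the sequence. A short Python sketch of that calculation:

    from collections import Counter
    from math import log2

    seq = [1, 2, 3, 2, 3, 4, 5, 4, 5, 6, 7, 8, 9, 10, 8, 9]
    counts = Counter(seq)
    n = len(seq)
    # H = -sum(p_i * log2(p_i)), with p_i taken as the relative frequency of symbol i
    entropy = -sum((c / n) * log2(c / n) for c in counts.values())
    print(f"H = {entropy:.3f} bits/symbol")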
Q3. Differentiate between lossless and lossy compression.
Q4. Draw the Huffman tree for the following symbols, whose frequencies of occurrence in a message
text are stated alongside each symbol below: A:15, B:6, C:7, D:12, E:25, F:4, G:6, H:10, I:15.
Decode the message 1110100010111011.
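The Huffman tree in Q4 is built by repeatedly merging the two lowest-frequency nodes. The Python sketch below does this with a heap; note that which child is labelled 0 and which 1 is a convention, so the resulting codewords (and hence the decoding of 1110100010111011) may differ from the particular tree intended by the question.

    import heapq

    def huffman_code(freqs):
        # Repeatedly merge the two lowest-weight nodes; prepend '0'/'1' to the
        # codewords of the symbols under each merged subtree.
        heap = [[w, [sym, ""]] for sym, w in freqs.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            lo = heapq.heappop(heap)
            hi = heapq.heappop(heap)
            for pair in lo[1:]:
                pair[1] = "0" + pair[1]
            for pair in hi[1:]:
                pair[1] = "1" + pair[1]
            heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
        return dict(heap[0][1:])

    print(huffman_code({"A": 15, "B": 6, "C": 7, "D": 12, "E": 25,
                        "F": 4, "G": 6, "H": 10, "I": 15}))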
SECTION C
Attempt any two parts of the following: (2x5=10)
Q1. Explain the Adaptive Huffman coding procedure and encode the message {a, a, r, d, v, a} using
Adaptive Huffman coding.
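Full Adaptive Huffman coding (FGK/Vitter) updates the code tree in place after every symbol, using an NYT (not-yet-transmitted) node for first occurrences and sibling-property swaps. The Python sketch below is deliberately simpler: it only mimics the adaptive idea by sending an escape for a new symbol and otherwise rebuilding a static Huffman code from the counts seen so far, so its output is illustrative rather than the FGK bitstream expected in the answer.

    import heapq
    from collections import Counter

    def huffman_from_counts(counts):
        # Static Huffman code for the symbols seen so far.
        if len(counts) == 1:                      # single-symbol edge case
            return {next(iter(counts)): "0"}
        heap = [[w, [s, ""]] for s, w in counts.items()]
        heapq.heapify(heap)
        while len(heap) > 1:
            lo, hi = heapq.heappop(heap), heapq.heappop(heap)
            for pair in lo[1:]:
                pair[1] = "0" + pair[1]
            for pair in hi[1:]:
                pair[1] = "1" + pair[1]
            heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
        return dict(heap[0][1:])

    def naive_adaptive_encode(message):
        counts, out = Counter(), []
        for sym in message:
            if counts[sym] == 0:
                out.append(f"<NYT>{sym}")          # escape: send the raw symbol on first occurrence
            else:
                out.append(huffman_from_counts(counts)[sym])
            counts[sym] += 1                       # encoder and decoder update the model identically
        return out

    print(naive_adaptive_encode(["a", "a", "r", "d", "v", "a"]))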
Q2. Discuss various Data Compression models in detail.
Q3. Discuss the importance of modeling and coding in the compression process, with an example. How
does information theory relate to lossless compression? Explain some key concepts of information
theory and their relevance to compression.
Q4. Explain, with an example, the difference between a minimum-variance Huffman code and an
ordinary Huffman code. For an alphabet A = {a1, a2, a3, a4, a5} with probabilities P(a1) = 0.15,
P(a2) = 0.04, P(a3) = 0.26, P(a4) = 0.05, and P(a5) = 0.50:
(i) Calculate the entropy of this source.
(ii) Find the Huffman code for this source.
(iii) Find the average length of the code in (ii) and its redundancy.
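Parts (i)–(iii) of Q4 can be checked numerically: the entropy is H = -Σ P(ai) log2 P(ai), the average length is L = Σ P(ai)·l(ai) over the Huffman codeword lengths, and the redundancy is L − H. A small Python sketch follows; the exact codewords depend on tie-breaking, but the average length L is the same for every valid Huffman code.

    import heapq
    from math import log2

    probs = {"a1": 0.15, "a2": 0.04, "a3": 0.26, "a4": 0.05, "a5": 0.50}

    # (i) Entropy of the source
    H = -sum(p * log2(p) for p in probs.values())

    # (ii) Huffman code via repeated merging of the two least probable nodes
    heap = [[p, [s, ""]] for s, p in probs.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    code = dict(heap[0][1:])

    # (iii) Average code length and redundancy (average length minus entropy)
    L = sum(probs[s] * len(cw) for s, cw in code.items())
    print(f"H = {H:.4f} bits/symbol, L = {L:.2f} bits/symbol, redundancy = {L - H:.4f} bits/symbol")
    print(code)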