Roll No.: ____________

Vivekananda College of Technology & Management, Aligarh


Sessional Test I (AKTU) EVEN Semester Examination - 2024-25

Course Name: B. Tech.        Year of Study: 3rd Yr. (A & B)

Name of Department: Computer Science & Engg.        Time: 1:30 Hours

Subject: DATA COMPRESSION (BCS-064)        Max. Marks: 20

SECTION A

All parts of this section are compulsory: (4x1=4)


Q1. What is data compression? Why is it needed?
Q2. Briefly define entropy.
Q3. What are the different measures of performance of a data compression algorithm?
Q4. Define compression ratio. A compression scheme reduces a 10 MB file to 2 MB. Find the
compression ratio.
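
A minimal Python sketch of the arithmetic in Q4, assuming the common convention that the compression ratio is the original size divided by the compressed size (some texts quote the reciprocal or a percentage):

    def compression_ratio(original_size, compressed_size):
        # Ratio of original size to compressed size; units cancel, so any
        # consistent unit (MB here) works.
        return original_size / compressed_size

    print(compression_ratio(10, 2))  # 10 MB -> 2 MB gives 5.0, i.e. a 5:1 ratio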

SECTION B
Attempt any three parts of the following: (3x2=6)
Q1. What do you mean by a uniquely decodable code? Determine whether the following codes are
uniquely decodable or not:
(i) {0, 01, 11, 111}    (ii) {1, 10, 110, 111}    (iii) {0, 01, 110, 111}    (iv) {0, 01, 10}
Q2. Find the entropy of the sequence {1, 2, 3, 2, 3, 4, 5, 4, 5, 6, 7, 8, 9, 10, 8, 9}. (A worked sketch follows this section.)
Q3. Differentiate between lossless and lossy compression.
Q4. Draw the Huffman tree for the following symbols, whose frequencies of occurrence in a message
text are stated below: A:15, B:6, C:7, D:12, E:25, F:4, G:6, H:10, I:15.
Decode the message 1110100010111011
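
For Q2 above, a minimal Python sketch of a first-order entropy estimate, assuming each symbol's probability is taken as its relative frequency in the sequence:

    from collections import Counter
    from math import log2

    seq = [1, 2, 3, 2, 3, 4, 5, 4, 5, 6, 7, 8, 9, 10, 8, 9]
    counts = Counter(seq)
    n = len(seq)
    # H = -sum p * log2(p), with p estimated as count / n for each distinct symbol
    entropy = -sum((c / n) * log2(c / n) for c in counts.values())
    print(f"estimated entropy = {entropy:.2f} bits/symbol")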

SECTION C
Attempt any two parts of the following: (2x5=10)
Q1. Explain the Adaptive Huffman coding procedure and encode the message {a, a, r, d, v, a} using
Adaptive Huffman coding.
Q2. Discuss various data compression models in detail.
Q3. Discuss the importance of modeling and coding in the compression process with an example. How
does information theory relate to lossless compression? Can you explain some key concepts in
information theory and their relevance to compression?
Q4. Explain the difference between a minimum-variance Huffman code and an ordinary Huffman code,
with an example. Consider an alphabet A = {a1, a2, a3, a4, a5} with probabilities P(a1) = 0.15,
P(a2) = 0.04, P(a3) = 0.26, P(a4) = 0.05 and P(a5) = 0.50.
(i) Calculate the entropy of this source. (ii) Find the Huffman code for this source.
(iii) Find the average length of the code in (ii) and its redundancy.
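
For the source in Q4, a minimal Python sketch that builds a Huffman code with a heap and reports the entropy, average length, and redundancy. The bit assignment it produces is one valid Huffman code, not necessarily the minimum-variance variant discussed in the question:

    import heapq
    from math import log2

    def huffman_code(probs):
        # Heap entries: (probability, tie-breaker, {symbol: partial codeword}).
        heap = [(p, i, {s: ""}) for i, (s, p) in enumerate(probs.items())]
        heapq.heapify(heap)
        tie = len(heap)
        while len(heap) > 1:
            p1, _, c1 = heapq.heappop(heap)
            p2, _, c2 = heapq.heappop(heap)
            # Merge the two least probable subtrees, prefixing their codewords
            # with 0 and 1 respectively.
            merged = {s: "0" + c for s, c in c1.items()}
            merged.update({s: "1" + c for s, c in c2.items()})
            heapq.heappush(heap, (p1 + p2, tie, merged))
            tie += 1
        return heap[0][2]

    probs = {"a1": 0.15, "a2": 0.04, "a3": 0.26, "a4": 0.05, "a5": 0.50}
    code = huffman_code(probs)
    entropy = -sum(p * log2(p) for p in probs.values())
    avg_len = sum(probs[s] * len(code[s]) for s in probs)
    print(code)
    print(f"entropy = {entropy:.4f} bits/symbol")
    print(f"average length = {avg_len:.2f} bits/symbol, redundancy = {avg_len - entropy:.4f}")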
