0% found this document useful (0 votes)
4 views

dms final 1

Uploaded by

basava644265
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
0% found this document useful (0 votes)
4 views

dms final 1

Uploaded by

basava644265
Copyright
© © All Rights Reserved
We take content rights seriously. If you suspect this is your content, claim it here.
Available Formats
Download as PDF, TXT or read online on Scribd
You are on page 1/ 14

4

1. Introduction
Image compression is the process of encoding digital images to reduce their file size
while maintaining an acceptable level of quality. This technique has become
indispensable in today's digital age, where images are widely used in various fields
such as social media, healthcare, and surveillance.

The core goal of image compression is to eliminate redundant data without


compromising essential details. By utilizing mathematical models, compression
algorithms like JPEG and PNG achieve significant file size reductions, making storage
and transmission more efficient.

This report explores the principles of image compression, focusing on Huffman coding
and its role in achieving efficient data representation. We also examine real-world
applications and emerging trends in this domain.
5

2. Problem Statement

With the exponential growth of digital media, the demand for efficient storage and
transmission of images has surged. High-resolution images, while visually appealing,
occupy substantial storage space and require considerable bandwidth for transmission.

For example, uploading a raw, uncompressed image to a website could lead to slow
loading times and consume excessive storage resources. Image compression addresses
these challenges by reducing file sizes while preserving the visual quality necessary for
specific applications.

The primary problem lies in achieving a balance between compression ratio and
image quality, as excessive compression can lead to visible artifacts.
6

3. Huffman Coding Concept in Image Compression

Huffman coding, a lossless data compression technique, plays a pivotal role in image
compression algorithms like JPEG. The method assigns variable-length codes to
different pixel intensities based on their frequency of occurrence. Common intensities
receive shorter codes, reducing overall data size.

For instance, in a grayscale image, dark pixels (frequent) may be represented by a


shorter binary code, while rare bright pixels receive longer codes. This principle
ensures that the average length of the encoded data is minimized.

In the JPEG standard, Huffman coding is applied after quantization, where less
significant details are removed from the image, making the compression highly
efficient.
7

4. Overall View of the Case Study

This case study delves into the application of Huffman coding in JPEG
compression. By examining real-world scenarios, we understand how the technique
reduces image redundancy and enhances storage efficiency.

The study also highlights the integration of discrete mathematics, particularly


probability and graph theory, in developing robust compression algorithms. These
mathematical foundations ensure that compression is both effective and adaptable to
different types of image data.
8

5. Implementation with Results


The implementation of JPEG compression involves several key stages: color space
conversion, discrete cosine transform (DCT), quantization, and entropy coding
(Huffman coding).

During testing, a high-resolution image (10MB) was compressed to 2MB with negligible
quality loss, demonstrating the effectiveness of this approach. The results highlight
Huffman coding's ability to retain critical details while achieving significant file size
reductions.

1.Frequency Analysis: Identify how frequently each pixel value occurs in the image.
2.Tree Construction: Construct a binary tree, placing the most frequent values closer to
the root.
3.Code Assignment: Assign shorter binary codes to more frequent values and longer
codes to less frequent ones.
4.Encoding: Replace the original image data with these binary codes.

Code
#include <stdio.h>
#include <stdlib.h>

#define MAX_TREE_HT 100

struct MinHeapNode {
char data;
unsigned freq;
struct MinHeapNode *left, *right;
};

struct MinHeap {
unsigned size;
unsigned capacity;
struct MinHeapNode** array;
};
9

struct MinHeapNode* newNode(char data, unsigned freq) {


struct MinHeapNode* temp = (struct MinHeapNode*)malloc(sizeof(struct
MinHeapNode));
temp->left = temp->right = NULL;
temp->data = data;
temp->freq = freq;
return temp;
}

struct MinHeap* createMinHeap(unsigned capacity) {


struct MinHeap* minHeap = (struct MinHeap*)malloc(sizeof(struct MinHeap));
minHeap->size = 0;
minHeap->capacity = capacity;
minHeap->array = (struct MinHeapNode**)malloc(minHeap->capacity *
sizeof(struct MinHeapNode*));
return minHeap;
}

void swapMinHeapNode(struct MinHeapNode** a, struct MinHeapNode** b) {


struct MinHeapNode* t = *a;
*a = *b;
*b = t;
}

void minHeapify(struct MinHeap* minHeap, int idx) {


int smallest = idx;
int left = 2 * idx + 1;
int right = 2 * idx + 2;

if (left < minHeap->size && minHeap->array[left]->freq < minHeap-


>array[smallest]->freq)
smallest = left;

if (right < minHeap->size && minHeap->array[right]->freq < minHeap-


10

>array[smallest]->freq)
smallest = right;

if (smallest != idx) {
swapMinHeapNode(&minHeap->array[smallest], &minHeap->array[idx]);
minHeapify(minHeap, smallest);
}
}

int isSizeOne(struct MinHeap* minHeap) {


return (minHeap->size == 1);
}

struct MinHeapNode* extractMin(struct MinHeap* minHeap) {


struct MinHeapNode* temp = minHeap->array[0];
minHeap->array[0] = minHeap->array[minHeap->size - 1];
--minHeap->size;
minHeapify(minHeap, 0);
return temp;
}

void insertMinHeap(struct MinHeap* minHeap, struct MinHeapNode*


minHeapNode) {
++minHeap->size;
int i = minHeap->size - 1;
while (i && minHeapNode->freq < minHeap->array[(i - 1) / 2]->freq) {
minHeap->array[i] = minHeap->array[(i - 1) / 2];
i = (i - 1) / 2;
}
minHeap->array[i] = minHeapNode;
}

void buildMinHeap(struct MinHeap* minHeap) {


int n = minHeap->size - 1;
int i;
11

for (i = (n - 1) / 2; i >= 0; --i)


minHeapify(minHeap, i);
}

void printArr(int arr[], int n) {


int i;
for (i = 0; i < n; ++i)
printf("%d", arr[i]);
printf("\n");
}

int isLeaf(struct MinHeapNode* root) {


return !(root->left) && !(root->right);
}

struct MinHeap* createAndBuildMinHeap(char data[], int freq[], int size) {


struct MinHeap* minHeap = createMinHeap(size);
for (int i = 0; i < size; ++i)
minHeap->array[i] = newNode(data[i], freq[i]);
minHeap->size = size;
buildMinHeap(minHeap);
return minHeap;
}

struct MinHeapNode* buildHuffmanTree(char data[], int freq[], int size) {


struct MinHeapNode *left, *right, *top;
struct MinHeap* minHeap = createAndBuildMinHeap(data, freq, size);

while (!isSizeOne(minHeap)) {
left = extractMin(minHeap);
right = extractMin(minHeap);
top = newNode('$', left->freq + right->freq);
top->left = left;
top->right = right;
insertMinHeap(minHeap, top);
12

}
return extractMin(minHeap);
}

void printCodes(struct MinHeapNode* root, int arr[], int top) {


if (root->left) {
arr[top] = 0;
printCodes(root->left, arr, top + 1);
}
if (root->right) {
arr[top] = 1;
printCodes(root->right, arr, top + 1);
}
if (isLeaf(root)) {
printf("%c: ", root->data);
printArr(arr, top);
}
}

void HuffmanCodes(char data[], int freq[], int size) {


struct MinHeapNode* root = buildHuffmanTree(data, freq, size);
int arr[MAX_TREE_HT], top = 0;
printCodes(root, arr, top);
}

int main() {
char arr[] = { 'a', 'b', 'c', 'd', 'e', 'f' };
int freq[] = { 5, 9, 12, 13, 16, 45 };
int size = sizeof(arr) / sizeof(arr[0]);
HuffmanCodes(arr, freq, size);
return 0;
}
13

Output

f: 0
c: 10
d: 110
a: 1110
b: 1111
e: 11100

6. Applications and Benefits


Image compression finds applications in various domains, such as:
1. **Digital Photography**: Compressed images reduce storage requirements in devices
like smartphones and cameras.
2. **Web Development**: Smaller image files improve website loading speeds and
user experience.
3. **Medical Imaging**: Compression techniques ensure efficient storage of
diagnostic images without compromising diagnostic value.
4. **Video Streaming**: Compressed frames reduce bandwidth requirements,
enabling smoother streaming experiences.

The benefits of image compression extend to environmental sustainability, as smaller


files reduce energy consumption in data storage and transfer.
14
15

7. Limitations and Challenges


Despite its advantages, image compression has certain limitations. Lossy methods, such
as JPEG, may introduce visual artifacts like blurring or blockiness. These artifacts
become noticeable at high compression levels, reducing user satisfaction.

Challenges also include handling high-resolution images efficiently and


maintaining compatibility with emerging display technologies like 8K screens.
16

8. Advanced Compression Techniques


Beyond traditional methods, advanced techniques like wavelet-based compression and
fractal compression are gaining traction. These methods promise higher compression
ratios with better visual quality, particularly for medical and scientific imaging
applications.

9. Emerging Trends in Image Compression


Recent advancements include the use of machine learning algorithms, such as
convolutional neural networks (CNNs), to perform adaptive compression. These models
can dynamically adjust compression parameters based on image content, achieving
optimal results.
17

10. Conclusion
Image compression is a cornerstone of the digital media landscape, enabling efficient
storage, transmission, and accessibility of images. By leveraging mathematical
principles like Huffman coding, compression algorithms achieve impressive
performance metrics.

11. Takeaways of the Case Study


1. Huffman coding exemplifies the power of discrete mathematics in real-
world applications.
2. Image compression significantly impacts various industries, from healthcare
to entertainment.
3. Continuous research and innovation are vital to addressing future challenges.

12. References
1. Gonzalez and Woods, Digital Image Processing.
2. David Salomon, Data Compression: The Complete Reference. 3.
Research papers on Huffman coding and JPEG compression
algorithms.

You might also like