
18ECE011T - DIGITAL IMAGE PROCESSING

UNIT-1
DIGITAL IMAGE FUNDAMENTALS
Introduction

“One picture is worth more than ten thousand words”
Anonymous
What is a Digital Image?

A digital image is a representation of a two-dimensional image as a finite set of digital values, called picture elements or pixels.

(Images in this unit are taken from Gonzalez & Woods, Digital Image Processing, 2002.)
What is a Digital Image? (cont…)

Pixel values typically represent gray levels, colours, heights, opacities, etc.

Remember that digitization implies that a digital image is an approximation of a real scene.
What is a Digital Image? (cont…)

Common image formats include:
– 1 sample per point (B&W or Grayscale)
– 3 samples per point (Red, Green, and Blue)
– 4 samples per point (Red, Green, Blue, and “Alpha”, a.k.a. Opacity)

For most of this course we will focus on grey-scale images.
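These formats can be pictured as array shapes. A minimal sketch in Python with NumPy (the course exercises use MATLAB, but the idea is the same; the image size here is an arbitrary choice):

```python
import numpy as np

# Hypothetical image size, chosen only for illustration.
h, w = 4, 6

gray = np.zeros((h, w), dtype=np.uint8)      # 1 sample per point (grayscale)
rgb = np.zeros((h, w, 3), dtype=np.uint8)    # 3 samples per point (R, G, B)
rgba = np.zeros((h, w, 4), dtype=np.uint8)   # 4 samples per point (R, G, B, alpha)

print(gray.shape, rgb.shape, rgba.shape)
```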


What is Digital Image Processing?

Digital image processing focuses on two major tasks:
– Improvement of pictorial information for human interpretation
– Processing of image data for storage, transmission and representation for autonomous machine perception

There is some argument about where image processing ends and fields such as image analysis and computer vision start.
What is DIP? (cont…)

The continuum from image processing to computer vision can be broken up into low-, mid- and high-level processes:

– Low-level process: input is an image, output is an image. Examples: noise removal, image sharpening.
– Mid-level process: input is an image, output is attributes. Examples: object recognition, segmentation.
– High-level process: input is attributes, output is understanding. Examples: scene understanding, autonomous navigation.

In this course we will stop at the mid-level processes.
History of Digital Image Processing

Early 1920s: One of the first applications of digital imaging was in the newspaper industry.
– The Bartlane cable picture transmission service was an early digital image service.
– Images were transferred by submarine cable between London and New York.
– Pictures were coded for cable transfer and reconstructed at the receiving end on a telegraph printer.
History of DIP (cont…)

Mid to late 1920s: Improvements to the Bartlane system resulted in higher quality images.
– New reproduction processes based on photographic techniques
– Increased number of tones in reproduced images

(Figures: an early 15-tone digital image and an improved digital image.)
History of DIP (cont…)

1960s: Improvements in computing technology and the onset of the space race led to a surge of work in digital image processing.
– 1964: Computers were used to improve the quality of images of the moon taken by the Ranger 7 probe.
– Such techniques were used in other space missions, including the Apollo landings.

(Figure: a picture of the moon taken by the Ranger 7 probe minutes before landing.)
History of DIP (cont…)

1970s: Digital image processing begins to be used in medical applications.
– 1979: Sir Godfrey N. Hounsfield & Prof. Allan M. Cormack share the Nobel Prize in medicine for the invention of tomography, the technology behind Computerised Axial Tomography (CAT) scans.

(Figure: a typical head-slice CAT image.)
History of DIP (cont…)

1980s to today: The use of digital image processing techniques has exploded, and they are now used for all kinds of tasks in all kinds of areas:
– Image enhancement/restoration
– Artistic effects
– Medical visualisation
– Industrial inspection
– Law enforcement
– Human computer interfaces
Examples: Image Enhancement

One of the most common uses of DIP techniques: improve quality, remove noise, etc.
Examples: Industrial Inspection

Human operators are expensive, slow and unreliable, so we make machines do the job instead. Industrial vision systems are used in all kinds of industries.

Can we trust them?
Machines vs humans
• Among the five senses, vision is considered the most vital one for a human being.
• But a human being can perceive only the visible part of the electromagnetic spectrum.
• Machines, however, can span the entire electromagnetic spectrum, from gamma rays to radio waves.
Examples of Gamma Ray Images
(Bone scan and PET images)
X-ray Imaging
• X-ray imaging is used in medicine and astronomy.
• We can get images of blood vessels in angiography.
• It is also used in Computerized Axial Tomography (CAT) to generate a 3-D rendition of a patient.
• High-energy X-ray images are used in industrial processes (e.g. inspecting electronic circuit boards).
Examples of X-ray Images
(Chest X-ray and Circuit boards)
Examples: Law Enforcement

Image processing techniques are used extensively by law enforcers:
– Number plate recognition for speed cameras/automated toll systems
– Fingerprint recognition
– Enhancement of CCTV images
Fundamental steps in Digital Image Processing

Key stages in digital image processing, starting from the problem domain:
– Image Acquisition
– Image Enhancement
– Image Restoration
– Colour Image Processing
– Image Compression
– Morphological Processing
– Segmentation
– Representation & Description
– Object Recognition
Fundamental steps of DIP
• Image acquisition – This stage involves preprocessing, such as scaling.
• Image enhancement – Here we bring out details that were obscured, or highlight features of interest in an image, e.g. increasing the contrast of an image.
• Image Restoration – This also improves the appearance of an image but, unlike enhancement, which is subjective, restoration is objective.
• Color Image Processing – Due to the Internet, this area is becoming popular; various color models are worth knowing.
• Wavelets – Representing images at various degrees of resolution in a wavelet basis.
Fundamental steps of DIP
• Compression – A technique for reducing the storage required to save an image or the bandwidth needed to transmit it.
• Morphological Processing – Deals with tools for extracting image components that are useful in the representation and description of shape.
• Segmentation – These procedures partition an image into its constituent parts or objects.
• Representation and description – Follows the output of a segmentation stage. It uses either the boundary of a region or all the points in the region itself. Description (also called feature selection) deals with extracting attributes that are basic for differentiating one class of objects from another.
• Recognition – The process that assigns a label (e.g. vehicle) to an object based on its descriptors.
Components of an Image Processing System
Basic components of a general-purpose system used for digital image processing
• Image sensors – Two elements are needed to acquire digital images.
• The first is the physical device that is sensitive to the energy radiated by the object that we want to image.
• The second, called the digitizer, is a device for converting the output of the physical sensing device into digital form.
• (eg) In a digital video camera, the sensors produce an electrical output proportional to light intensity, and the digitizer converts these outputs to digital data.
• Specialized image processing hardware – Consists of the digitizer plus hardware that performs other primitive operations, such as an arithmetic logic unit (ALU), which performs arithmetic and logical operations on an entire image.
• This type of hardware is also called a front-end subsystem, and its distinguishing characteristic is speed.
• This unit does things that require fast data throughput which the main computer cannot handle.
• Computer – In an image processing system this is a general-purpose computer.
• Software – Consists of specialized modules that perform specific tasks (e.g. MATLAB).
• Mass storage – An image of size 1024 X 1024, storing the intensity of each pixel in 8 bits, requires one megabyte of storage.
• For short-term storage, we can use computer memory.
• Another method is to use a specialized board called a frame buffer, which stores one or more images and can be accessed rapidly.
• Frame buffers enable us to instantaneously zoom, scroll (vertical shift) and pan (horizontal shift).
• For on-line storage, magnetic disks or optical media are used.
• Archival storage needs massive capacity but is accessed infrequently.
• Image displays – These are mainly color TV monitors.
• Hardcopy – These devices include laser printers, film cameras, inkjet units, etc.
Brightness Adaptation
• There are 2 phenomena which clearly demonstrate that perceived brightness is not a simple function of intensity: Mach bands and simultaneous contrast.
• The visual system tends to undershoot or overshoot around the boundary of regions of different intensities.
• These scalloped bands near the boundaries are called Mach bands.
Mach Band

(Figure: intensity versus pixel position for a Mach band pattern; the intensity steps from 0 to about 250 across the image.)
Simultaneous Contrast
• Simultaneous contrast means that a region’s perceived brightness does not depend simply on its intensity.
• For example, all centre squares have exactly the same intensity.
• But they appear to the eye to become darker as the background gets lighter.
(Figures: a simultaneous-contrast demonstration and a chessboard image.)
Exercises
• Write a matlab code to create the Mach bands and observe the transition of intensities for various pixel positions in the Mach band.
• Write a matlab code to simulate simultaneous contrast - create a small gray-valued square inside a larger black square, change the intensity of the black square and observe the contrast variation.
• Write a matlab code to generate a chessboard consisting of eight alternating black and white squares.
• Write a matlab code to generate the 1D barcodes (EAN-13) that are used in ISBN.
• Write a matlab code to generate 2D barcodes.
• Write a matlab code to generate the visual code for the given ID (83 bits).
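The exercises ask for MATLAB; as a hedged illustration of the first one, here is an equivalent sketch in Python with NumPy (the band widths and gray levels are arbitrary choices):

```python
import numpy as np

# Build a Mach band test image: vertical bands of increasing gray level.
levels = [0, 50, 100, 150, 200, 250]   # arbitrary step intensities
band_width, height = 40, 100           # arbitrary sizes

row = np.repeat(np.array(levels, dtype=np.uint8), band_width)
image = np.tile(row, (height, 1))      # replicate the row vertically

print(image.shape)            # (100, 240)
print(image[0, ::band_width]) # one sample per band
```

Displaying `image` (e.g. with `imshow`) shows the perceived overshoot and undershoot at each band boundary, even though every band is perfectly flat.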
Image Formation Model
• When an image is generated from a physical process, its values are proportional to the energy radiated by a physical source (e.g. electromagnetic waves).
• Hence f(x,y) must be nonzero and finite, that is
  0 < f(x,y) < ∞
• The function f(x,y) is characterized by 2 components:
• 1) The amount of source illumination incident on the scene being viewed, called the illumination component, denoted i(x,y).
• 2) The amount of illumination reflected by the objects in the scene, called the reflectance component, denoted r(x,y).
Image Formation Model
  f(x,y) = i(x,y) r(x,y)
• where 0 < f(x,y) < ∞
• and 0 ≤ r(x,y) ≤ 1
• When r(x,y) is zero, we have total absorption, and when it is 1, we have total reflectance.
• The nature of i(x,y) is determined by the illumination source, and r(x,y) is determined by the characteristics of the imaged objects.
• If the images are formed via transmission of illumination through a medium, as in a chest X-ray, we use a transmissivity instead of a reflectivity function.
Illumination values of objects
• On a clear day, the sun produces about 90,000 lm/m² of illumination on the surface of the Earth; it is about 10,000 lm/m² on a cloudy day.
• On a clear evening, a full moon gives about 0.1 lm/m² of illumination.
• The typical illumination of an office is about 1,000 lm/m².
• Typical values of r(x,y) are: 0.01 for black velvet, 0.65 for stainless steel, 0.80 for white wall paint, 0.90 for silver-plated metal and 0.93 for snow.
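Combining the image formation model with these sample values gives a quick sanity check. A brief sketch in Python (pairing the office illumination with two of the listed reflectances is just one illustrative choice):

```python
# f(x,y) = i(x,y) * r(x,y): intensity = illumination times reflectance.
i_office = 1000.0   # lm/m^2, typical office illumination (from the slide)
r_steel = 0.65      # reflectance of stainless steel (from the slide)
r_velvet = 0.01     # reflectance of black velvet (from the slide)

f_steel = i_office * r_steel
f_velvet = i_office * r_velvet

# Bounds from the formation model: 0 < f < infinity, 0 <= r <= 1.
assert 0 < f_steel and 0 <= r_steel <= 1
print(f_steel, f_velvet)   # 650.0 10.0
```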
Image Sampling
• The output of many sensors is a continuous voltage waveform.
• To get a digital image, we need to convert this voltage into digital form.
• This involves 2 processes, namely sampling and quantization.
• An image is continuous with respect to the x and y coordinates and in amplitude.
• Digitizing the coordinate values is called sampling.
• Digitizing the amplitude values is called quantization.
Sampling and Quantization
• Let us consider a gray scale image.
• We can take intensity values along a particular line.
• Subsequently, we consider a few equally spaced points (discrete locations) along this line and mark the intensity values at these points, called sampling points.
• But the values of amplitude are still continuous in nature.
• The gray level values must also be converted (quantized) into discrete quantities.
• This is called quantization.
• For example, we can convert the gray level range into 4 levels by assigning one of the 4 discrete gray levels (the closest one) to each sample.
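These two steps can be sketched in Python with NumPy (the scan-line profile, sampling interval and the 4 quantization levels are all arbitrary choices for illustration):

```python
import numpy as np

# A "continuous" intensity profile along a scan line, approximated densely.
x = np.linspace(0, 1, 1000)
signal = 100 + 80 * np.sin(2 * np.pi * x)   # values roughly in [20, 180]

# Sampling: keep every 100th point (10 equally spaced samples).
samples = signal[::100]

# Quantization: map each sample to the nearest of 4 discrete levels.
levels = np.array([25.0, 75.0, 125.0, 175.0])
nearest = np.argmin(np.abs(samples[:, None] - levels[None, :]), axis=1)
quantized = levels[nearest]

print(len(samples), sorted(set(quantized.tolist())))
```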
(Figures: an image to be sampled and quantized along a scan line, and the intensity variation sampled at regular intervals along that line.)
(Figure: four-level intensity quantization of the sampled scan line.)
Representing a Digital Image
Spatial and Gray-Level Resolution
• Sampling determines the spatial resolution of an image.
• Spatial resolution is the smallest discernible detail in an image, commonly measured in line pairs per unit distance.
• Gray-level resolution is the smallest discernible change in gray level.
Image to be subsampled

(Figures: a 512 X 512 image and subsampled versions of size 256 X 256 and 128 X 128.)
Subsampling
• Subsampling is achieved by deleting an appropriate number of rows and columns from the original image.
• For example, we can get a 256 X 256 image by deleting every other row and column from the 512 X 512 image.
• In order to see the effect of subsampling, we can replicate appropriate rows and columns of the 256 X 256 image to bring it back to 512 X 512 size.
• We then notice a checkerboard pattern.
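The deletion and replication steps can be sketched with NumPy slicing (an 8 X 8 gradient stands in for the 512 X 512 image):

```python
import numpy as np

# Stand-in for a 512 x 512 image: an 8 x 8 gradient.
original = np.arange(64, dtype=np.uint8).reshape(8, 8)

# Subsample: keep every other row and column (8x8 -> 4x4).
sub = original[::2, ::2]

# Replicate rows and columns to restore the original size; the lost
# detail shows up as blocky artifacts (the checkerboard effect).
restored = np.repeat(np.repeat(sub, 2, axis=0), 2, axis=1)

print(sub.shape, restored.shape)
```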
(Figures: the 512 X 512 original alongside subsampled-and-replicated versions, all displayed at 512 X 512, showing the checkerboard effect.)
False contouring
• We can keep the number of samples constant and reduce the number of gray levels from 256 to 128, 64, etc.
• This creates an almost imperceptible set of very fine ridgelike structures in areas of smooth gray levels, called false contouring.
• It becomes prominent in an image displayed with 16 or fewer gray levels.

(Figures: the original image and a quantized version.)
Zooming and Shrinking Digital Images
• Zooming can be viewed as oversampling, and shrinking as undersampling.
• Zooming is a 2-step process: the creation of new pixel locations and the assignment of gray levels to those new locations.
• For example, say we want to zoom an image of size 500 X 500 to 750 X 750.
• We can use nearest neighbor interpolation for zooming.
• Pixel replication is a special case of nearest neighbor interpolation, used to zoom an image by an integer number of times.
• Here the new locations are exact duplicates of old locations.
• It is very fast but produces a checkerboard effect and hence is undesirable for larger magnification.
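Pixel replication for an integer zoom factor can be sketched as follows (Python with NumPy; the 2 X 2 image and factor of 3 are arbitrary):

```python
import numpy as np

def zoom_replicate(image, factor):
    """Zoom by an integer factor using pixel replication:
    each pixel becomes a factor x factor block of its own value."""
    return np.repeat(np.repeat(image, factor, axis=0), factor, axis=1)

small = np.array([[10, 20],
                  [30, 40]], dtype=np.uint8)
big = zoom_replicate(small, 3)

print(big.shape)      # (6, 6)
print(big[0:3, 0:3])  # a 3 x 3 block of 10s
```

Because each output block is constant, block boundaries stay sharp, which is exactly the checkerboard effect the slide warns about at larger magnifications.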
Shrinking an image
• For shrinking an image by one-half, we delete every other row and column.
• In order to shrink an image by a non-integer factor, we expand a grid to fit over the original image, do gray-level nearest neighbor or bilinear interpolation, and then shrink the grid back to its original specified size. (Assignment)
• It is good to blur an image slightly before shrinking it.
(Figures: a 128 X 128 image zoomed to 256 X 256 and 512 X 512 using nearest neighbor gray-level interpolation, and the same zooms using bilinear interpolation.)
Relationships between Pixels
Neighbors of a Pixel
• A pixel p at coordinates (x,y) has 4 horizontal and vertical neighbors whose coordinates are given by (x+1,y), (x-1,y), (x,y+1), (x,y-1).
• This set of pixels is called the 4-neighbors of p and is denoted by N4(p). Some of the neighbors of p lie outside the image if (x,y) is on the border of the image.
• The four diagonal neighbors of p have coordinates (x+1,y+1), (x+1,y-1), (x-1,y+1), (x-1,y-1) and are denoted by ND(p).
• These points, together with the 4-neighbors, are called the 8-neighbors of p, denoted by N8(p).
Adjacency and Connectivity
• To establish whether 2 pixels are connected, it must be determined whether they are neighbors and whether their gray levels satisfy a specified criterion of similarity.
• (eg) In a binary image, 2 pixels may be 4-neighbors, but they are said to be connected only if they have the same value.
• In a gray scale image, we consider any subset V of allowed gray level values for connectivity.
3 types of adjacency
• 4-adjacency: Two pixels p and q with values from V are 4-adjacent if q is in the set N4(p).
• 8-adjacency: Two pixels p and q with values from V are 8-adjacent if q is in the set N8(p).
• m-adjacency: Two pixels p and q with values from V are m-adjacent if
  – q is in N4(p), or
  – q is in ND(p) and the set N4(p) ∩ N4(q) has no pixels whose values are from V.
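These definitions translate almost directly into code. A sketch in Python (the example pixel set and the choice V = {1} are arbitrary assumptions; the function assumes p and q themselves already have values in V):

```python
def n4(p):
    """4-neighbors of pixel p = (x, y)."""
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(p):
    """Diagonal neighbors of p."""
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def m_adjacent(p, q, in_v):
    """m-adjacency: q in N4(p), or q in ND(p) with no shared
    4-neighbor whose value is in V (in_v is a membership test)."""
    if q in n4(p):
        return True
    return q in nd(p) and not any(in_v(r) for r in n4(p) & n4(q))

# Arbitrary example: pixels with value 1 form the set V.
ones = {(0, 0), (1, 1), (0, 1)}
in_v = lambda r: r in ones

print(m_adjacent((0, 0), (1, 0), in_v))  # 4-neighbors, so True
print(m_adjacent((0, 0), (1, 1), in_v))  # diagonal; shared neighbor (0,1) is in V, so False
```

The second call shows the point of m-adjacency: it suppresses the diagonal link when a 4-connected path already exists, avoiding the multiple-path ambiguities of plain 8-adjacency.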
(Figures: examples of adjacency between pixels.)