21EC722 - Digital Image Processing M1
Dip lecture notes
Uploaded by sak33244

21EC732 -

Digital Image
Processing
Dr. Rajashekhargouda C. Patil
Professor, Dept. of ECE,
Jain College of Engineering,
Belagavi

Courtesy: https://www.craftypuzzles.com/blog/picture-worth-1000-words/
Course Details

Course outcomes (Course Skill Set)


At the end of the course, the student will be able to:
1. Understand image formation and the role the human visual system plays in the perception of gray and colour image data.
2. Compute various transforms on digital images.
3. Conduct an independent study and analysis of Image Enhancement techniques.
4. Apply image processing techniques in the frequency (Fourier) domain.
5. Design image restoration techniques.
Syllabus
Module 1:
• Digital Image Fundamentals: What is Digital Image Processing?, Origins of Digital Image Processing,
Examples of fields that use DIP, Fundamental Steps in Digital Image Processing, Components of an Image
Processing System, Elements of Visual Perception, Image Sensing and Acquisition, Image Sampling and
Quantization, Some Basic Relationships Between Pixels.
• [Text 1: Chapter 1, Chapter 2: Sections 2.1 to 2.5]
Module 2:
• Image Transforms: Introduction, Two-Dimensional Orthogonal and Unitary Transforms, Properties of Unitary
Transforms, Two-Dimensional DFT, Cosine Transform, Haar Transform.
• [Text 2: Chapter 5: Sections 5.1 to 5.3, 5.5, 5.6, 5.9]
Module 3:
• Spatial Domain: Some Basic Intensity Transformation Functions, Histogram Processing, Fundamentals of
Spatial Filtering, Smoothing Spatial Filters, Sharpening Spatial Filters
• [Text 1: Chapter 3: Sections 3.2 to 3.6]
Syllabus (Contd..)
Module 4:
• Frequency Domain: Basics of Filtering in the Frequency Domain, Image Smoothing and Image Sharpening
Using Frequency Domain Filters.
• Color Image Processing: Color Fundamentals, Color Models, Pseudo-color Image Processing.
• [Text 1: Chapter 4: Sections 4.7 to 4.9 and Chapter 6: Sections 6.1 to 6.3]
Module 5:
• Restoration: A model of the Image Degradation/Restoration Process, Noise models, Restoration in the
Presence of Noise Only using Spatial Filtering and Frequency Domain Filtering, Inverse Filtering, Minimum
Mean Square Error (Wiener) Filtering.
• [Text 1: Chapter 5: Sections 5.1, to 5.4.3, 5.7, 5.8]
Suggested Learning Resources:
• Text Books:
1. Digital Image Processing- Rafael C Gonzalez and Richard E Woods, PHI, 3rd Edition 2010.
2. Fundamentals of Digital Image Processing- A K Jain, PHI Learning Private Limited 2014.
• Reference Book:
Digital Image Processing- S Jayaraman, S Esakkirajan, T Veerakumar, Tata McGraw Hill, 2014.
Syllabus (Contd..)
Web links and Video Lectures (e-Resources)
• Image databases, https://imageprocessingplace.com/root_files_V3/image_databases.htm
• Student support materials, https://imageprocessingplace.com/root_files_V3/students/students.htm
• NPTEL Course, Introduction to Digital Image Processing, https://nptel.ac.in/courses/117105079
• Computer Vision and Image Processing, https://nptel.ac.in/courses/108103174
• Image Processing and Computer Vision – Matlab and Simulink, https://in.mathworks.com/solutions/image-video-processing.html
Module 1
• Digital Image Fundamentals: What is Digital Image Processing?,
Origins of Digital Image Processing, Examples of fields that use DIP,
Fundamental Steps in Digital Image Processing, Components of an
Image Processing System, Elements of Visual Perception, Image
Sensing and Acquisition, Image Sampling and Quantization, Some Basic
Relationships Between Pixels.

• [Text 1: Chapter 1, Chapter 2: Sections 2.1 to 2.5]


What is Digital Image Processing?
• An image may be defined as a two-dimensional function, f(x, y), where x and y are
spatial (plane) coordinates, and the amplitude of f at any pair of coordinates (x, y) is
called the intensity or gray level of the image at that point. When x, y, and the intensity
values of f are all finite, discrete quantities, we call the image a digital image.

• The field of digital image processing refers to processing digital images by means of a
digital computer.

• Note that a digital image is composed of a finite number of elements, each of which has
a particular location and value. These elements are called picture elements, image
elements, pels, and pixels. Pixel is the term used most widely to denote the elements of a
digital image.
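The definition above can be made concrete with a small sketch: a hypothetical 4×4 digital image stored as a finite array of 8-bit gray levels (the array values and coordinates here are invented for illustration).

```python
import numpy as np

# A digital image is a finite, discrete 2-D function f(x, y):
# here a hypothetical 4x4 image with 8-bit gray levels (0-255).
f = np.array([
    [  0,  64, 128, 255],
    [ 32,  96, 160, 224],
    [ 16,  80, 144, 208],
    [  8,  72, 136, 200],
], dtype=np.uint8)

# Each element is a pixel: a location (x, y) plus an intensity value.
x, y = 1, 2
print(f"f({x}, {y}) = {f[x, y]}")  # the gray level at that point
```

Each entry of the array is one pixel; the finite extent and the discrete, bounded intensity values are exactly what make this a digital image.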
The Origins of Digital Image Processing
• One of the first applications of digital images was in the newspaper industry, when pictures were first sent by submarine cable between London and New York. Introduction of the Bartlane cable picture transmission system in the early 1920s reduced the time required to transport a picture across the Atlantic from more than a week to less than three hours. (FIGURE 1.1)

• Specialized printing equipment coded pictures for cable transmission and then reconstructed them at the receiving end.
The Origins of Digital Image Processing Contd..
• The printing method used to obtain Fig. 1.1 was abandoned toward the end of 1921 in favor of a technique based on photographic reproduction made from tapes perforated at the telegraph receiving terminal. Figure 1.2 shows an image obtained using this method. The improvements over Fig. 1.1 are evident, both in tonal quality and in resolution.

• The early Bartlane systems were capable of coding images in five distinct levels of gray. This capability was increased to 15 levels in 1929. Figure 1.3 is typical of the type of images that could be obtained using the 15-tone equipment. During this period, introduction of a system for developing a film plate via light beams that were modulated by the coded picture tape improved the reproduction process considerably.
The Origins of Digital Image Processing Contd..
• Although the examples just cited involve digital images, they are not considered digital image
processing results in the context of our definition because computers were not involved in their
creation.

• Thus, the history of digital image processing is intimately tied to the development of the digital computer.

• The idea of a computer goes back to the invention of the abacus in Asia Minor, more than 5000 years ago.
More recently, there were developments in the past two centuries that are the foundation of what we call a
computer today.

• However, the basis for what we call a modern digital computer dates back to only the 1940s with the
introduction by John von Neumann of two key concepts: (1) a memory to hold a stored program and data,
and (2) conditional branching.
The Origins of Digital Image Processing Contd..
These two concepts were followed by a series of key advances that led to computers powerful enough to be used for digital image processing:

1. The invention of the transistor at Bell Laboratories in 1948;

2. The development in the 1950s and 1960s of the high-level programming languages COBOL (Common Business-Oriented Language) and FORTRAN (Formula Translator);

3. The invention of the integrated circuit (IC) at Texas Instruments in 1958;

4. The development of operating systems in the early 1960s;

5. The development of the microprocessor (a single chip consisting of the central processing unit, memory, and input and output controls) by Intel in the early 1970s;

6. Introduction by IBM of the personal computer in 1981; and

7. Progressive miniaturization of components, starting with large scale integration (LSI) in the late 1970s, then very large scale integration (VLSI) in the 1980s, to the present use of ultra large scale integration (ULSI).
The Origins of Digital Image Processing Contd..
• The first computers powerful enough to
carry out meaningful image processing
tasks appeared in the early 1960s. The birth
of what we call digital image processing
today can be traced to the availability of
those machines and to the onset of the
space program during that period.

• Figure 1.4 shows the first image of the moon taken by Ranger 7 on July 31, 1964 at 9:09 A.M. (FIGURE 1.4)
The Origins of Digital Image Processing Contd..
• In parallel with space applications, digital image processing techniques began in the late 1960s and early 1970s to be used in medical imaging, remote Earth resources observations, and astronomy.

• The invention in the early 1970s of computerized axial tomography (CAT), also called computerized tomography (CT) for short, is one of the most important events in the application of image processing in medical diagnosis. Computerized axial tomography is a process in which a ring of detectors encircles an object (or patient) and an X-ray source, concentric with the detector ring, rotates about the object.
The Origins of Digital Image Processing Contd..
From the 1960s until the present, the field of image processing has grown vigorously. In addition to applications in
medicine and the space program, digital image processing techniques now are used in a broad range of applications.
Computer procedures are used to enhance the contrast or code the intensity levels into color for easier interpretation of
X-rays and other images used in industry, medicine, and the biological sciences.

Geographers use the same or similar techniques to study pollution patterns from aerial and satellite imagery. Image
enhancement and restoration procedures are used to process degraded images of unrecoverable objects or experimental
results too expensive to duplicate.

In archaeology, image processing methods have successfully restored blurred pictures that were the only available
records of rare artifacts lost or damaged after being photographed.

In physics and related fields, computer techniques routinely enhance images of experiments in areas such as high-
energy plasmas and electron microscopy.

Similarly successful applications of image processing concepts can be found in astronomy, biology, nuclear medicine,
law enforcement, defense, and industry.
Examples of Fields that Use Digital Image Proc.

• Gamma-Ray Imaging; X-Ray Imaging


Examples of Fields that Use Digital Image Proc.
Imaging in the Ultraviolet Band; Imaging in the Visible and Infrared Bands

FIGURE 1.8 Examples of ultraviolet imaging: (a) Normal corn; (b) Smut corn.
FIGURE 1.9 Examples of light microscopy images: (a) Cholesterol; (b) Surface of audio CD.
Examples of Fields that Use Digital Image Proc.
Imaging in the Visible and Infrared Bands
Examples of Fields that Use Digital Image Proc.
Examples of Fields that Use Digital Image Proc.
Imaging in the Microwave Band

Mountains in this area reach about 5800 m (19,000 ft) above sea level, while the valley floors lie about 4300 m (14,000 ft) above sea level. Note the clarity and detail of the image, unencumbered by clouds or other atmospheric conditions that normally interfere with images in the visual band.
Examples of Fields that Use Digital Image Proc.
Imaging in the Radio Band (MRI in Medical Domain)
Fundamental Steps in Digital Image Proc..
Fundamental Steps in Digital Image Proc..
• Image acquisition is the first process; it concerns the origin of digital images.
• Image enhancement is the process of manipulating an image so that the
result is more suitable than the original for a specific application.
• Image restoration is an area that also deals with improving the appearance
of an image.
• Color image processing is an area that has been gaining in importance
because of the significant increase in the use of digital images over the
Internet.
• Wavelets are the foundation for representing images in various degrees of
resolution.
Fundamental Steps in Digital Image Proc..
• Compression, as the name implies, deals with techniques for reducing the
storage required to save an image, or the bandwidth required to transmit it.
• Morphological processing deals with tools for extracting image
components that are useful in the representation and description of shape.
• Segmentation procedures partition an image into its constituent parts or
objects.
• Representation and description almost always follow the output of a
segmentation stage, which usually is raw pixel data, constituting either the
boundary of a region (i.e., the set of pixels separating one image region
from another) or all the points in the region itself.
• Recognition is the process that assigns a label (e.g., “vehicle”) to an object
based on its descriptors.
Components of an Image Processing System
Elements of Visual Perception
• The central opening of the iris (the pupil) varies in diameter from approximately 2 to 8 mm.

• The lens is made up of concentric layers of fibrous cells and is suspended by fibers that attach to the ciliary body. It contains 60 to 70% water, about 6% fat, and more protein than any other tissue in the eye. The lens absorbs approximately 8% of the visible light spectrum, with relatively higher absorption at shorter wavelengths.

• The innermost membrane of the eye is the retina, which lines the inside of the wall’s entire posterior portion. When the eye is properly focused, light from an object outside the eye is imaged on the retina.
Elements of Visual Perception
Rods and Cones
There are two classes of receptors: cones and rods.
The cones in each eye number between 6 and 7 million. They are located primarily in the central portion of the retina, called the fovea, and are highly sensitive to color. Humans can resolve fine details with these cones largely because each one is connected to its own nerve end. Muscles controlling the eye rotate the eyeball until the image of an object of interest falls on the fovea. Cone vision is called photopic or bright-light vision.

The number of rods is much larger: some 75 to 150 million are distributed over the retinal surface. The larger area of distribution and the fact that several rods are connected to a single nerve end reduce the amount of detail discernible by these receptors. Rods serve to give a general, overall picture of the field of view. They are not involved in color vision and are sensitive to low levels of illumination. For example, objects that appear brightly colored in daylight when seen by moonlight appear as colorless forms because only the rods are stimulated. This phenomenon is known as scotopic or dim-light vision.
Elements of Visual Perception
• Distribution of rods and cones in the retina.
Image Formation in the Eye
Brightness Adaptation and Discrimination
Module 1

FIGURE 2.6
Typical Weber ratio as a function of intensity.
Module 1

FIGURE 2.8
Examples of simultaneous contrast. All the inner squares
have the same intensity, but they appear progressively
darker as the background becomes lighter.

FIGURE 2.7 →
Illustration of the Mach band effect. Perceived intensity is
not a simple function of actual intensity.
Image Sensing and Acquisition

FIGURE 2.12
(a) Single imaging sensor.
(b) Line sensor.
(c) Array sensor.

FIGURE 2.13
Combining a single sensor with motion to generate a 2-D image.
Image Acquisition Using Sensor Strips

FIGURE 2.14
(a) Image acquisition using a linear sensor strip.
(b) Image acquisition using a circular sensor strip.
Image Acquisition Using Sensor Arrays

FIGURE 2.15 An example of the digital image acquisition process.
(a) Energy (“illumination”) source.
(b) An element of a scene.
(c) Imaging system.
(d) Projection of the scene onto the image
plane.
(e) Digitized image.
Image Sampling and Quantization
Basic Concepts in Sampling
and Quantization

FIGURE 2.16
Generating a digital image.
(a) Continuous image.
(b) A scan line from A to B in the
continuous image, used to
illustrate the concepts of
sampling and quantization.
(c) Sampling and quantization.
(d) Digital scan line.
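The sampling-and-quantization idea illustrated by the scan line in Fig. 2.16 can be sketched in code. This is a minimal sketch, assuming a hypothetical set of sampled amplitudes in [0, 1) and L = 4 quantization levels; the values are invented for illustration.

```python
import numpy as np

# Hypothetical samples taken along a scan line from A to B,
# with amplitudes normalized to [0, 1)  -- this is the sampling step.
samples = np.array([0.05, 0.20, 0.45, 0.60, 0.80, 0.95, 0.70, 0.30])

# Quantization step: map each sample to one of L discrete levels 0..L-1.
L = 4
quantized = np.floor(samples * L).clip(0, L - 1).astype(int)
print(quantized.tolist())  # the digital scan line
```

Sampling digitizes the coordinate values (where along the line we measure); quantization digitizes the amplitude values (which of the L levels each measurement becomes).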
Module 1
FIGURE 2.17
(a) Continuous
image projected
onto a sensor
array.
(b) Result of
image sampling
and
quantization.
Representing Digital Images
Module 1
This digitization process requires that decisions be made regarding the values for M, N, and for the number, L, of discrete intensity levels. There are no restrictions placed on M and N, other than that they have to be positive integers. The number of intensity levels typically is an integer power of 2:

L = 2^k

The number, b, of bits required to store a digitized image is

b = M × N × k
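As a quick check of the two formulas above, consider a hypothetical 1024 × 1024 image with k = 8 bits per pixel:

```python
# Number of discrete intensity levels for k bits per pixel: L = 2^k
k = 8
L = 2 ** k           # 256 gray levels

# Storage for an M x N digitized image: b = M * N * k bits
M, N = 1024, 1024
b = M * N * k        # bits required
print(L, b, b // 8)  # 256 levels, 8388608 bits, 1048576 bytes (1 MB)
```

Doubling the spatial resolution in each direction quadruples M × N, and hence quadruples the storage, which is why the choices of M, N, and k trade quality against storage and bandwidth.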
Module 1

FIGURE 2.20 Typical effects of reducing spatial resolution. Images shown at:
(a) 1250 dpi, (b) 300 dpi, (c) 150 dpi, and (d) 72 dpi.
Some Basic Relationships between Pixels
Neighbors of a Pixel

A pixel p at coordinates (x, y) has four horizontal and vertical neighbors, denoted N4(p): (x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)

The four diagonal neighbors of p, denoted ND(p): (x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)

Together, N4(p) and ND(p) make up the 8-neighbors of p, denoted N8(p).

Adjacency, Connectivity, Regions, and Boundaries

Let V be the set of intensity values used to define adjacency. In a binary image, V = {1} if we are referring to adjacency of pixels with value 1. In a gray-scale image, the idea is the same, but set V typically contains more elements. For example, in the adjacency of pixels with a range of possible intensity values 0 to 255, set V could be any subset of these 256 values. We consider three types of adjacency:
(a) 4-adjacency. Two pixels p and q with values from V are 4-adjacent if q is in the set N4(p).
(b) 8-adjacency. Two pixels p and q with values from V are 8-adjacent if q is in the set N8(p).
(c) m-adjacency (mixed adjacency). Two pixels p and q with values from V are m-adjacent if
(i) q is in N4(p), or
(ii) q is in ND(p) and the set N4(p) ∩ N4(q) has no pixels whose values are from V.
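The neighborhood and adjacency definitions above can be sketched as small helper functions. This is a minimal sketch with hypothetical names (N4, ND, N8, adjacent4), using a tiny binary image stored as a dict and V = {1}; boundary handling is ignored for brevity.

```python
def N4(p):
    """4-neighbors of pixel p = (x, y): left, right, up, down."""
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def ND(p):
    """Diagonal neighbors of pixel p = (x, y)."""
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def N8(p):
    """8-neighbors of p: the union of N4(p) and ND(p)."""
    return N4(p) | ND(p)

def adjacent4(p, q, img, V=frozenset({1})):
    """4-adjacency: q is in N4(p) and both pixel values are from V."""
    return q in N4(p) and img[p] in V and img[q] in V

# Tiny binary image as a dict mapping (x, y) -> intensity value
img = {(0, 0): 1, (0, 1): 1, (1, 1): 0}
print(adjacent4((0, 0), (0, 1), img))  # True: neighbors, both values in V
print(adjacent4((0, 1), (1, 1), img))  # False: (1, 1) has value 0, not in V
```

An m-adjacency check would add the second clause from (c) above: accept diagonal neighbors only when N4(p) ∩ N4(q) contains no pixels with values from V, which eliminates the ambiguous multiple paths that plain 8-adjacency allows.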
Module 1
DIP: Lena Söderberg and Cameraman
Courtesy: https://www.sbs.com.au/news/article/the-campaign-to-erase-the-image-that-accidentally-helped-entrench-sexism-in-tech/0zboc3bbp

Courtesy: https://pursuit.unimelb.edu.au/articles/it-s-time-to-retire-lena-from-computer-science
