
Digital Image Processing

Group 1: Presentation

Chapter 1: Introduction
Chapter 2: Digital Image Fundamentals

Contents
01  What is a digital image?
02  What is digital image processing?
03  State-of-the-art examples of DIP
04  Key stages in digital image processing
05  Components of a DIP system
06  Elements of visual perception
07  Image sensing and acquisition
08  Image sampling and quantization
09  Relationships between pixels
10  Electromagnetic spectrum
What is a Digital Image?
A digital image is a representation of a two-dimensional
image as a finite set of digital values, called picture elements
or pixels
What is a Digital Image? (cont…)
Pixel values typically represent gray levels, colours, heights, opacities, etc.
Remember that digitization implies a digital image is only an approximation of a real scene.
What is a Digital Image? (cont…)
Common image formats include:
 1 sample per point (B&W or Grayscale)
 3 samples per point (Red, Green, and Blue)
 4 samples per point (Red, Green, Blue, and “Alpha”, a.k.a. Opacity)

For most of this course we will focus on grey-scale images

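These formats can be made concrete with a small sketch in plain Python. The pixel values are arbitrary illustrative choices, and nested lists stand in for whatever image container a real library would use:

```python
# A tiny 2-row x 3-column image in three common formats (row-major: image[row][col]).

# 1 sample per point: grayscale, each pixel a single intensity in 0-255.
gray = [[0, 128, 255],
        [64, 192, 32]]

# 3 samples per point: RGB, each pixel a (red, green, blue) triple.
rgb = [[(255, 0, 0), (0, 255, 0), (0, 0, 255)],
       [(0, 0, 0), (128, 128, 128), (255, 255, 255)]]

# 4 samples per point: RGBA, with alpha (opacity) as the fourth sample.
rgba = [[(255, 0, 0, 255), (0, 255, 0, 128), (0, 0, 255, 0)],
        [(0, 0, 0, 255), (128, 128, 128, 255), (255, 255, 255, 255)]]

print(len(gray), len(gray[0]))  # 2 rows, 3 columns
```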

What is Digital Image Processing?
 Digital image processing is the manipulation of digital image data, using computer hardware and software, to produce outputs in which specific information has been extracted and highlighted.
 Digital image processing focuses on two major tasks:
 Improvement of pictorial information for human interpretation
 Processing of image data for storage, transmission, and representation for autonomous machine perception
What is DIP? (cont…)
The continuum from image processing to computer vision can be broken up into low-, mid-, and high-level processes:

 Low-level process (input: image; output: image). Examples: noise removal, image sharpening.
 Mid-level process (input: image; output: attributes). Examples: object recognition, segmentation.
 High-level process (input: attributes; output: understanding). Examples: scene understanding, autonomous navigation.

In this course we will stop here.

How DIP works
State-of-the-Art Examples of DIP
 Biometrics
 Color processing
 Face recognition
 Target recognition (Department of Defense: Army, Air Force, Navy)
 Video processing
 Remote sensing
 Microscopic imaging
 Interpretation of aerial photography (a problem domain in both computer vision and image registration)
 Traffic monitoring
 Autonomous vehicles (land, underwater, space)
 Indexing into databases (e.g. by shape content)
 Medical applications: skin cancer detection, breast cancer detection, gamma-ray imaging, PET scans, X-ray imaging, medical CT, UV imaging
Key Stages in Digital Image Processing
 Problem Domain
 Image Acquisition
 Image Enhancement
 Image Restoration
 Morphological Processing
 Segmentation
 Representation & Description
 Object Recognition
 Image Compression
 Colour Image Processing
Key Stages in Digital Image Processing: Stage by Stage
 Image Acquisition: capturing the image from a sensor, often with simple pre-processing such as scaling.
 Image Enhancement: processing an image so that the result is more suitable than the original for a particular application; an inherently subjective process.
 Image Restoration: improving the appearance of an image using objective, mathematical models of image degradation.
 Morphological Processing: extracting image components that are useful in representing and describing shape.
 Segmentation: partitioning an image into its constituent parts or objects.
 Representation & Description: converting the output of segmentation into a form suitable for computer processing, and extracting features of interest.
 Object Recognition: assigning a label to an object based on its descriptors.
 Image Compression: reducing the storage required to save an image, or the bandwidth required to transmit it.
 Colour Image Processing: using the colour of an image to extract features of interest.
Components of a DIP System
Image Sensors
Specialized Image Processing Hardware
Specialized Image Processing Software
Computer
Mass Storage
Image Display
Hard Copy Device
Network
Image Sensors
 Two elements are required to capture digital images:
 A physical device (sensor) that is sensitive to the energy radiated by the object and converts it into an electrical signal.
 A digitizer that converts the output of the physical sensing device into digital form.
 CCD (Charge-Coupled Device) and CMOS (Complementary Metal-Oxide-Semiconductor) sensors are widely used.
Specialized Image Processing Hardware
 Consists of the digitizer plus hardware that performs primitive operations, such as an Arithmetic Logic Unit (ALU).
 The ALU performs arithmetic and logical operations in parallel on entire images.
Specialized Image Processing Software
 Consists of specialized modules that perform specific tasks.
 A well-designed package also includes the capability for the user to write code.
 Examples: Photoshop, GIMP (GNU Image Manipulation Program), Fireworks, Pixelmator, and Inkscape.
Computer
 The computer in an image processing system is a general-purpose computer, ranging from a PC to a supercomputer.
 Specially designed computers are used to achieve a required level of performance.
Mass Storage
 Refers to storing a large amount of data in a persistent, machine-readable fashion.
 Mass storage capability is a must in image processing applications.
 Three principal categories of mass storage:
 Short-term storage for use during processing.
 Online storage for relatively fast recall.
 Archival storage, characterized by infrequent access.
Image Display
 The final link in the digital image processing chain.
 Image displays are mainly colour TV monitors.
Hard Copy Device
 Various devices for recording images: laser printers, film cameras, heat-sensitive devices, and digital units such as optical and CD-ROM disks.
Network
 A required component for transmitting image information over a networked computer system.
 The key parameter is bandwidth.
Elements of Visual Perception
Structure of Eye
Image Formation in the Eye
Brightness Adaptation and Discrimination
Sampling, Quantisation and Resolution
In the following slides we consider what is involved in capturing a digital image of a real-world scene:
 Image sensing and representation
 Sampling and quantisation
 Resolution
Image Representation
Image Acquisition
Images are typically generated by illuminating a scene and absorbing the energy reflected by the objects in that scene.
Typical notions of illumination and scene can be way off:
• X-rays of a skeleton
• Ultrasound of an unborn baby
• Electron-microscope images of molecules
Image Sensing
Incoming energy lands on a sensor material responsive to that type of energy, and this generates a voltage.
Collections of sensors are arranged to capture images:
 A single imaging sensor
 A line of image sensors
 An array of image sensors
Image Sampling And Quantisation
A digital sensor can only measure a limited number of samples at a discrete set of energy levels.
 Sampling is the process of digitizing the spatial coordinates of the image.
 Quantisation is the process of converting a continuous analogue signal into a digital representation of that signal, i.e. digitizing the amplitude values.
Remember that a digital image is always only an approximation of a real-world scene.
Spatial Resolution
The spatial resolution of an image is determined by how sampling was carried out.
Spatial resolution simply refers to the smallest discernible detail in an image:
 Vision specialists will often talk about pixel size
 Graphic designers will talk about dots per inch (DPI)
Intensity Level Resolution
Intensity level resolution refers to the number of intensity levels used to represent the image.
 The more intensity levels used, the finer the level of detail discernible in an image.
 Intensity level resolution is usually given in terms of the number of bits used to store each intensity level.

Number of Bits | Number of Intensity Levels | Examples
1              | 2                          | 0, 1
2              | 4                          | 00, 01, 10, 11
4              | 16                         | 0000, 0101, 1111
8              | 256                        | 00110011, 01010101
16             | 65,536                     | 1010101010101010
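The relationship in the table can be checked in code. A minimal sketch, where `reduce_depth` is a hypothetical helper name used only for illustration:

```python
def num_levels(bits):
    """Number of distinct intensity levels representable with `bits` bits."""
    return 2 ** bits

def reduce_depth(pixel, bits):
    """Requantise an 8-bit intensity (0-255) down to `bits` bits by
    discarding the least significant bits."""
    shift = 8 - bits
    return (pixel >> shift) << shift

print(num_levels(8))         # 256, matching the table row for 8 bits
print(reduce_depth(200, 4))  # 192: only 16 distinct values remain
```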
Saturation & Noise
 Saturation is the purity of a color.
 Saturation is a very important aspect in photography, perhaps as important as contrast.
 In addition to our eyes being naturally attracted to vibrant tones, colors have their own unique way of telling a story, which plays a crucial part in making a photograph.
Saturation & Noise
 Image noise is random variation of brightness or color information in images.
 It is usually an aspect of electronic noise.
 It can be produced by the image sensor and circuitry of a scanner or digital camera.
Basic Relationships Between Pixels
❑ Neighborhood
❑ Adjacency
❑ Connectivity
❑ Paths
❑ Regions and boundaries
Neighbors of a Pixel
⮚ Any pixel p(x, y) has two vertical and two horizontal neighbors, given by (x+1, y), (x-1, y), (x, y+1), (x, y-1).
⮚ This set of pixels is called the 4-neighbors of p, and is denoted by N4(p).
⮚ Each of them is at a unit distance from p.
Neighbors of a Pixel (Contd..)
⮚ The four diagonal neighbors of p(x, y) are given by (x+1, y+1), (x+1, y-1), (x-1, y+1), (x-1, y-1).
⮚ This set is denoted by ND(p).
⮚ Each of them is at a Euclidean distance of √2 ≈ 1.414 from p.
⮚ The points in ND(p) and N4(p) are together known as the 8-neighbors of the point p, denoted by N8(p).
⮚ Some of the points in N4, ND, and N8 may fall outside the image when p lies on the border of the image.
Neighbors of a Pixel (Contd..)
a. The 4-neighbors of a pixel p are its vertical and horizontal neighbors, denoted by N4(p).
b. The 8-neighbors of a pixel p are its vertical, horizontal, and four diagonal neighbors, denoted by N8(p).
Neighbors of a Pixel (Contd..)
ND N4 ND
N4 P  N4
ND N4 ND

❑ N4 - 4-neighbors
❑ ND - diagonal neighbors
❑ N8 - 8-neighbors (N4 ∪ ND)
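The three neighbourhoods can be written directly from their definitions. A minimal sketch; note that, as stated above, coordinates are not clipped, so some returned neighbors may fall outside the image when p is on the border:

```python
def n4(p):
    """4-neighbors: the two horizontal and two vertical neighbors of p."""
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(p):
    """Diagonal neighbors of p."""
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def n8(p):
    """8-neighbors: the union of N4(p) and ND(p)."""
    return n4(p) | nd(p)

print(sorted(n8((2, 2))))
```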
Adjacency
❑ Two pixels are connected if they are neighbors and their gray levels satisfy some specified criterion of similarity.
❑ For example, in a binary image two pixels are connected if they are 4-neighbors and have the same value (0 or 1).
Adjacency (contd.)
⮚ Let V be the set of gray-level values used to define adjacency.
⮚ 4-adjacency: two pixels p and q with values from V are 4-adjacent if q is in the set N4(p).
⮚ 8-adjacency: two pixels p and q with values from V are 8-adjacent if q is in the set N8(p).
⮚ m-adjacency: two pixels p and q with values from V are m-adjacent if
  – q is in N4(p), or
  – q is in ND(p) and the set N4(p) ∩ N4(q) is empty (has no pixels whose values are from V).
Connectivity
❑ Connectivity is used to determine whether pixels are adjacent in some sense.
❑ Let V be the set of gray-level values used to define connectivity; then two pixels p and q that have values from the set V are:
a. 4-connected, if q is in the set N4(p)
b. 8-connected, if q is in the set N8(p)
c. m-connected, iff
  i. q is in N4(p), or
  ii. q is in ND(p) and the set N4(p) ∩ N4(q) is empty.
Example array (V = {1, 2}):
0 1 1
0 2 0
0 0 1
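The m-adjacency rule can be checked programmatically against the example array from the slide. A sketch under the assumption that pixels are addressed as (row, col); the helper names are illustrative:

```python
def n4(p):
    x, y = p
    return {(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)}

def nd(p):
    x, y = p
    return {(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)}

def value(img, p):
    """Pixel value at p, or None if p falls outside the image."""
    x, y = p
    if 0 <= x < len(img) and 0 <= y < len(img[0]):
        return img[x][y]
    return None

def m_adjacent(img, p, q, V):
    """p and q (both with values in V) are m-adjacent if q is in N4(p),
    or q is in ND(p) and N4(p) & N4(q) has no pixel with value in V."""
    if value(img, p) not in V or value(img, q) not in V:
        return False
    if q in n4(p):
        return True
    if q in nd(p):
        return all(value(img, r) not in V for r in n4(p) & n4(q))
    return False

# The 3x3 example array from the slide, with V = {1, 2}:
img = [[0, 1, 1],
       [0, 2, 0],
       [0, 0, 1]]

print(m_adjacent(img, (0, 1), (1, 1), {1, 2}))  # True: q is in N4(p)
print(m_adjacent(img, (0, 2), (1, 1), {1, 2}))  # False: diagonal link blocked
```

The second call shows the point of m-adjacency: the diagonal connection between (0, 2) and (1, 1) is rejected because the shared 4-neighbor (0, 1) already has a value in V, which eliminates the ambiguous multiple path that plain 8-adjacency would allow.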
Adjacency/Connectivity
❑ Pixel p is adjacent to pixel q if they are connected.
❑ Two image subsets S1 and S2 are adjacent if some pixel in S1 is adjacent to some pixel in S2.
Paths & Path Lengths
❖ A path from pixel p with coordinates (x, y) to pixel q with coordinates (s, t) is a sequence of distinct pixels with coordinates (x0, y0), (x1, y1), (x2, y2), …, (xn, yn), where (x0, y0) = (x, y), (xn, yn) = (s, t), and (xi, yi) is adjacent to (xi-1, yi-1) for 1 ≤ i ≤ n.
⮚ Here n is the length of the path.
⮚ We can define 4-, 8-, and m-paths based on the type of adjacency used.
Connected Components
❑ If p and q are pixels of an image subset S, then p is connected to q in S if there is a path from p to q consisting entirely of pixels in S.
❑ For every pixel p in S, the set of pixels in S that are connected to p is called a connected component of S.
❑ If S has only one connected component, then S is called a connected set.
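The definition above translates directly into a breadth-first search. A minimal sketch, assuming 4-connectivity and a small hypothetical binary array:

```python
from collections import deque

def connected_component(img, seed, V):
    """Return the 4-connected component of `seed`: all pixels reachable
    from seed via paths of 4-adjacent pixels whose values are in V."""
    rows, cols = len(img), len(img[0])
    comp, frontier = {seed}, deque([seed])
    while frontier:
        x, y = frontier.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < rows and 0 <= ny < cols
                    and (nx, ny) not in comp and img[nx][ny] in V):
                comp.add((nx, ny))
                frontier.append((nx, ny))
    return comp

img = [[1, 1, 0],
       [0, 1, 0],
       [0, 0, 1]]

# Two separate components: the top-left L-shape and the lone pixel (2, 2).
print(sorted(connected_component(img, (0, 0), {1})))
```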
Regions and Boundaries
❑ A subset R of pixels in an image is called a region of the image if R is a connected set.
❑ The boundary of the region R is the set of pixels in the region that have one or more neighbors that are not in R.
❑ If R happens to be the entire image, its boundary is conventionally taken to be the first and last rows and columns of the image.
Distance Measures
Given pixels p, q, and z with coordinates (x, y), (s, t), and (u, v) respectively, a distance function D has the following properties:
a. D(p, q) ≥ 0, with D(p, q) = 0 iff p = q
b. D(p, q) = D(q, p)
c. D(p, z) ≤ D(p, q) + D(q, z)
❑ The following are common distance measures:
⮚ Euclidean distance: De(p, q) = [(x - s)² + (y - t)²]^(1/2)
⮚ City-block distance: D4(p, q) = |x - s| + |y - t|
⮚ Chessboard distance: D8(p, q) = max(|x - s|, |y - t|)
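The three measures are a direct transcription of the formulas:

```python
import math

def d_euclidean(p, q):
    """Euclidean distance: straight-line distance between p and q."""
    (x, y), (s, t) = p, q
    return math.sqrt((x - s) ** 2 + (y - t) ** 2)

def d4(p, q):
    """City-block distance: number of horizontal/vertical unit steps."""
    (x, y), (s, t) = p, q
    return abs(x - s) + abs(y - t)

def d8(p, q):
    """Chessboard distance: number of king moves on a chessboard."""
    (x, y), (s, t) = p, q
    return max(abs(x - s), abs(y - t))

p, q = (0, 0), (3, 4)
print(d_euclidean(p, q), d4(p, q), d8(p, q))  # 5.0 7 4
```

Note that D4(p, q) = 1 exactly for the pixels in N4(p), and D8(p, q) = 1 exactly for the pixels in N8(p), which ties these measures back to the neighborhoods defined earlier.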
Neighborhood-Based Arithmetic/Logic Operations
The value assigned to a pixel at position e is a function of its neighbors and a set of window functions.
Tasks done using neighborhood processing:
 Smoothing / averaging
 Noise removal / filtering
 Edge detection
 Contrast enhancement
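Smoothing by averaging is the simplest of these neighborhood operations. A minimal sketch of a 3x3 mean filter; leaving border pixels unchanged is just one of several common border conventions:

```python
def mean_filter(img):
    """3x3 averaging: each interior pixel becomes the (integer) mean of
    its 3x3 neighbourhood; border pixels are left unchanged."""
    rows, cols = len(img), len(img[0])
    out = [row[:] for row in img]
    for x in range(1, rows - 1):
        for y in range(1, cols - 1):
            total = sum(img[x + dx][y + dy]
                        for dx in (-1, 0, 1) for dy in (-1, 0, 1))
            out[x][y] = total // 9
    return out

# A single dark "noise" pixel in a bright patch is pulled toward its neighbors.
img = [[9, 9, 9],
       [9, 0, 9],
       [9, 9, 9]]
print(mean_filter(img))
```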
Light And The Electromagnetic Spectrum
❑ Light is just a particular part of the electromagnetic spectrum that
can be sensed by the human eye.
❑ The electromagnetic spectrum is split up according to the
wavelengths of different forms of energy.
Types of Electromagnetic Radiation:
Supervised By
Sohely Jahan
Lecturer, Department of Computer Science and Engineering
University of Barishal

Presented By
Group 1
Md. Saimun Islam (16CSE010)
Md. Showkat Imam (16CSE029)
Md. Kamruzzaman (14CSE037)
Md. Jane Alam (16CSE005)
Md. Foysal Sheikh (16CSE035)
