Dr. Justin Varghese Slide 1
19ECS352: IMAGE PROCESSING
Digital Image Processing
Justin Varghese, Ph.D. (Eng.), SM IEEE
Professor,
Department of Computer Science & Engineering,
Gitam (Deemed to be University)
GST, Bangalore Campus
Mobile: +91-9940955156
Email: jvarghes@[Link], justin_var@[Link]
Department of Computer Science & Engineering
CSEN2131 COMPUTER GRAPHICS
• Assessment Scheme:
No. | Assessment Method         | Weight (% of total marks) | Tentative date of conducting / announcement / submission
 1  | Quiz 1                    |  4% | 22nd July, 2024
 2  | Assignment 1              |  5% | Announcement: 1st Aug, 2024; Submission: 8th Aug, 2024
 3  | Quiz 2                    |  4% | 12th Aug, 2024
 4  | Quiz 3                    |  4% | 2nd Sep, 2024
 5  | Case Study / Mini Project | 10% | Announcement: 4th Sep, 2024; Submission: 27th Sep, 2024
 6  | Quiz 4                    |  4% | 30th Sep, 2024
 7  | Assignment 2              |  5% | Announcement: 14th Oct, 2024; Submission: 21st Oct, 2024
 8  | Quiz 5                    |  4% | 25th Oct, 2024
 9  | Mid Term                  | 30% | 9th Sep, 2024
10  | End Semester              | 30% | 8th Nov, 2024
Mapping of course outcomes with learning activities and assessments
COs  | Learning   | Q1 | Q2 | A1 | Q3 | Mid | Q4 | A2 | Q5 | CS/MP | End Sem *% | End Term *%
     | activities |    |    |    |    |     |    |    |    |       |            |
CO 1 | L/Q/P/T    |  4 |    |    |    |  10 |    |    |    |       |     20     |      6
CO 2 | L/Q/P/T    |    |  4 |    |    |  10 |    |    |    |       |     20     |      6
CO 3 | L/Q/P/T    |    |    |    |  4 |  10 |    |    |    |       |     20     |      6
CO 4 | L/Q/P/T    |    |    |    |    |     |  4 |  5 |    |   5   |     20     |      6
CO 5 | L/Q/P/T    |    |    |  5 |    |     |    |    |  4 |   5   |     20     |      6
L=Lecture, T=Thought Process, P=Presentation, Q=Quiz, A=Assignment, Mid=Test,
CS=Case Study, MP=Mini Project
An Introduction to the Domain.
The IT Revolution
The start…
"What Hath God Wrought"
– the first message transmitted over Samuel Morse's telegraph.
Then came radio, telephony, television,
mobile phones, the internet, and multimedia.
These media provide information relevant to people of all
fields in the form of images.
But IMAGES speak more than words…
A satellite image, a geographical map, an MRI scan, fingerprints,
etc. are solid data for the corresponding experts
to explore, illustrate, diagnose, identify….
When problems arise in this vital piece of information,
what is the solution?
The facilities of the digital computer were exploited to process images
accordingly.
Hence: Digital Image Processing.
To be processed in digital computers, image digitization became important
for
[Link] results
[Link] storage
[Link] transmission
Image Processing-applications
Vision-based applications include, but are not limited to, the following:
1. Robotics
In designing robots, computer vision helps in
➢ Avoiding obstacles.
➢ Finding the path for movement.
➢ Classifying people and objects.
➢ Holding and working with objects.
Robot handling objects [6]
2. Medical diagnosis and treatment
In medicine, computer vision systems are designed for
➢ Acquisition and Display of images of different body
parts/organs for diagnosis and surgery.
➢ Identifying presence of tumors of different
kinds.
➢ Diagnosis by comparison with similar cases.
Tumor location [7]
3. Automated inspection of products in industries
Machine vision systems, as these are also called, are used for
➢ Identifying or labeling defects in products.
➢ Sorting products of different categories.
Product inspection [8]
4. Surveillance
Images from surveillance cameras around us, on us, and in our vehicles
are used for
➢ Identifying/recognizing the surveyed objects
(vehicles/people/products).
➢ Counting the number of objects.
➢ Identifying obstacles in the way of vehicles.
Crowd tracking [9]
5. Biometric recognitions
Person identification [9]
➢ Identifying features of biometrics like face, iris, finger/palm
prints.
➢ Matching with similar cases and recognizing the connected
person.
Basic image file formats
An image format describes how the data of an image is stored. Data can
be stored in compressed, uncompressed, or vector form, and each format
has its own advantages and disadvantages. Formats such as TIFF are good
for printing, while JPG and PNG are best for the web.
•TIFF (.tif, .tiff) – Tagged Image File Format. This format stores image
data without losing any of it: no compression is performed, so a
high-quality image is obtained but the file size is large. It is good
for printing, especially professional printing.
•JPEG (.jpg, .jpeg) – Joint Photographic Experts Group. A lossy format
in which some data is discarded to reduce the size of the image; the
loss is usually small. It is a very common format and is good for
digital cameras, non-professional prints, e-mail, PowerPoint, etc.,
making it well suited to the web.
•GIF (.gif) – Graphics Interchange Format. GIF files are used for web
graphics; they can be animated, are limited to 256 colors, and can
support transparency. GIF files are typically small in size and are
portable.
•PNG (.png) – Portable Network Graphics. A lossless image format,
designed to replace GIF: GIF supports only 256 colors, whereas PNG
supports about 16 million.
•BMP (.bmp) – Bitmap image file, developed by Microsoft for Windows.
Like TIFF, it is lossless and uncompressed. Because BMP is a
proprietary format, TIFF files are generally recommended instead.
•EPS (.eps) – Encapsulated PostScript, a common vector file type.
EPS files can be opened in applications such as Adobe Illustrator or
CorelDRAW.
•RAW image files (.raw, .cr2, .nef, .orf, .sr2) – Unprocessed files
created by a camera or scanner. Many digital SLR cameras can shoot in
RAW, whether .raw, .cr2, or .nef. These files are the equivalent of a
digital negative: they hold a great deal of image information as well
as metadata, but need to be processed in an editor such as Adobe
Photoshop or Lightroom. RAW is widely used in photography.
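Since each of the formats above starts with a characteristic byte signature, a file's format can often be recognized without opening it fully. The following is a minimal sketch of such a check; the `MAGIC` table and `detect_format` helper are illustrative (real applications would use a library such as Pillow).

```python
# Hypothetical helper: identify common image formats by their magic bytes.
MAGIC = {
    b"\x89PNG\r\n\x1a\n": "PNG",   # PNG signature
    b"\xff\xd8\xff": "JPEG",       # JPEG/JFIF start-of-image marker
    b"GIF87a": "GIF",
    b"GIF89a": "GIF",
    b"BM": "BMP",                  # Windows bitmap
    b"II*\x00": "TIFF",            # little-endian TIFF
    b"MM\x00*": "TIFF",            # big-endian TIFF
}

def detect_format(header: bytes) -> str:
    """Return the format name matching the file's leading bytes."""
    for magic, name in MAGIC.items():
        if header.startswith(magic):
            return name
    return "unknown"

print(detect_format(b"\x89PNG\r\n\x1a\n...."))  # PNG
print(detect_format(b"\xff\xd8\xff\xe0...."))   # JPEG
```

In practice one would read the first few bytes of the file (`open(path, "rb").read(8)`) and pass them to the helper.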
2. Digital Image Processing (DIP) for Computer Vision
Almost all computer-vision systems apply digital image processing
algorithms at almost every stage to achieve their goal.
3. Fundamental areas of DIP
The typical areas of Digital Image Processing are:
• Image Acquisition and Digitization
• Enhancement
• Restoration
• Compression
• Segmentation
• Representation
• Object Recognition
Components of an Image Processing System
Elements of Visual Perception
The field of digital image processing is built on a foundation of
mathematical and probabilistic formulations, but human intuition
and analysis play the main role in choosing between the various
techniques, and that choice is basically made on subjective, visual
judgements.
In human visual perception, the eyes act as the sensor or camera,
the neurons act as the connecting cable, and the brain acts as the
processor. The basic elements of visual perception are:
1. Structure of the Eye
2. Image Formation in the Eye
3. Brightness Adaptation and Discrimination
Elements of Human Visual Perception
Structure of the Human Eye
The human eye is a slightly asymmetrical sphere with an average diameter of
20 mm to 25 mm and a volume of about 6.5 cc. The eye works much like a
camera: an external object is seen the way a camera takes a picture of it.
Light enters the eye through a small hole called the pupil, a black-looking
aperture that contracts when exposed to bright light, and is focused on the
retina, which acts like camera film.
The lens, iris, and cornea are nourished by a clear fluid that fills the
anterior chamber. The fluid flows from the ciliary body to the pupil and is
absorbed through the channels in the angle of the anterior chamber; the
delicate balance of its production and absorption controls the pressure
within the eye.
The eye contains between 6 and 7 million cones, which are highly sensitive
to color. Humans see colored images in daylight because of these cones; cone
vision is also called photopic or bright-light vision.
Rods are far more numerous, between 75 and 150 million, and are distributed
over the retinal surface. Rods are not involved in color vision and are
sensitive to low levels of illumination.
The Human Eye
• Diameter: 20 mm
• 3 membranes enclose the eye
– Cornea & sclera
– Choroid
– Retina
The Choroid
• The choroid contains blood vessels for eye
nutrition and is heavily pigmented to reduce
extraneous light entrance and backscatter.
• It is divided into the ciliary body and the iris
diaphragm, which controls the amount of
light that enters the pupil (pupil diameter 2 mm to 8 mm).
The Lens
• The lens is made up of fibrous cells and is
suspended by fibers that attach it to the
ciliary body.
• It is slightly yellow and absorbs approx. 8%
of the visible light spectrum.
The Retina
• The retina lines the entire posterior portion.
• Discrete light receptors are distributed over
the surface of the retina:
– cones (6-7 million per eye) and
– rods (75-150 million per eye)
Cones
• Cones are located in the fovea and are
sensitive to color.
• Each one is connected to its own nerve end.
• Cone vision is called photopic (or bright-
light vision).
Rods
• Rods give a general, overall picture of
the field of view and are not involved in
color vision.
• Several rods are connected to a single nerve
and are sensitive to low levels of
illumination (scotopic or dim-light vision).
Receptor Distribution
• The distribution of receptors is radially
symmetric about the fovea.
• Cones are most dense in the center of the
fovea, while rods increase in density from
the center out to approximately 20° off
axis and then decrease.
Cones & Rods
The Fovea
• The fovea is circular (1.5 mm in diameter)
but can be assumed to be a square sensor
array (1.5 mm x 1.5 mm).
• The density of cones there is about 150,000
elements/mm², i.e., roughly 337,000 elements
in the fovea.
• A CCD imaging chip of medium resolution
needs an area of 5 mm x 5 mm for this number
of elements.
Image Formation in the Eye
• The eye lens (compared to an ordinary optical
lens) is flexible.
• Its shape is controlled by the fibers of the
ciliary body: to focus on distant objects it
becomes flatter (and vice versa).
Image Formation in the Eye
• The distance between the center of the lens and
the retina (the focal length) varies from about
17 mm down to 14 mm as the refractive power of
the lens goes from its minimum to its maximum.
• Objects farther than 3 m away are focused using
the minimum refractive power (and vice versa).
Image Formation in the Eye
The image is formed when the lens of the eye focuses an image of the
outside world onto a light-sensitive membrane at the back of the eye,
called the retina. The lens focuses light onto the photoreceptive cells
of the retina, which detect the photons of light and respond by
producing neural impulses.
Example: a 15 m high object viewed from 100 m away forms a retinal image
of height x, taking the lens-to-retina distance as 17 mm:
15/100 = x/17
x = 2.55 mm
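The similar-triangles relation above is easy to check numerically. The following sketch wraps it in a small (hypothetical) helper; the 17 mm default is the lens-to-retina distance used in the example.

```python
# Similar triangles: object_height / object_distance = image_height / focal_length.
def retinal_image_height(h_m, d_m, focal_mm=17.0):
    """Height (in mm) of the retinal image of an object h_m tall, d_m away."""
    return h_m / d_m * focal_mm

# The 15 m object at 100 m from the example:
print(retinal_image_height(15, 100))  # 2.55
```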
Image Formation in the Eye
• Perception takes place by the relative
excitation of light receptors.
• These receptors transform radiant energy
into electrical impulses that are ultimately
decoded by the brain.
Brightness Adaptation &
Discrimination
• Digital images are displayed as a discrete set of intensities.
The eye's ability to discriminate between different intensity
levels is an important consideration in presenting image
processing results.
• The range of light intensity levels to which the HVS (human
visual system) can adapt is on the order of 10^10.
• Subjective brightness (i.e. intensity as perceived by the
HVS) is a logarithmic function of the light intensity
incident on the eye.
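Because subjective brightness is roughly logarithmic in intensity, a log mapping is a common way to compress a wide intensity range into a displayable one. A minimal sketch (the scaling constant `c` is chosen here, as an assumption, so the maximum input maps to the maximum output):

```python
import math

def log_transform(r, L=256, c=None):
    """Map intensity r in [0, L-1] to s = c * log(1 + r), scaled to [0, L-1]."""
    if c is None:
        c = (L - 1) / math.log(L)  # so r = L-1 maps exactly to L-1
    return c * math.log(1 + r)

print(round(log_transform(0)))    # 0
print(round(log_transform(255)))  # 255
```

Note how mid-range inputs are boosted: `log_transform(50)` already lands around 180 of 255, mimicking the eye's compressed response to intensity.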
Brightness Adaptation &
Discrimination
• The HVS cannot operate over such a range
simultaneously.
• For any given set of conditions, the current
sensitivity level of HVS is called the
brightness adaptation level.
Brightness Adaptation &
Discrimination
• The eye also discriminates between changes
in brightness at any specific adaptation level:
ΔI_c / I → Weber ratio
where ΔI_c is the increment of illumination
discriminable 50% of the time, and
I is the background illumination.
Brightness Adaptation &
Discrimination
• Small values of Weber ratio mean good
brightness discrimination (and vice versa).
• At low levels of illumination brightness
discrimination is poor (rods) and it
improves significantly as background
illumination increases (cones).
Perceived Brightness
Simultaneous Contrast
Image Digitization
• Image acquisition is the process of obtaining a natural image.
• Acquisition is done by a physical device that is sensitive to the light from
the object we wish to image. In a digital camera, the sensors produce an
electrical output proportional to the light intensity.
• The object of interest to be captured and processed is illuminated by white
light, infrared, ultraviolet, or X-rays.
• Reflected responses from all spatial positions of the object are caught by
a sensor, such as a CCD or a Vidicon camera, and transformed into equivalent
analog electrical signals by a photoelectric detector in the imaging system.
Digital Image Representation
• A digital image is an image f(x,y) that has been
digitized both in spatial coordinates and
brightness.
• the value of f at any point (x,y) is proportional to
the brightness (or gray level) of the image at that
point.
• A digital image can be considered a matrix whose
row and column indices identify a point in the
image and the corresponding matrix element
value identifies the gray level at that point.
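The matrix view of a digital image can be sketched directly in pure Python: the row and column indices locate a point, and the stored number is its gray level. The 3x4 values below are illustrative only.

```python
# A digital image as a matrix: element (x, y) holds the gray level at that point.
# 8-bit gray levels: 0 = black, 255 = white (values here are made up).
image = [
    [  0,  50, 100, 150],
    [ 50, 100, 150, 200],
    [100, 150, 200, 255],
]

M = len(image)      # number of rows
N = len(image[0])   # number of columns

def f(x, y):
    """Gray level at row x, column y."""
    return image[x][y]

print(M, N)      # 3 4
print(f(2, 3))   # 255
```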
Image acquisition [10]
An image is a 2D light-intensity function f(x, y), whose value at any
point (x, y) gives the brightness of the image at that point.
• Image digitization is performed by a device that converts the
analog output of the physical sensing device into digital form.
A digitizer has two parts: a sampler and a quantizer.
• The process of digitizing the spatial coordinates of the image is
called sampling.
• The process of digitizing the amplitude values of the image is
called quantization.
Digitization example [10]
Image Digitization
Sampling
• Sampling = the spacing of discrete values in the domain of a signal.
• Sampling rate = how many samples are taken per unit of each dimension,
e.g., samples per second, frames per second, etc.
Quantization
• Quantization = the spacing of discrete values in the range of a signal.
• Usually thought of as the number of bits per sample of the signal, e.g.,
1 bit per pixel (b/w images), 16-bit audio, 24-bit color images, etc.
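Both operations can be sketched on a matrix in a few lines of pure Python. The `sample` and `quantize` helpers below are illustrative, not a standard API: sampling keeps every step-th pixel, and quantization reduces 8-bit values to 2^k levels.

```python
def sample(image, step):
    """Spatial subsampling: keep every `step`-th row and column."""
    return [row[::step] for row in image[::step]]

def quantize(image, k, levels_in=256):
    """Requantize 8-bit values to 2**k levels (each value snapped down)."""
    q = levels_in // (2 ** k)          # width of each quantization bin
    return [[(v // q) * q for v in row] for row in image]

# A 16x16 test ramp with values 0..255:
image = [[x * 16 + y for y in range(16)] for x in range(16)]

small = sample(image, 4)
print(len(small), len(small[0]))   # 4 4

coarse = quantize(image, 2)        # k = 2 -> only 4 gray levels survive
print(sorted({v for row in coarse for v in row}))  # [0, 64, 128, 192]
```

Shrinking `step` too far produces the checkerboard effect discussed later; shrinking `k` too far produces false contouring.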
• The digital samples resulting from both sampling and quantization
produce the 2D digital image as a matrix of real numbers.
Digital image representation [10]
Formation of Digital Image
Light source → light reflected from the object (f) → optical sensor &
photoelectric detector → analog electrical signals → sampler & quantizer
→ digital image
(Block diagram of digital image formation)
Digital Image Representation cont..
Camera → digitizer (samples the analog data and digitizes it) → a set of
numbers in a 2D grid (pixel values in the highlighted region).
Example of Digital Image
Continuous image projected onto a sensor array
Result of image sampling and quantization
Light-intensity function
• An image refers to a 2D light-intensity function, f(x, y).
• The amplitude of f at spatial coordinates (x, y) gives the intensity
(brightness) of the image at that point.
• Light is a form of energy, thus f(x, y) must be nonzero and finite:
0 < f(x, y) < ∞
Illumination and Reflectance
The basic nature of f(x, y) may be characterized by 2 components:
• the amount of source light incident on the scene being viewed →
illumination, i(x, y)
• the amount of light reflected by the objects in the scene →
reflectance, r(x, y)
f(x, y) = i(x, y) r(x, y)
0 < i(x, y) < ∞ (determined by the nature of the light source)
0 < r(x, y) < 1 (determined by the nature of the objects in the scene →
bounded between total absorption and total reflectance)
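The product model is easy to verify pixel by pixel. A toy sketch on a 2x2 scene (the illumination and reflectance values below are illustrative, not measurements):

```python
# f(x, y) = i(x, y) * r(x, y): illumination times reflectance, per pixel.
i = [[9000.0, 9000.0],
     [9000.0, 9000.0]]   # illumination: 0 < i < infinity
r = [[0.05, 0.65],
     [0.90, 0.30]]       # reflectance:  0 < r < 1

f = [[i[x][y] * r[x][y] for y in range(2)] for x in range(2)]
print(f)  # [[450.0, 5850.0], [8100.0, 2700.0]]
```

Uniform illumination with varying reflectance, as here, is why the same light source produces both dark and bright pixels in one scene.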
Gray level
• We call the intensity of a monochrome image f at coordinates (x, y) the gray
level (l) of the image at that point.
• Thus l lies in the range Lmin ≤ l ≤ Lmax, where
Lmin is positive and Lmax is finite.
• Gray scale = [Lmin, Lmax]
• Common practice is to shift the interval to [0, L−1]:
• 0 = black, L−1 = white
Number of bits
The number of gray levels is typically an integer power of 2:
L = 2^k
Number of bits required to store a digitized M x N image:
b = M x N x k
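The storage formula can be sketched directly:

```python
def storage_bits(M, N, k):
    """b = M * N * k bits for an M x N image with L = 2**k gray levels."""
    return M * N * k

# A 1024 x 1024 image with 256 gray levels (k = 8):
b = storage_bits(1024, 1024, 8)
print(b, "bits =", b // 8, "bytes")  # 8388608 bits = 1048576 bytes
```

So a standard 8-bit megapixel image needs 1 MB before any compression.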
Resolution
• Resolution (how much detail of the image you can see) depends on the sampling
and the number of gray levels.
• The higher the sampling rate (n) and the gray scale (g), the better the
digitized image approximates the original.
• The finer the sampling becomes, however, the larger the size of the digitized image.
Checkerboard effect
(a) 1024x1024 (b) 512x512 (c) 256x256 (d) 128x128 (e) 64x64 (f) 32x32
If the spatial resolution is decreased too much, the checkerboard
effect can occur.
False contouring
(a) Gray levels = 16 (b) Gray levels = 8 (c) Gray levels = 4 (d) Gray levels = 2
If there are too few gray levels, smooth areas are affected:
false contouring can occur in smooth areas that have fine
gray-scale gradations.
Signals and Images
• A signal is a function that carries information.
• Usually the content of the signal changes over some set of spatiotemporal
dimensions.
Time-Varying Signals
• Some signals vary over time:
f(t)
for example: an audio signal
• It may be thought of, at one level, as a collection of tones of
differing audible frequencies that vary over time.
Spatially-Varying Signals
• Signals can vary over space as well.
• An image can be thought of as being a function of 2 spatial dimensions:
f(x,y)
• for monochromatic images, the value of the function is the amount of light
at that point.
• medical CAT and MRI scanners produce images that are functions of 3
spatial dimensions:
f(x,y,z)
Spatiotemporal Signals
What do you think a signal of this form is?
f(x, y, t)
Here x and y are spatial dimensions and t is time.
It could be a video signal, an animation, or some other time-varying picture.
Basic Relationship b/w pixels
• Neighbors of a pixel
• Connectivity
• Relations, Equivalences, and Transitive Closure
• Distance Measures
• Arithmetic/Logic Operations
Neighbors
A pixel p at coordinate (x, y) has:

N4(p): the 4-neighbors of p
(x+1, y), (x-1, y), (x, y+1), (x, y-1)
  x
x n x
  x

ND(p): the 4 diagonal neighbors of p
(x+1, y+1), (x+1, y-1), (x-1, y+1), (x-1, y-1)
x   x
  n
x   x

N8(p): the 8-neighbors of p,
the combination of N4(p) and ND(p)
x x x
x n x
x x x
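The three neighborhoods translate directly into code. A minimal sketch (hypothetical helper names; bounds checking against the image edges is omitted):

```python
def n4(x, y):
    """4-neighbors of pixel (x, y)."""
    return [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]

def nd(x, y):
    """4 diagonal neighbors of pixel (x, y)."""
    return [(x + 1, y + 1), (x + 1, y - 1), (x - 1, y + 1), (x - 1, y - 1)]

def n8(x, y):
    """8-neighbors: the union of N4(p) and ND(p)."""
    return n4(x, y) + nd(x, y)

print(sorted(n4(1, 1)))  # [(0, 1), (1, 0), (1, 2), (2, 1)]
print(len(n8(5, 5)))     # 8
```

A real implementation would also discard neighbors that fall outside the image boundary.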
Euclidean distance between p(x, y) and q(s, t):
De(p, q) = [(x - s)^2 + (y - t)^2]^(1/2)
The pixels with De(p, q) ≤ r form a disk of radius r centered at (x, y).

City-block distance: D4 distance
D4(p, q) = |x - s| + |y - t|
The pixels form a diamond centered at (x, y);
pixels with D4 = 1 are the 4-neighbors of (x, y).
    2
  2 1 2
2 1 0 1 2
  2 1 2
    2
Chessboard distance: D8 distance
D8(p, q) = max(|x - s|, |y - t|)
The pixels form a square centered at (x, y):
2 2 2 2 2
2 1 1 1 2
2 1 0 1 2
2 1 1 1 2
2 2 2 2 2

m-connectivity's distance
The m-distance along the path between 2 pixels depends on the values of the
pixels along the path.
E.g., if only connectivity of pixels valued 1 is allowed, find the m-distance
between p and p4 in the arrangement
p3 p4
p1 p2
p
Case 1 (values: p3=0, p4=1, p1=1, p2=0, p=1): distance = 2
Case 2 (values: p3=1, p4=1, p1=1, p2=0, p=1): distance = 3
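The three distance measures above can be sketched as small helpers and checked against each other:

```python
import math

def d_euclidean(p, q):
    """De(p, q) = sqrt((x - s)^2 + (y - t)^2)."""
    (x, y), (s, t) = p, q
    return math.sqrt((x - s) ** 2 + (y - t) ** 2)

def d4(p, q):
    """City-block distance: |x - s| + |y - t|."""
    (x, y), (s, t) = p, q
    return abs(x - s) + abs(y - t)

def d8(p, q):
    """Chessboard distance: max(|x - s|, |y - t|)."""
    (x, y), (s, t) = p, q
    return max(abs(x - s), abs(y - t))

p, q = (0, 0), (3, 4)
print(d_euclidean(p, q))  # 5.0
print(d4(p, q))           # 7
print(d8(p, q))           # 4
```

Note the ordering D8 ≤ De ≤ D4, which always holds for any pair of pixels.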
Mathematical operations used in digital image processing
Image arithmetic is the implementation of standard arithmetic operations, such as
• Addition
• Subtraction
• Multiplication
• Division
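Image arithmetic applies these operations element-wise to two images of the same size. A pure-Python sketch (libraries such as NumPy do the same with array operations; the `pixelwise` and `clip` helpers are illustrative). Results are clipped to the valid 8-bit range, since sums can exceed 255 and differences can go below 0.

```python
def pixelwise(a, b, op):
    """Apply a binary operation element-wise to two same-sized images."""
    return [[op(x, y) for x, y in zip(ra, rb)] for ra, rb in zip(a, b)]

def clip(v, lo=0, hi=255):
    """Clamp a result back into the valid 8-bit gray-level range."""
    return max(lo, min(hi, v))

a = [[100, 200], [50, 250]]
b = [[ 30,  60], [90, 120]]

added = pixelwise(a, b, lambda x, y: clip(x + y))
diff  = pixelwise(a, b, lambda x, y: clip(x - y))
print(added)  # [[130, 255], [140, 255]]
print(diff)   # [[70, 140], [0, 130]]
```

Subtraction like this is the basis of change detection, and addition (with averaging) of noise reduction.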
Questions?
Thank You