Lesson 1: Basics of Remote Sensing

Remote sensing notes

REMOTE SENSING

INTRODUCTION
• Remote sensing is the science and art of obtaining/capturing data or information about an object, area or phenomenon.
• This is done through the analysis of the acquired data, using a device that is not in contact with the object, area or
phenomenon under investigation.
 E.g. the eyes act as sensors,
 the mind becomes the device for processing that data,
 and your hands implement the data

• RS depends on electromagnetic energy (EME) sensors operated from airborne and space-borne
platforms
 RS is very useful in inventorying, mapping and monitoring earth resources, and in the planning and
management of such resources
• The EME sensors acquire data on how various earth surface features emit and reflect electromagnetic
energy
 The data is then analyzed to provide information about the resources under investigation
• The basic processes involved in RS are:
i. Data acquisition
ii. Data analysis
INTRODUCTION CONT'D…
[Diagram: the basic processes of RS — data acquisition and data analysis.]
Why study Remote Sensing?
Remote sensing provides:
 access to timely,
 reliable,
 detailed and
 affordable geospatial data
that enables objective and efficient decision making on the exploitation,
planning, development, monitoring and conservation/maintenance of resources.
BASIC PRINCIPLES UNDERLYING THE REMOTE SENSING PROCESS
They include:

a. Fundamentals of electromagnetic energy


b. Consideration of how the energy interacts with
the atmosphere and with earth surface features
c. Role that reference data play in the data analysis
procedure
d. Description of how the spatial location of
reference data observed in the field is
determined using global positioning system
(GPS) methods
FROM THE BASICS WE SHOULD BE ABLE TO:
a. Conceptualize an ideal remote sensing system
b. Consider the limitations encountered in real remote
sensing systems
c. Discuss the significance of GIS technology
d. Students should also grasp the general concepts and
foundations of RS, GPS methods & GIS operations
Components of a remote sensing system

•Radiation source or source of energy

•Atmosphere

•Target or object under investigation

•Airborne or spaceborne sensor

•Data acquisition (ground segment)

•Processing and interpretation of acquired data


Main applications of remote sensing
Data is acquired to yield information for use in the planning and management of resources,
engineering services, etc. Forecasting and monitoring are other uses of RS.
Applications
Examples of spatial data needs:

1. Forecasting agricultural yields by agronomists (Agriculture)
Data: Area under crop + biomass per unit area.

2. Detecting illegal structure development (Urban Planning)
Data: House types + configuration + socio-economic life + attribute data.
Applications cont'd…
3. Optimal configuration and siting, e.g. of telecommunication facilities (Engineering)
Data: Terrain, location of obstacles + facility capacity attribute data.
4. Centres of El Niño phenomena (Climatology)
Data: Climatic parameters, e.g. currents, sea surface temperature, sea levels,
energy interaction between land and water, etc.
The Remote Sensing Process
The two basic processes involved in Remote Sensing of the earth are Data
acquisition and Data processing.
ENERGY SOURCES AND RADIATION
PRINCIPLES
Electromagnetic Energy (EME)
RS relies on EME measurements whose main source is the Sun (light, heat, ultraviolet
rays).
The RS sensor detects and measures emitted or reflected energy from the earth's surface,
depicting the unique characteristics of the object.

[Diagram: a passive sensor measures the Sun's energy reflected from, or emitted by, the
earth's surface (land and sea); an active sensor supplies its own energy.]
Sources of EME
 All matter at a temperature above absolute zero (0 Kelvin) radiates EME due to molecular
agitation.
 Agitation is the movement of the molecules.
 The Sun and the earth radiate EME in the form of waves.
 Matter that is capable of absorbing and re-emitting all EME is known as a Black Body.
 For Black Bodies both the emissivity (ε) and the absorptivity (α) are at the maximum = 1.
 The amount of EME radiated depends on the body's absolute temperature, as defined
by the Stefan-Boltzmann law:
W = σT⁴
where W = total radiant emittance (Watts/metre squared), σ = the Stefan-Boltzmann constant,
and T = the absolute temperature of the emitting body.
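As a quick numerical check of the Stefan-Boltzmann law above, here is a minimal sketch (the constant is the standard SI value; the 300 K temperature is an illustrative value for the earth's surface, not from the notes):

```python
# Total radiant emittance from the Stefan-Boltzmann law, W = sigma * T^4.
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiant_emittance(temperature_k: float) -> float:
    """Total radiant emittance (W/m^2) of a black body at the given
    absolute temperature in Kelvin."""
    return SIGMA * temperature_k ** 4

# A black body at ~300 K emits about 459 W/m^2; doubling the
# temperature multiplies the emittance by 2^4 = 16.
earth = radiant_emittance(300.0)
```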
Photon Model
 EME is composed of discrete units called photons.
 Photons are used when quantifying the amount of energy measured by multispectral sensors.
 The amount of energy held by a photon of a specific wavelength is given by:
Q = hf, where h = Planck's constant (6.6262×10⁻³⁴ J·s) and f = frequency
Q = hc/λ
- The higher the frequency (f), the higher the energy.
- The longer the wavelength (λ), the lower the energy.
- Hence sensors designed to detect shorter wavelengths measure higher-energy photons.
NB: The model is adopted when one wishes to quantify the energy measured by a sensor.
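The relation Q = hc/λ can be sketched directly; the two wavelengths below (visible blue and thermal infrared) are illustrative, not from the notes:

```python
H = 6.62607015e-34  # Planck's constant, J*s
C = 2.99792458e8    # speed of light, m/s

def photon_energy_j(wavelength_m: float) -> float:
    """Photon energy Q = h*c / lambda, in Joules."""
    return H * C / wavelength_m

blue = photon_energy_j(0.45e-6)   # visible blue light, 0.45 micrometres
thermal = photon_energy_j(10e-6)  # thermal infrared, 10 micrometres
# The blue photon carries far more energy than the thermal-IR photon.
```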
Characteristics of EME

1) Wavelength (λ)
 Distance between successive wave crests.
 Wavelength is measured in metres, micrometres or nanometres.
2) Frequency (f)
 Number of cycles of a wave passing a fixed point over a specified period of time.
 Frequency is normally measured in Hertz (Hz), equivalent to 1 cycle per second.
 Since the speed of light is constant, wavelength and frequency are inversely related to
each other: c = fλ.
The shorter the wavelength, the higher the frequency and vice versa
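The inverse relation c = fλ can be checked with a one-line sketch (the example wavelengths are illustrative):

```python
C = 2.99792458e8  # speed of light in a vacuum, m/s

def frequency_hz(wavelength_m: float) -> float:
    """Frequency of an EM wave from c = f * lambda, i.e. f = c / lambda."""
    return C / wavelength_m

# Blue light (~0.4 um) has a higher frequency than red light (~0.7 um).
```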
Wave Model of EME:
EME propagates through space in the form of sine waves characterized by electric and
magnetic fields perpendicular to each other, with both fields vibrating perpendicular to the
direction of wave travel.
Electromagnetic Wave Components

The components include:

a. A sinusoidal electric wave (E)
b. A similar magnetic wave (M) at right angles, both
perpendicular to the direction of propagation
Electromagnetic Spectrum
The total range of wavelengths, from gamma rays to radio waves, forms the electromagnetic
spectrum. In order of increasing wavelength it includes: cosmic rays, gamma rays, X-rays,
ultraviolet rays, visible light, infrared, microwave, television and radio waves.

[Diagram: the electromagnetic spectrum on a logarithmic wavelength axis, with the optical
range highlighted. The visible band spans roughly 0.4–0.7 μm: blue (~0.4–0.5 μm),
green (~0.5–0.6 μm) and red (~0.6–0.7 μm) — the primary colors — with ultraviolet below
and near infrared above.]

Remote sensing operates in the optical range (from the X-ray and visible regions through the far infrared).

Longer wavelengths in the thermal infrared and microwave regions are also used in RS, for
mineral and vegetation mapping. The microwave region is particularly used in long-wavelength
surface detection, e.g. of water content.
Cont'd…
i.e. the spectral radiant emittance increases exponentially with temperature.

[Diagram: spectral radiant emittance (W) against wavelength for bodies at higher, middle and
lowest temperatures. The area under each curve represents W and increases exponentially with T;
at higher temperatures, shorter wavelengths carry higher W.]

ENERGY INTERACTION IN THE ATMOSPHERE
• Irrespective of source, all radiation (EME) detected by remote sensors passes through the
atmosphere
• The effect of the atmosphere varies with:
a) Differences in path length (aerial photography vs. space-borne satellite sensors, etc.)
b) The magnitude (strength) of the energy signal being sensed
c) The atmospheric conditions present (vapour, etc.)
d) The wavelength involved
• The atmosphere can therefore have profound effects on:
a) The intensity of radiation
b) The spectral composition of radiation available to any sensing system
• These effects are caused mainly through the mechanisms of atmospheric scattering and absorption.
[Diagram: incident energy from the Sun is reflected by the earth's surface and by clouds,
scattered and absorbed in the atmosphere, and re-emitted (atmospheric and internal emission)
before reaching the sensor.]
SCATTERING
• This is the unpredictable diffusion of radiation by particles in the atmosphere. There are
mainly 3 types of scattering:
a) Rayleigh scattering
b) Mie scattering
c) Non-selective scattering
Rayleigh scattering
• This is common when radiation interacts with atmospheric molecules and other tiny
particles that are much smaller in diameter than the wavelength of the interacting
radiation.

[Diagram: particle diameter much smaller than the wavelength of the incident radiation.]
Rayleigh scattering cont'd…
• The effect of Rayleigh scatter is inversely proportional to the fourth
power of the wavelength.
Note: the shorter the wavelength, the more the scatter, i.e. radiation with short
wavelengths is scattered more, and vice versa.
• Short-wavelength radiation has a much greater tendency to be scattered by small
particles than longer wavelength radiation
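The inverse fourth-power law above can be sketched numerically; the blue and red wavelengths used below are representative values, not figures from the notes:

```python
def relative_rayleigh_scatter(wavelength_um: float, reference_um: float) -> float:
    """Rayleigh scattering intensity at `wavelength_um` relative to that at
    `reference_um`, from the inverse fourth-power law (scatter ~ 1/lambda^4)."""
    return (reference_um / wavelength_um) ** 4

# Blue light (~0.45 um) is scattered roughly 4.4x as strongly as red (~0.65 um),
# which is why the clear sky looks blue.
blue_vs_red = relative_rayleigh_scatter(0.45, 0.65)
```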
WHY A BLUE SKY
• A blue sky is a manifestation of Rayleigh scatter.
• In the absence of Rayleigh scatter, the sky would appear black.
• However, as sunlight interacts with the earth's atmosphere, it scatters the shorter (blue) wavelengths
more dominantly than the other visible wavelengths.
• Consequently, we see a blue sky.
• However, at sunrise and sunset, the sun's rays travel through a longer atmospheric path than
during the day.

[Diagram: the sun's rays at sunrise and sunset traverse a longer path through the earth's
atmosphere than at midday.]
WHY A BLUE SKY CONT'D
• With longer paths, the scatter (and the absorption) of short wavelengths is so
complete that we see only the less scattered, longer wavelengths of orange
and red.
• Rayleigh scattering is one of the primary causes of haze in imagery.
• Haze diminishes the contrast of an image.
• Filters are used in front of cameras in aerial photography to eliminate or minimize
haze.
MIE SCATTERING
• Exists when atmospheric particle diameters are essentially equal to the wavelength of the
incident energy being sensed.

• Water vapour and dust are the major causes of Mie scatter.
• This scatter tends to influence longer wavelengths compared to Rayleigh scatter.
NON-SELECTIVE SCATTER
• This is a more troublesome scatter
• It occurs when the diameters of the particles causing scatter are much larger than the
wavelengths of the energy being sensed
• Water droplets cause such scatter
[Diagram: particle diameter much larger than the wavelength of the incident energy.]

• Water droplets have an average diameter in the range of 5 to 100 micrometres.

• The particles scatter all visible and near- to mid-IR wavelengths uniformly.
• Consequently, scattering is non-selective with respect to wavelength.
• In the visible wavelengths, equal quantities of blue, green and red light are scattered.
• Hence fog and clouds appear white.
ABSORPTION
• In contrast to scattering, atmospheric absorption results in an effective loss of energy to the
atmosphere.
• It involves absorption of energy at given wavelengths.
• The most efficient absorbers of solar radiation are water vapour, carbon dioxide and ozone.
• These gases tend to absorb EME in specific wavelength bands.
• Hence they influence the spectral windows available to any given remote sensing system.
• An atmospheric window is a wavelength range in which the atmosphere transmits
EME.
• There is interaction and interdependence between the primary sources of EME, the
atmospheric windows through which source energy may be transmitted to and from earth
surface features, and the spectral sensitivity of the sensors available to detect and record
EME.
FACTORS TO CONSIDER WHEN SELECTING
THE SENSOR FOR REMOTE SENSING

1. Spectral sensitivity of the available sensors


2. Presence or absence of atmospheric windows in the spectral range(s) in
which one wishes to sense.
3. Source, magnitude and spectral composition of the energy available in
these ranges
4. Manner in which the energy interacts with the features under
investigation.
This brings us to another section on “energy interaction with earth surface
features.”
ENERGY INTERACTIONS WITH EARTH SURFACE FEATURES

When electromagnetic energy is incident on an earth surface feature, the incident energy (EI)
is partitioned into reflected (ER), absorbed (EA) and transmitted (ET) components:

ER(λ) = EI(λ) − [EA(λ) + ET(λ)]

• Reflection occurs when radiation bounces off the target; ER tells us about surface
characteristics and land bodies.
• Transmission occurs when radiation passes through a target.
• Absorption occurs when radiation is absorbed by the target.
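The energy-balance equation can be expressed as a one-line sketch; the example quantities are purely illustrative:

```python
def reflected_energy(incident: float, absorbed: float, transmitted: float) -> float:
    """Energy balance at a surface for one wavelength:
    E_R(lambda) = E_I(lambda) - [E_A(lambda) + E_T(lambda)].
    The three components always sum to the incident energy."""
    return incident - (absorbed + transmitted)

# Illustrative values: of 100 units incident, 30 absorbed and 20 transmitted
# leave 50 units reflected toward the sensor.
```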
REFLECTION
Specular Reflection
Occurs on smooth surfaces, when almost all incident energy is reflected away from the
surface in a single direction.
Occurs when the sun is high over a water or glass surface; a bright spot on the image
results.
Diffuse Reflection
Reflection is almost uniform in all directions when the surface is rough.
Whether reflection is specular or diffuse depends on the surface roughness in comparison to
the wavelength of the incident radiation.
ELEMENTS OF THE DATA ACQUISITION PROCESS
• Energy sources
i. Propagation of energy through the atmosphere
ii. Energy interaction with earth surface features
iii. Re-transmission of energy through the atmosphere
iv. Airborne and/or space-borne sensors, resulting in the
generation of sensor data in pictorial and/or digital
form
NB: we use sensors to record variations in the way earth
surface features reflect and emit electromagnetic energy
ELEMENTS OF DATA ACQUISITION
PROCESS CONT’D…
• Information is compiled in form of
 Hard copy maps and tables or as
 Computer files that can be merged with other layers of
information in a GIS
• Finally information is presented to users who apply it to
decision making process
Spatial Data Acquisition
a) Ground methods:
Measurements take place in the field through real-world observation using land survey
techniques, e.g. chain survey, traversing, tacheometry, levelling, etc.

Real world → observation and measurements → spatial database
b) Remote Sensing Methods:
Based on the use of image data acquired by a sensor (active or passive) on a
platform (airborne or space-borne).
Examples of sensors: aerial camera, scanner, radar.
NB: Measurement and analysis are performed on the image data.

Real world → sensor observation and measurements → image data → spatial database
ELEMENTS OF DATA ANALYSIS

• Involves examination of data using various viewing and interpretation
devices to analyze pictorial data, and/or a computer to analyze digital
sensor data. NB: software such as AutoCAD, ArcGIS, etc. can be used.
• Reference data about the resources being studied are used, when and
where available, to assist in the data analysis, e.g. soil maps, vegetation
maps, crop statistics or field check data.
• With the aid of the reference data, the analyst extracts information about
the type, extent, location and condition of the various resources covered by
the sensor data.
SENSORS AND PLATFORMS
Active and Passive RS
Passive RS
(Employs natural sources of energy, e.g. the Sun.)
Passive sensor systems based on reflected solar energy work only during the day.
However, those that measure the longer wavelengths of the Earth's emitted energy can
operate at any time.
Active RS
Uses its own source of energy. The sensor emits a controlled beam of energy to the Earth's
surface and measures the reflected energy.
SPECTRAL REFLECTANCE CURVES
• Spectral reflectance curves for different materials can be plotted from lab-based reflection
measurements using a spectrometer.
Sensors and platforms
Principles of imaging sensors
a) Analogue recording: uses aircraft and mounted cameras.
b) Digital recording systems:
i. Whisk broom scanners – record a
small area at a time, grid cell by grid cell.
Configured to ensure no gaps or overlaps are made.
ii. Push broom scanners – scan a whole line at a time using a linear array of detectors.
CONSIDERATIONS IN IMAGE ACQUISITION
• Information needed
• Time component
• Budgetary factors

SENSORS
• Devices that record EME (passive sensors and active sensors).
• Airborne: aircraft modified specifically to carry sensors. Space-borne: satellites
carrying the sensors (150–3600 km).
ORBIT CHARACTERISTICS

 Altitude: the distance from the satellite to the earth's surface.

 Inclination angle: the angle in degrees between the orbit and the equator; determines
the latitudes that can be observed.

 Period: the time required to complete one orbit.

 Repeat cycle: the time in days between two successive identical orbits.


VISUAL IMAGE INTERPRETATION
IMAGE DATA CHARACTERISTICS

Images may be described or characterized by four fundamental
properties:
 Scale
 Brightness
 Contrast
 Resolution
1. Scale

• Scale is the ratio of the distance between any two points


in an image or a map to the corresponding distance on
the ground.
• The scale of an image is determined by:

 The angle of field of view of the sensor


 The altitude of the platform
 The magnification factor employed in reproducing
the image
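The notes list the factors that determine scale; for vertical aerial photographs over flat terrain, one standard photogrammetric approximation (not spelled out in the notes, so treat it as a supplementary sketch) relates scale to focal length and flying height:

```python
def photo_scale(focal_length_m: float, flying_height_m: float) -> float:
    """Approximate scale of a vertical aerial photograph over flat terrain,
    as a representative fraction: scale = f / H, where H is the flying
    height above the ground. Relief and tilt make the true scale vary
    across a real photograph."""
    return focal_length_m / flying_height_m

# A 152 mm camera flown 3,040 m above the ground gives a 1:20,000 photo.
scale = photo_scale(0.152, 3040.0)
```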
Scale cont’d…..
2. Brightness
Subjective measure, which can only be determined approximately and characterized
by terms such as light, intermediate or dark. It is the magnitude of the response
produced in the eye by light being reflected, scattered or emitted by a scene.

3. Luminance
Quantitative measure of the intensity of light coming from a source and is measured
using a photometer or light meter.

4. Contrast
Ratio of the brightest part of an image to the darkest part of an image.

5. Resolution:
 Spatial resolution
 Radiometric resolution
 Spectral Resolution
 Temporal resolution
Spatial Resolution
• This refers to the fineness of detail visible in an image

Radiometric Resolution
• Radiometric resolution refers to how sensitive a sensor is to energy
differences or brightness levels it detects. The greater the sensitivity, the
finer or the higher, the radiometric resolution is said to be. A high
radiometric resolution will aid in the discrimination of features that have
near similar spectral response
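Radiometric resolution is usually quoted in bits; the number of distinguishable brightness levels grows as a power of two. A minimal sketch (the bit depths are common illustrative values):

```python
def brightness_levels(bits: int) -> int:
    """Number of distinct brightness levels a sensor can record at a given
    radiometric resolution: 2**bits."""
    return 2 ** bits

# An 8-bit sensor records 256 levels; an 11-bit sensor records 2,048,
# so it can separate features with much more similar spectral responses.
```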

Spectral Resolution
• Spectral resolution refers to the number and width of spectral bands
detectable by a Remote Sensing instrument. Sensors are designed to
detect energy within selected range of wavelengths referred to as spectral
bands or channels.
IMAGE PREPROCESSING

Preprocessing comprises:
• Cosmetic correction
• Radiometric correction
• Atmospheric correction
• Geometric correction
IMAGE ENHANCEMENT

a) RADIOMETRIC CORRECTIONS
• Cosmetic: removes visible errors and noise in the image data.
• Atmospheric: needed to compensate for the attenuation of reflected or emitted radiation through absorption or scattering
in the atmosphere.
• This includes haze and sunlight-angle (i.e. skylight) corrections.

b) GEOMETRIC CORRECTIONS
These are important when:
• Using an image for 2D or 3D coordinate information.
• Merging or combining different image data.
• Visualizing image data in a GIS environment.
IMAGE ENHANCEMENT (COLOR PERCEPTION)

• Takes place in the human eyes; associated with part of the brain.

• The concern is to identify and discriminate features of the real world.
• Color perception theory is applied in color photography, TV screens and
printing.
• Tri-stimulus model: the model that explains color theory; it
states that three different types of stimuli are necessary to
describe any color.
3-D COLOR SPACE
1. RGB color space
• Based on the additive principle of color (red, green and blue).
• Green dots + red dots = yellow dots.
2. Intensity-hue-saturation (IHS) color space
• Related to the intuitive perception of colors. Saturation is the name given to color
purity: colors get sharper as dullness decreases.
3. Yellow-magenta-cyan (YMC) color space
• Based on the subtractive principle of color.
• The Y filter subtracts blue; red and green remain.
• The M filter subtracts green; red and blue remain.
• The C filter subtracts red; blue and green remain.
• This color scheme is used for color definition on hard copy, i.e. photographic films, papers and printed
pictures.
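The additive RGB principle (e.g. green + red = yellow) can be sketched with channel-wise addition of 8-bit color triples; the clipping to 255 is an assumption of 8-bit display, not from the notes:

```python
def additive_mix(c1, c2):
    """Additive (RGB) colour mixing: channel-wise sum of two colors,
    clipped to the 8-bit maximum of 255."""
    return tuple(min(a + b, 255) for a, b in zip(c1, c2))

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)
yellow = additive_mix(RED, GREEN)   # green + red -> yellow
white = additive_mix(yellow, BLUE)  # all three primaries -> white
```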
IMAGE INTERPRETATION
VISUAL IMAGE INTERPRETATION
• Involves extracting information from images.
• Analysis may be visual interpretation, or semi-automatic or automatic digital interpretation.
• Uses the human ability to perceive colors and patterns and relate them to reality.
• Human vision is more than color perception: it includes the ability to draw conclusions
from visual observation.
• Direct and spontaneous recognition: the ability to interpret and identify an object at
first glance.
• Logical inference: makes use of clues to draw conclusions through reasoning.
INTERPRETATION ELEMENTS USED TO IDENTIFY IMAGE CHARACTERISTICS

1) Tone and hue
• Hue refers to colors on the image, while tones are related to the amount of light reflected from a surface.
• Rocks, vegetation and soils will hence show different tonal variations.
2) Shape or form
Characterizes terrain objects: different objects have different forms.
3) Size of objects
Considered in absolute or relative terms, e.g. estimating the width of a road in relation to known sizes.
4) Pattern
The spatial arrangement of objects; implies the characteristic repetition of certain forms
or relationships.
5) Texture
Relates to the frequency of tonal change, and is classified as either coarse or fine, smooth or rough.
INTERPRETATION ELEMENTS USED TO IDENTIFY
IMAGE CHARACTERISTICS CONT'D…
6) Site
• The topographical or geographical location of objects. One is more likely to see a swamp in a
flood plain than in a town center.

7) Association
• A combination of objects that makes it possible to infer their meaning or
function. An object is related to its neighborhood.
• Using many of these elements simultaneously increases the degree of accuracy.
DATA VIEWS
1) Image space
Enables comparison with the original image.
2) Spectral space
Compares the response versus the wavelength; it shows that each feature has its
own peculiar curve.
3) Feature space
The values of two bands can be regarded as the components of a 2D vector.
• To create a 2D feature space for more than 2 bands, you have to plot 2 bands at a
time.
• You can create a feature space for every object.
• Each object occupies a particular place in the feature space. This allows for classification.
2) Fuzzy image classification
Assumes no clear boundaries; set theory with partial membership is
applied.
3) Knowledge-based (object-based/expert-based) classification
Makes use of extra information, for instance collateral information from GIS,
maps and photographs, to interpret; useful where a lot of detail exists.
Statistical classification: this classification is purely based on digital numbers.

ERDAS Imagine 8.3 modules

Viewer (data presentation); Interpreter (enhancement, topographic and GIS analysis);
Catalog; Classification; Modeler; Vector; Radar; Virtual GIS (animations and 3D
models); and other extensions exist.
MAIN APPROACHES TO IMAGE INTERPRETATION
1) Statistical approach
• Supervised or unsupervised.
• Spectrally oriented.
• Relies purely on digital numbers.
2) Fuzzy image classification
• Assumes no clear boundaries and uses set theory; partial membership is applied.
3) Object (knowledge)-based interpretation
• Makes use of extra information to interpret; expert know-how, i.e. experience, is useful.
• This approach is very useful where an image has a lot of detail.
IMAGE CLASSIFICATION
OVERALL OBJECTIVE
• To automatically categorize all pixels in an image into land cover classes/themes.

• Normally multi-spectral data are used.

• The spectral pattern is used as the numerical basis for categorization.

• Different feature types manifest different combinations of digital numbers (DNs)
based on different spectral reflectance and emittance properties.
PROCEDURE OF IMAGE CLASSIFICATION.
1. Select and prepare your image based on purpose, resolution, date of acquisition and
wavelength,
2. Define your clusters or objects in the feature space and train your data.
3. Select classification algorithms.
4. Execute the actual classification.
5. Validate your results.

N.B.
Additional data are used, plus ground truthing.
A pixel is classified, or assigned to a certain class, based on where it falls in the feature space.
SPECTRAL PATTERN RECOGNITION

Definition
• This refers to the family of classification procedures that utilize
pixel-by-pixel spectral information as the basis of automated land cover
classification.
SPATIAL PATTERN RECOGNITION

• The categorization of image pixels on the basis of their spatial relationship with the
pixels surrounding them.
• Entails aspects of image texture, feature size, pixel proximity, shape,
directionality, repetition and context.
• This classification replicates the spatial synthesis done by the human analyst in the
visual interpretation process.
• Such procedures tend to be much more complex and computationally intensive than spectral
pattern recognition procedures.
TEMPORAL PATTERN RECOGNITION
• Uses time as an aid in feature identification.
• For instance, in agricultural crop surveys, distinct spectral and spatial changes
during a growing season can permit discrimination on multidate imagery that
would be impossible given any single date.
• As with image restoration, enhancement techniques and image classifiers may be
used in combination in a hybrid mode.
The particular approach adopted depends upon:
• The nature of the data being analyzed.
• The computational resources available.
• The intended application of the classified data.
SPECTRALLY ORIENTED CLASSIFICATION PROCEDURES
• These currently form the backbone of most multispectral classification activities.
Supervised classification
• The analyst "supervises" the pixel categorization process by specifying to the computer
algorithm numerical descriptors of the various land cover types present in a
scene.
• To do this, representative sample sites of known cover types (training areas) are
used to compile a numerical "interpretation key" that describes the spectral
attributes of each feature type of interest.
• Each pixel in the data set is compared numerically to each category in the
interpretation key.
• It is labeled with the name of the category it "looks most like".
PROCEDURE IN A TYPICAL SUPERVISED CLASSIFICATION
1) Training stage

• The analyst identifies representative training areas and develops a numerical
description of the spectral attributes of each land cover type of interest in the scene.
2) Classification stage
Each pixel in the data set is categorized into the land cover class it most closely resembles.
3) Output stage
• After the entire data set has been categorized, the results are presented in digital
form.
• Products include thematic maps, tables of statistics and digital data files amenable to
inclusion in a GIS.
UNSUPERVISED CLASSIFICATION
• Does not utilize training data as the basis of classification.
• This family of classifiers involves algorithms that examine the unknown pixels in an
image and aggregate them into a number of classes based on the natural clusters
present in the image values.
• The basic premise is that values within a given cover type should be close together
in the measurement space.
• Data in different classes should be comparatively well separated.
• Unsupervised classification yields spectral classes whose identity is not initially
known.
• The analyst must compare the classified data with some reference data (e.g. larger-scale
imagery or maps) to determine the identity and informational value of the
spectral classes.
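Unsupervised classifiers vary; one widely used clustering algorithm is k-means, sketched minimally here on two-band pixel values (the notes do not name a specific algorithm, and the pixel values and cluster seeds below are illustrative):

```python
import math

def kmeans(pixels, centers, iterations=10):
    """Minimal k-means sketch: repeatedly assign each pixel (a tuple of band
    values) to its nearest centre, then move each centre to the mean of its
    assigned pixels."""
    for _ in range(iterations):
        groups = [[] for _ in centers]
        for p in pixels:
            nearest = min(range(len(centers)), key=lambda i: math.dist(p, centers[i]))
            groups[nearest].append(p)
        centers = [
            tuple(sum(band) / len(g) for band in zip(*g)) if g else centers[i]
            for i, g in enumerate(groups)
        ]
    return centers

# Two natural clusters in a 2-band measurement space.
pixels = [(10, 12), (11, 10), (80, 82), (82, 79)]
centers = kmeans(pixels, [(0.0, 0.0), (100.0, 100.0)])  # one centre per cluster
```

The resulting spectral classes would still need to be compared against reference data to assign their informational identity, as described above.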
DIFFERENCE BETWEEN THE SUPERVISED AND UNSUPERVISED APPROACHES

SUPERVISED APPROACH: defines useful information categories and then examines their
spectral separability.
UNSUPERVISED APPROACH: determines spectrally separable classes and then defines their
informational utility.
HYBRID CLASSIFICATION
• Developed to either streamline or improve the accuracy of purely supervised or unsupervised
procedures.
• For example, unsupervised training areas might be delineated in an image in order to aid the analyst in
identifying the numerous spectral classes that need to be defined in order to adequately
represent the land cover information classes to be differentiated in a supervised
classification.
STRATEGIES/CLASSES/TYPES OF CLASSIFIERS
1) Minimum-distance-to-means classifier

Advantage: mathematically simple and computationally efficient.

Disadvantage: insensitive to different degrees of variance in the spectral response data.
2) Parallelepiped classifier: fast and computationally efficient.
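The minimum-distance-to-means strategy can be sketched in a few lines; the class names and mean vectors below are hypothetical two-band examples, not values from the notes:

```python
import math

def minimum_distance_classify(pixel, class_means):
    """Minimum-distance-to-means classifier: assign the pixel (a tuple of
    band values) to the class whose mean spectral vector is nearest in
    Euclidean distance."""
    return min(class_means, key=lambda name: math.dist(pixel, class_means[name]))

# Illustrative 2-band class means (digital numbers), e.g. from training areas.
means = {"water": (20.0, 10.0), "vegetation": (40.0, 90.0), "soil": (70.0, 60.0)}
label = minimum_distance_classify((25.0, 15.0), means)  # -> "water"
```

Note how the simplicity cited above is visible here: only the class means are stored, so the spread (variance) of each class around its mean plays no role in the decision.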
THANK YOU