RS GIS Module1: Document on Remote Sensing
Uploaded by Jack Stephen.G

REMOTE SENSING & GIS (21 Scheme) AY 2023-24

MODULE 1: REMOTE SENSING

BASIC CONCEPTS: Remote sensing (an "eye in the sky") is a common method of collecting data about an object from a distance using recording devices. The use of remote sensing techniques is increasing rapidly, finding new fields of application as the technology of remote sensing systems advances. The human eye collects information from only a part of the electromagnetic radiation (visible light) reflected from external objects. The information collected on the retina is transmitted to the brain, which physiologically processes these signals to form a complete picture.

i. RS is an observation tool used to identify objects or to measure and analyse their characteristics without directly contacting the targets.
ii. RS utilizes electromagnetic radiation as a medium for the identification, measurement and monitoring of earth surface features.
iii. It is based on the fact that all matter reflects, absorbs, transmits and emits EMR in a unique way with respect to wavelength.
iv. This unique property of EMR is called its spectral characteristics/ signatures.
v. Cameras or scanners mounted on satellites, aircraft and ground drones are used in the data collection process.

REMOTE SENSING: "The art and science of obtaining information about the earth's objects without any physical contact". The sensor records data on the interaction between electromagnetic radiation and the target. The human eye is an example of RS in its most basic form: it collects information about various objects, and the human brain interprets it.
With a growing population and rising standard of living, pressure on natural and economic resources is increasing day by day. Remote sensing has therefore become a necessary technique for managing the available resources effectively and economically. It requires the periodic preparation of accurate inventories of natural resources, renewable or non-renewable, which can be mapped and monitored periodically for sustainability.

HISTORY OF REMOTE SENSING:


1839 - First photograph
1858 - First photograph from a balloon
1903 - First aeroplane
1909 - First photograph from an aeroplane
1960 - First space satellite sensor

India began development of an indigenous IRS (Indian Remote Sensing Satellite) program to support the national economy in the areas of agriculture, water resources, forestry, ecology, geology, watersheds, marine fisheries and coastal management. With the advent of high-resolution satellites, new applications in the areas of urban sprawl, infrastructure planning and large-scale mapping have been initiated.
Dhanashree N Nerlikar, Dept of Civil Engg. VVIT Bangalore

TYPES OF REMOTE SENSING


There are several broad categories of basic sensor systems:
1. Passive: The sensor acquires data using natural energy sources, at wavelengths of less than 1 mm. E.g., film photography, infrared sensors, charge-coupled devices, and radiometers.
Passive remote sensing depends on a natural source to provide energy.
• For example, the sun is the most powerful and most commonly used source of energy for passive remote sensing.
• The satellite sensor in this case records primarily the radiation that is reflected from the target.

2. Active: The sensor provides its own source of illumination to record earth surface features, at wavelengths ranging from 0.4 μm to 10 m.
E.g., RADAR and LASER.
• Active remote sensing uses an artificial source of energy.
• For example, the satellite itself can send a pulse of energy which interacts with the target.
• In active remote sensing, humans can control the nature (wavelength, power, duration) of the source energy. Active remote sensing can be carried out during day and night and in all weather conditions.

3. Imaging versus non-imaging (scanning) sensors: Data from imaging sensors can be processed to produce an image of an area in which smaller parts of the sensor's whole view are resolved visually. Non-imaging sensors are usually hand-held devices that register only a single response value, with no fine spatial resolution. Imaging data provide an opportunity to study spatial relationships and object shapes, and to estimate physical size, based on the data's spatial resolution and sampling.

In summary, remote sensing is either passive or active. Passive sensors gather radiation that is emitted or reflected by the object or its surroundings; reflected sunlight is the most common source of radiation measured by passive sensors.


PROCESS/ COMPONENTS OF REMOTE SENSING SYSTEMS

A=Natural light source (Sun); B= Atmospheric elements (clouds; CO emission); C= Surface features;
D= Satellite sensor; E= Observation station; F= Disk/ floppy; G= Processed Images
1. Energy Source (A): The sun is the main source of light energy, which strikes earth surface features as Electro-Magnetic Radiation (EMR).
2. Radiation and the Atmosphere (B): EMR interacts with atmospheric elements while travelling from its source to the target. This may affect the accuracy of the data collected.
3. Interaction with the Target (C): Radiation that is not absorbed or scattered in the atmosphere can reach and interact with the earth's surface. Different objects return different amounts of energy depending on their physical, chemical and optical properties.
4. Recording of Energy by the Sensor (D): After the energy has been reflected or emitted from the target, it is collected and recorded by the sensors. Errors may occur during data collection due to differences in surface roughness, angle of incidence, intensity, and wavelength of the radiant energy.
5. Transmission, Reception, and Processing (E): The energy recorded by the sensor is transmitted in electronic form to the nearest observation station, where detection and discrimination of the earth's features are carried out to form the final image.
6. Interpretation and Analysis (F): Finally, the processed image is interpreted visually and digitally to extract specific information about the features of interest.


ELECTROMAGNETIC SPECTRUM
Electromagnetic Radiation (Waves): Electromagnetic radiation is the combination of electric and magnetic fields that propagate together at the speed of light. This energy is detectable only when it interacts with an object. Its speed is commonly spoken of as the velocity of light, although light is only one form of electromagnetic energy.
Speed of light (c) = 3 × 10⁸ m/s

The sun is a natural source of electromagnetic radiation, which can travel in a vacuum. An electromagnetic wave is characterized by
i. Frequency (f): the number of waves that pass a point in an interval of time. The hertz is the unit for a frequency of one cycle per second.
ii. Wavelength (λ): the distance between successive crests or troughs of the wave, measured in metric units.
c = f × λ, or f = c / λ
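The relation c = f × λ can be checked numerically. A minimal Python sketch (the function names and the example wavelength are illustrative, not from the notes):

```python
# c = f * lambda, so f = c / lambda. Units: metres and hertz.

C_M_PER_S = 3.0e8  # speed of light in m/s

def frequency_hz(wavelength_m: float) -> float:
    """Frequency corresponding to a given wavelength in metres."""
    return C_M_PER_S / wavelength_m

def wavelength_in_m(freq_hz: float) -> float:
    """Wavelength corresponding to a given frequency in hertz."""
    return C_M_PER_S / freq_hz

# Green light at 0.55 um corresponds to a frequency of roughly 5.45e14 Hz.
print(f"{frequency_hz(0.55e-6):.3e} Hz")
```

Note that the two quantities are inversely related: doubling the wavelength halves the frequency.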

ELECTROMAGNETIC SPECTRUM

[Figure: the electromagnetic spectrum, with wavelength in metres]
1. The electromagnetic spectrum is the range of all possible frequencies of electromagnetic waves.
2. The main components of the electromagnetic spectrum are
a. Gamma rays b. X-rays c. Ultraviolet rays d. Visible light
e. Infrared rays f. Microwaves g. Radio waves
3. The lowest frequencies are recorded at the radio-wave end of the spectrum, and the highest frequencies at the gamma-ray end.
4. Wavelength and frequency are inversely proportional to each other: as frequency increases along the EM spectrum, wavelength decreases.
5. The visible region of the spectrum lies between 0.4 and 0.7 μm wavelength. The energy reflected by the earth and other objects during the daytime is recorded in this range. This is the only region visible to the human eye.
6. Reflected infrared radiation ranges from 0.7 to 3 μm wavelength and is recorded by infrared sensor systems.
7. Thermal infrared radiation occupies the 3 – 5 μm and 8 – 14 μm ranges, where emitted thermal energy is greatest.
8. The microwave region ranges from 1 to 300 mm and can penetrate rain, fog and clouds. Both active and passive sensors can acquire images in this region.
9. Gamma rays have wavelengths of less than 10 picometres.
10. X-rays range from about 10⁻⁵ to 10⁻³ μm.
11. Radio waves have the longest wavelengths, greater than 10⁶ μm, and are used for remote sensing by some radars.
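The spectral regions listed above can be collected into a simple lookup table. A Python sketch (boundary values follow the notes approximately; region limits vary slightly between sources):

```python
# The spectrum components above as a lookup table (wavelengths in um).
# Boundary values are approximate and vary between sources.

REGIONS_UM = [
    ("Gamma ray",   0.0,   3e-5),           # < 0.03 nm
    ("X-ray",       3e-5,  3e-3),           # 0.03 - 3 nm
    ("Ultraviolet", 3e-3,  0.4),
    ("Visible",     0.4,   0.7),
    ("Infrared",    0.7,   1.0e3),          # up to 1 mm
    ("Microwave",   1.0e3, 1.0e6),          # 1 mm - 1 m
    ("Radio",       1.0e6, float("inf")),
]

def classify(wavelength_um: float) -> str:
    """Name of the spectral region containing the given wavelength."""
    for name, lo, hi in REGIONS_UM:
        if lo <= wavelength_um < hi:
            return name
    return "Unknown"

print(classify(0.55))   # Visible
print(classify(10.0))   # Infrared
print(classify(5.0e4))  # Microwave (5 cm, a typical radar wavelength)
```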


WAVELENGTH REGIONS IMPORTANT TO REMOTE SENSING


1. Ultraviolet (UV): Ultraviolet light has wavelengths of approximately 1 – 380 nm. Near and middle UV wavelengths carry information about ozone, sulfur dioxide and trace gases in the troposphere and stratosphere (0 – 50 km), of interest to the atmospheric and volcanic sciences. Far and extreme UV wavelengths carry information about airglow and auroral emissions from the thermosphere (>100 km), of interest in aeronomy and space weather. A few earth-surface rocks and minerals emit visible radiation when illuminated by UV radiation.

2. Visible Spectrum: The visible wavelengths cover a range from approximately 400 to 700 nm (0.4 to 0.7 μm). The longest visible wavelength is red and the shortest is violet. The visible portion of the spectrum is used extensively in remote sensing, and its energy can be recorded using photography. Red, green and blue are the primary colours, from which all other colours can be formed in various proportions. Although we see sunlight as a uniform or homogeneous colour, it is actually composed of various wavelengths of radiation, primarily in the ultraviolet, visible and infrared portions of the spectrum. The visible portion of this radiation can be separated into its component colours when sunlight is passed through a prism, which bends the light by differing amounts according to wavelength.

3. Infrared (IR): This covers the wavelength range from approximately 0.7 μm to 100 μm, more than 100 times as wide as the visible portion. The infrared can be divided into three categories as follows:

Sl No | Category | Wavelength range | Applications
1. | Near InfraRed (NIR) | 0.7 to 1.3 μm | Exposes black & white and colour-infrared-sensitive film
2. | Shortwave InfraRed (SWIR) | 1.3 to 3.0 μm | Used to observe the health of vegetation, soil composition and moisture content
3. | Thermal InfraRed (TIR) | 3.0 to 100 μm | Wavelengths from 8 to 15 μm are best for studying the longwave thermal energy radiating from the Earth


4. Microwave: Microwaves are essentially high-frequency radio waves, with wavelengths ranging from 1 mm to 1 m. Mid-wavelength microwaves can penetrate haze, light rain and snow, clouds, and smoke, which makes them beneficial for satellite communication, flight tracking and studying the Earth from space. Radar technology sends pulses of microwave energy and senses the energy reflected back.

Sl No | Region | Wavelength | Remarks
1. | Gamma ray | < 0.03 nm | Incoming radiation is completely absorbed by the upper atmosphere and is not available for RS
2. | X-ray | 0.03 to 3.0 nm | Completely absorbed by the atmosphere; not employed in RS
3. | Ultraviolet | 0.03 to 0.4 μm | Incoming wavelengths less than 0.3 μm are completely absorbed by ozone in the upper atmosphere
4. | Photographic UV band | 0.3 to 0.4 μm | Transmitted through the atmosphere; detectable with film and photodetectors, but atmospheric scattering is severe
5. | Visible | 0.4 to 0.7 μm | Imaged with film and photodetectors; includes the earth's reflected energy peak at 0.5 μm
6. | Infrared | 0.7 μm to 1.0 mm | Interaction with matter varies with wavelength; atmospheric transmission windows are separated by absorption regions
7. | Reflected IR band | 0.7 to 3.0 μm | Reflected solar radiation that contains no information about the thermal properties of materials. The band from 0.7 to 0.9 μm is detectable with film and is called the photographic IR band
8. | Thermal IR | 3 to 5 μm, 8 to 14 μm | Principal atmospheric windows in the thermal region. Images at these wavelengths are acquired by optical-mechanical scanners and special vidicon systems, but not by film
9. | Microwave | 0.1 to 30 cm | Longer wavelengths can penetrate clouds, fog and rain; images may be acquired in the active or passive mode
10. | Radar | 0.1 to 30 cm | Active form of microwave RS; radar images are acquired at various wavelength bands
11. | Radio | > 30 cm | Longest-wavelength portion of the electromagnetic spectrum; some classified radars with very long wavelengths operate in this region

BLACK BODY IN REMOTE SENSING

Blackbody radiation describes the relationship between an object's temperature and the wavelength of the electromagnetic radiation it emits. A black body is an idealized object that absorbs all electromagnetic radiation that falls on it. Blackbody radiation theory is used to develop infrared sensors and other devices that detect and measure electromagnetic radiation; such devices can be used to monitor the earth's surface, detect weather patterns, and identify military targets.
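The temperature-wavelength relationship mentioned above is usually quantified by Wien's displacement law, λ_max = b / T with b ≈ 2898 μm·K, a standard result that is not derived in these notes. A Python sketch:

```python
# Wien's displacement law: the wavelength of peak blackbody emission is
# lambda_max = b / T, where b ~ 2898 um*K is Wien's constant (a standard
# value, assumed here rather than taken from the notes).

WIEN_B_UM_K = 2898.0

def peak_wavelength_um(temp_k: float) -> float:
    """Wavelength (um) at which a blackbody at temp_k emits most strongly."""
    return WIEN_B_UM_K / temp_k

# The sun (~6000 K) peaks near 0.48 um, in the visible; the earth (~300 K)
# peaks near 9.7 um, inside the 8-14 um thermal-IR atmospheric window.
print(f"Sun:   {peak_wavelength_um(6000):.2f} um")
print(f"Earth: {peak_wavelength_um(300):.2f} um")
```

This is why sensors that measure the earth's own emitted energy operate in the thermal infrared, while reflected-solar sensors operate in the visible and near-IR.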

ATMOSPHERIC WINDOWS
Regions of the electromagnetic spectrum in which the atmosphere is transparent are called
atmospheric windows. These wavelengths corresponding to atmospheric window are used in Remote
Sensing to acquire good quality images. In the visible and infrared bands atmospheric windows exist in
the following regions:
Visible and near-IR: 0.4 μm to 1.3 μm
Thermal IR: 3 μm to 5.5 μm and 8.5 μm to 14 μm
When EMR is transmitted from the sun to the earth's surface, it passes through the atmosphere, where it is scattered and absorbed by gases and dust particles. Besides the major atmospheric gases, nitrogen and oxygen, other constituents such as water vapour, methane, hydrogen and helium compounds play an important role in modifying electromagnetic radiation. This affects image quality. Some of the commonly used atmospheric windows are shown in the figure.

Atmospheric Window in the Electromagnetic Spectrum

An atmospheric window is a region of the electromagnetic spectrum in which radiation can pass through the Earth's atmosphere. The optical, infrared and radio windows are the three basic atmospheric windows. The sun's electromagnetic energy reaches the Earth through these windows, and the Earth's thermal radiation leaves the atmosphere through them. These wavelength bands are known as atmospheric "windows" because they allow radiation to pass easily through the atmosphere to the Earth's surface.

In particular, the molecules of water, carbon dioxide, oxygen, and ozone in our atmosphere block solar radiation.
The wavelength ranges in which the atmosphere is transparent are called atmospheric windows. Remote
sensing projects must be conducted in wavelengths that occur within atmospheric windows.
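A sketch of the window ranges quoted above as a simple membership test (the function name and list structure are illustrative, not from any standard library):

```python
# Membership test for the atmospheric windows quoted above (in micrometres).

WINDOWS_UM = [
    (0.4, 1.3),    # visible and near-IR window
    (3.0, 5.5),    # thermal-IR window
    (8.5, 14.0),   # thermal-IR window
]

def in_atmospheric_window(wavelength_um: float) -> bool:
    """True if the wavelength falls inside one of the listed windows."""
    return any(lo <= wavelength_um <= hi for lo, hi in WINDOWS_UM)

print(in_atmospheric_window(0.55))  # True:  visible light passes
print(in_atmospheric_window(6.5))   # False: absorbed by the atmosphere
```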


ENERGY INTERACTIONS WITH THE ATMOSPHERE

The constituents of the atmosphere can be divided into two group’s


viz. (a) pure gases and (b) particulates. Pure gases in the atmosphere
comprise nitrogen (78 %), oxygen (21%) and traces of argon, CO, water
vapour and ozone. The particulates in the atmosphere include particles of
various sizes originating from smoke, dust and rock debris. EMR has to travel through some distances
through these constituents in the earth’s atmosphere before reaching the surface. Atmospheric gases and
particles may affect the incoming light and radiation caused by the mechanism of absorption and scattering.

1. ABSORPTION: Ozone, carbon dioxide and water vapour are the three main atmospheric constituents that absorb radiation. Ozone serves as a protective layer for all living organisms, absorbing the harmful UV radiation from the sun; without this protective layer our skin would burn when exposed to sunlight. Carbon dioxide, referred to as a greenhouse gas, absorbs radiation strongly in the far-infrared region and serves to trap heat inside the atmosphere. Water vapour in the atmosphere absorbs much of the incoming longwave infrared and shortwave microwave radiation (bands around 1.4, 1.9 and 2.1 μm). The amount of water vapour in the lower atmosphere varies greatly from location to location and at different times of the year.
E.g., the air mass above a desert contains very little water vapour to absorb energy, while the tropics have high concentrations of water vapour (i.e. high humidity).

2. SCATTERING: Atmospheric elements such as gas molecules about 10⁻⁴ μm in size, and haze (water droplets) varying in size from 10⁻² μm to 10² μm, may affect the frequency, intensity and spectral distribution of radiation and change its path. Scattering reduces image contrast and alters the reflectance characteristics of ground objects as seen by the sensor. Its behaviour depends on the relative size of the atmospheric particles. Three types of scattering take place.

i. RAYLEIGH SCATTERING: Rayleigh scattering occurs with very small particles, such as small specks of dust or nitrogen and oxygen molecules of about 10⁻⁴ μm in size, in a clean atmosphere. It is the dominant scattering mechanism in the upper atmosphere and affects mainly the shorter wavelengths. The fact that the sky appears blue during the day is due to this phenomenon. At sunrise and sunset the light travels farther through the atmosphere than at midday, and the scattering of the shorter wavelengths is more complete; this leaves a greater proportion of the longer wavelengths to penetrate the atmosphere.
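Rayleigh scattering intensity varies inversely with the fourth power of wavelength (the standard λ⁻⁴ law; the exponent is not quoted in the notes above). A Python sketch comparing blue and red light:

```python
# Rayleigh scattering intensity is proportional to 1 / lambda^4, so shorter
# (blue) wavelengths are scattered much more strongly than longer (red)
# ones, which is why the clear sky appears blue.

def rayleigh_ratio(short_um: float, long_um: float) -> float:
    """How many times more strongly short_um is scattered than long_um."""
    return (long_um / short_um) ** 4

ratio = rayleigh_ratio(0.45, 0.65)  # blue vs red
print(f"Blue light is scattered about {ratio:.1f}x more than red")
```

The ratio comes out to roughly 4, consistent with the blue daytime sky and the reddened sky at sunrise and sunset described above.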


ii. MIE SCATTERING: Mie scattering occurs when the particles are about the same size as the wavelength of the radiation. Dust, smoke, water vapour and other particles ranging from a few microns to several microns in diameter are common causes, and it tends to affect the longer wavelengths. Mie scattering occurs mostly in the lower portions of the atmosphere, where larger particles are more abundant, and dominates under overcast cloud conditions. The amount of Mie scatter is greater than Rayleigh scatter, and the wavelengths scattered are longer.

iii. NON-SELECTIVE SCATTERING: This occurs in the opposite extreme, when the particle size is very much larger than the wavelength; the scattering then does not depend on the wavelength of the radiation. The whitish appearance of the sky under heavy haze conditions is due to non-selective scattering. (The Rayleigh component can be eliminated by using a minus-blue filter.) This type of scattering causes fog and clouds to appear white to our eyes, because blue, green and red light are all scattered in approximately equal quantities (blue + green + red light = white light).
E.g., rainbows.

SPECTRAL REFLECTANCE CURVE

The graphical representation of the spectral response of an object over different wavelengths of the electromagnetic spectrum is termed a spectral reflectance curve. The reflectance characteristics of surface features are represented using these curves.

SPECTRAL SIGNATURE CONCEPTS - TYPICAL SPECTRAL REFLECTANCE


CHARACTERISTICS OF SOIL, WATER AND VEGETATION
Each earth surface object interacts with incident radiation in its own spectral characteristic manner, described by its spectral response across the electromagnetic spectrum. The response of a target in a particular wavelength region depends on certain factors, namely the orientation of the sun (solar azimuth), the height of the sun in the sky (solar elevation angle), the direction in which the sensor is pointing relative to nadir (the look angle), and the nature of the target, for example, the state of health of vegetation.

Spectral Reflectivity
➢ Reflectivity is the fraction of incident radiation
reflected by a surface.
➢ The reflectance characteristics of Earth’s
surface features may be quantified by
measuring the portion of incident energy that
is reflected.
➢ This is measured as a function of wavelength
(λ) and is called spectral reflectance (rλ).
➢ Chlorophyll strongly absorbs energy in the wavelength bands centered at about 0.45μm (blue) and 0.67
μm (red).
➢ Our eyes perceive healthy vegetation as green in color because of the very high reflection of green
light.

i. Spectral reflectance of Soil


❖ The factors that influence soil reflectance act over less specific spectral bands.
❖ Factors affecting soil reflectance are moisture content, soil texture (proportions of sand, silt and clay), surface roughness, the presence of iron oxide, and organic matter content.
❖ The presence of moisture in soil decreases its reflectance; this effect is greatest in the water absorption bands at about 1.4, 1.9, 2.2 and 2.7 μm.
❖ Bare soil generally shows increasing reflectance with wavelength, with greater reflectance in the near-infrared and shortwave infrared.

ii. Spectral reflectance of Water


❖ Water (in soil, vegetation or water bodies) absorbs
radiation at near-IR wavelengths and beyond (strong
absorption bands at about 1.4, 1.9 and 2.7 μm).
❖ Reflectance from a water body can stem from an
interaction with the water’s surface (specular reflection),
with material suspended in the water, or with the bottom
of the water body.
❖ Water has relatively low reflectance, with clear water having the greatest reflectance in the blue portion of the visible part of the spectrum.


iii. Spectral reflectance of Vegetation


❖ In the range of 0.7 to 1.3 μm a plant leaf
typically reflects 40 - 50% of the energy
incident upon it primarily due to the
internal structure of plant leaves.
❖ This helps in discriminating different
plants and trees in various fields.
❖ Many plant stresses alter the reflectance
in this region, and sensors operating in
this range are often used for vegetation
stress detection.
❖ Beyond 1.3 μm energy incident upon
vegetation is essentially absorbed or
reflected with little to no transmittance of energy.
❖ Dips in reflectance occur at 1.4, 1.9 and 2.7 μm due to strong absorption of water by leaves at these
wavelengths (water absorption bands).
❖ Reflectance peaks occur at about 1.6 μm and 2.2 μm, between the absorption bands. Throughout the range beyond 1.3 μm, leaf reflectance is approximately inversely related to the total water present in the leaf, which is a function of both the moisture content and the thickness of the leaf.
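The contrast described above, strong chlorophyll absorption in the red band versus high reflectance in the near-IR, is the basis of the widely used Normalized Difference Vegetation Index, NDVI = (NIR - Red) / (NIR + Red). A Python sketch (the sample reflectance values are illustrative only, not measured data):

```python
# NDVI = (NIR - Red) / (NIR + Red). Healthy vegetation reflects strongly
# in the NIR (0.7-1.3 um) while chlorophyll absorbs red light (~0.67 um),
# so healthy vegetation gives NDVI close to +1, bare soil near 0, and
# water a negative value. Sample reflectances below are illustrative.

def ndvi(nir: float, red: float) -> float:
    """Normalized difference of NIR and red band reflectances."""
    return (nir - red) / (nir + red)

print(f"Healthy vegetation: {ndvi(0.50, 0.08):+.2f}")
print(f"Bare soil:          {ndvi(0.30, 0.25):+.2f}")
print(f"Clear water:        {ndvi(0.02, 0.05):+.2f}")
```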


REMOTE SENSING PLATFORMS


A platform is a vehicle or carrier on which remote sensing instruments are mounted. The choice of platform is determined mainly by the object of interest, the periodicity and timing of image acquisition, and the location and extent of coverage. There are three broad categories of remote sensing platforms.

a. Ground based: A wide variety of ground-based platforms are used in remote sensing; some common ones are hand-held devices, tripods, towers and cranes. To study the properties of a single plant or a small patch of grass, it makes sense to use a ground-based instrument. Towers can be built on site and made tall enough to project through a forest canopy for forest health and canopy measurements. Ground-based platforms are also helpful in mapping features such as road widths.
b. Airborne: Airborne platforms were the sole non-ground-based platforms for early remote sensing. The first aerial images were acquired with a camera carried aloft by a balloon in 1859. Balloons are rarely used today because they are not easily controlled. At present, aeroplanes are the most common airborne platforms; they are well suited to military and surveillance applications.
Helicopters are usually used for low-altitude applications but are quite expensive to operate; they are limited to lower elevations and slow speeds, with less area coverage. Another class of aircraft that has been in use for many years is the remote-controlled aircraft, or drone, used where it may be too hazardous to fly a crewed aircraft.
c. Space-borne: The most stable platform aloft is a satellite. The first remote sensing satellite was launched in 1960 for meteorological purposes (weather forecasting). Satellites can be classified by their orbital geometry and timing; three orbits commonly used for remote sensing satellites are geostationary, equatorial and sun-synchronous. Many remote sensing satellites revisit the same location once every 16 days. In addition to the sensor systems, satellites often carry devices for recording, preprocessing and transmitting the data.


ADVANTAGES AND DISADVANTAGES OF RS PLATFORMS


Platforms Advantages Disadvantages
Ground • Can be used to identify the reflectance • Collect the reflectance characteristics from
based or characteristics of an individual leaf, a single point, not creating image
hand plant or area.
held • Flexible availability
camera • Useful for real-time spraying
applications
UAV • Flexible availability • Relatively unstable platform can create
• Relatively low cost blurred images
• Very high spatial resolution • Geographic distortion
• Changeable sensors • May require certification to operate
• May be limited in height above ground
• Processing the data into field images may
be prone to error
Aircraft • Relatively flexible availability • High cost
• Relatively high spatial resolution • Availability depends on weather conditions
• Changeable sensors
Satellite • Some free images • High cost for high spatial resolution images
• Clear and stable images • Clouds may hide ground features
• Large area with each image • Fixed schedule
• Good historical data • Data may not be collected at critical times
• May need to sort through many images to
obtain useful information

SENSORS
There are several broad categories of basic sensor systems:
1. Passive: The sensor acquires data using natural energy sources, at wavelengths of less than 1 mm. E.g., film photography, infrared sensors, charge-coupled devices, and radiometers.
2. Active: The sensor provides its own source of illumination to record earth surface features, at wavelengths ranging from 0.4 μm to 10 m. E.g., RADAR and LASER.
3. Imaging versus non-imaging (scanning) sensors: Data from imaging sensors can be processed to produce an image of an area in which smaller parts of the sensor's whole view are resolved visually. Non-imaging sensors are usually hand-held devices that register only a single response value, with no fine spatial resolution. Imaging data provide an opportunity to study spatial relationships and object shapes, and to estimate physical size, based on the data's spatial resolution and sampling.

SENSORS RESOLUTION
The resolution of remotely sensed raster data can be characterized in several different ways. There are four primary types of "resolution": spatial, spectral, radiometric and temporal. It is nearly impossible to acquire imagery that has high spatial, spectral, radiometric and temporal resolution simultaneously. This is known as the resolution trade-off, as it is difficult and expensive to obtain imagery with extremely high resolution in all respects. It is therefore necessary to identify which types of resolution are most important for a project.

1. Spatial Resolution: Spatial resolution is the type of resolution most people are familiar with. In an image with 30 m spatial resolution, a single pixel represents a 30 × 30 m area on the ground. It is determined by the sensor characteristics for digital imagery, and by film characteristics, field of view and altitude for film photography. The higher the spatial resolution of an image, the more expensive it is to capture, process and distribute; conversely, a satellite that acquires daily images generally has a coarser (larger) spatial resolution.
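The pixel-size arithmetic can be sketched as follows (a hypothetical helper, not from any standard library; the 1000 × 1000 scene size is an illustrative example):

```python
# With 30 m spatial resolution, each pixel covers a 30 m x 30 m ground
# patch, so total ground coverage scales with pixel size squared.

def ground_coverage_km2(cols: int, rows: int, pixel_m: float) -> float:
    """Ground area (km^2) covered by a cols x rows image at pixel_m resolution."""
    return (cols * pixel_m) * (rows * pixel_m) / 1e6

# A 1000 x 1000 pixel image at 30 m resolution covers 900 km^2.
print(ground_coverage_km2(1000, 1000, 30.0))
```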


2. Spectral Resolution: This refers to how many spectral "bands" a sensor can record. The Indian Remote Sensing satellite sensor LISS-4 has 4 spectral bands, Landsat-5 has 7 bands, and a black-and-white photograph contains only 1 band covering the visible region. Finer spectral resolution means a narrower wavelength range per band for a specific use; this helps to reveal the mineral content of rocks, soil moisture, vegetation health and other properties.
Panchromatic (black & white) imagery consists of 1 band; an RGB colour image has 3 bands; multispectral imagery has 4 – 6 bands (RGBN); superspectral has 16 or more bands; hyperspectral has hundreds of bands; and ultraspectral has more than 1000 bands recorded by the respective sensors.
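The band-count categories above can be expressed as a small classifier. A sketch (the thresholds follow the notes where given; the intermediate boundaries are assumptions, and exact limits vary between sources):

```python
# Band-count categories: multispectral 4-6 bands, superspectral 16+,
# hyperspectral hundreds, ultraspectral >1000 (per the notes). The
# boundaries between adjacent categories here are assumed for the sketch.

def imagery_type(num_bands: int) -> str:
    if num_bands == 1:
        return "Panchromatic"
    if num_bands == 3:
        return "RGB color"
    if num_bands <= 15:
        return "Multispectral"
    if num_bands < 100:
        return "Superspectral"
    if num_bands <= 1000:
        return "Hyperspectral"
    return "Ultraspectral"

print(imagery_type(4))    # Multispectral
print(imagery_type(224))  # Hyperspectral
```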

3. Radiometric Resolution: Radiometric resolution is how finely a satellite sensor divides up the radiance it receives in each band. The greater the radiometric resolution, the greater the range of radiation intensities the sensor is able to distinguish and record. It is typically expressed as the number of bits per band. Traditionally, 8-bit data were common in remotely sensed data; newer sensors (like Landsat 8) produce 16-bit data products. 8 bits = 2⁸ = 256 levels (usually 0 to 255); 16 bits = 2¹⁶ = 65,536 levels (0 to 65,535).
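The bits-to-levels arithmetic above can be sketched directly:

```python
# Radiometric resolution in bits determines the number of discrete
# intensity levels a sensor can record per band: levels = 2 ** bits.

def intensity_levels(bits: int) -> int:
    return 2 ** bits

print(intensity_levels(8))   # 256 (values 0-255), traditional imagery
print(intensity_levels(16))  # 65536, e.g. Landsat 8 data products
```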

4. Temporal Resolution: Remotely sensed data represent a snapshot in time. Temporal resolution is the time between two subsequent data acquisitions over an area, also known as the "return time" or "revisit time". It depends primarily on the platform: satellites usually have set return times, while sensors mounted on aircraft or unmanned aircraft systems (UAS) have variable return times. For satellites, the return time depends on the orbital characteristics (low vs high orbit), the swath width, and whether or not the sensor can be pointed. Landsat has a return time of approximately 16 days, while sensors like MODIS have a nearly daily return time.


TYPES OF SATELLITES – Indian and other remote sensing satellites


India began to develop the indigenous IRS satellite program to support the national economy in the fields of agriculture, water resources, watersheds, forestry, ecology, geology and coastal management. The IRS system is the largest constellation of civilian remote sensing satellites in operation today, with some 15 – 17 satellites placed in polar sun-synchronous orbits.

1. Indian Regional Navigation Satellite System (IRNSS)-1B: This is part of the first navigation system developed by India for its users; it was launched on 4th April 2014.
Applications
i. Information about timing, disasters, vehicle traffic and flights.
ii. Navigation for hikers, travellers and mariners.
iii. Smartphone-integrated maps, along with audio and video guidance for drivers.

2. MANGALYAAN (Mars Orbiter): Launched on 5th Nov 2013.
Applications: Observation of the Martian terrain, searching for evidence of water and life, and providing information about the atmosphere.

3. INSAT: ISRO has allotted nearly seven Ku-band transponders to Sun Direct, a DTH service provider from South India, and another five to Doordarshan's DD Direct Plus. Twelve transponders in the C band are used for TV, radio and telecommunication purposes. Ku stands for "Kurz-unten" (German for "under the short"), i.e. the band just below the K-band, which starts at about 20 GHz. It is one of the major technologies used today for high-speed satellite internet and for broadcasting satellite television directly to homes.

4. SARAL (a joint venture between India and France): Satellite launched on 25th Feb 2013.
Applications:
i. Altimetric measurements to study water circulation in the oceans.
ii. Measurement of ocean activity and sea surface elevations.

5. RISAT-1 (Radar Imaging Satellite): A radar imaging satellite launched on 26th April 2012.
Applications:
i. To acquire images of the earth's surface.
ii. To enable remote sensing under all weather conditions, since radar can image through clouds, day and night.

6. GSAT: GSAT-3, or EDUSAT, is a communication satellite launched on 20th Sept 2004 by ISRO. EDUSAT was the first Indian satellite built exclusively to serve the educational sector, providing an interactive satellite-based distance education system for the country.
7. SRMSAT: A nanosatellite launched on 12th Oct 2011 to study global warming. It monitors pollution by measuring the absorption spectra of CO2 and water vapour in the atmosphere.


8. MEGHA-TROPIQUES: An Indo-French joint-venture satellite launched on 12th Oct 2011. It is a satellite mission to study the water cycle in the tropical atmosphere in the context of climate change and tropical weather systems.
Applications: Study of the water and energy budget of tropical regions, and of the atmospheric processes that affect tropical weather.

9. OCEANSAT-1: An ocean-monitoring satellite launched on 26th May 1999. It helps to learn more about the oceans, their wealth and coastal ecosystems. The satellite carries two main sensors, the Ocean Colour Monitor (OCM) and the Multi-frequency Scanning Microwave Radiometer (MSMR). Potential scientific investigations include ocean primary productivity, fish stock assessment, sediment estimation, optical properties of water, regional and global climate and environmental change, and bio-geochemical cycles.

10. CARTOSAT-1: The satellite was launched on 5th May 2005 and has provided data since 8th May 2005. It orbits 14 times per day and carries two PAN (panchromatic) sensors. The imagery has a spatial resolution of 2.5 m and covers a swath of 30 km from an orbital height of 618 km. The satellite weighed 1,560 kg at launch.
Applications: Urban management, mineral management and disaster management; generation of stereo imagery.
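The swath and resolution figures quoted above fix the scene geometry; a quick back-of-the-envelope check in Python (using the 30 km swath and 2.5 m resolution from the text):

```python
# Scene geometry implied by a 30 km swath at 2.5 m ground resolution.
swath_m = 30_000   # swath width in metres
pixel_m = 2.5      # ground sample distance in metres

pixels_per_line = swath_m / pixel_m   # pixels across the swath
pixel_area_m2 = pixel_m ** 2          # ground area covered by one pixel

print(f"Pixels per scan line: {pixels_per_line:,.0f}")   # → 12,000
print(f"Ground area per pixel: {pixel_area_m2} m^2")     # → 6.25 m^2
```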

11. CARTOSAT-2: Launched on 23rd June 2017 with 30 co-passenger satellites from Sriharikota. The swath covered is 9.6 km and the spatial resolution is better than 1 m. Weighing around 680 kg at launch, its applications are oriented towards cartography in India. Currently, India buys images worth about Rs. 2 crore a year from Ikonos, which offers 80 cm spatial resolution. Cartosat-2 can produce images of up to 80 cm resolution, so purchases of Ikonos imagery are likely to decline in future.

12. IRS Series: The principal aim is the survey and management of agriculture, geology, forest, hydrology and other natural resources. The IRS series includes IRS-1A, IRS-1B, IRS-1C, IRS-1D and IRS-P4, apart from other satellites launched by the Govt. of India. IRS-1A to IRS-1D are mainly focused on the land surface, whereas IRS-P4 is an oceanographic satellite. IRS-1A, the first Indian remote sensing satellite, was placed in a polar sun-synchronous orbit by a Vostok rocket launched from the USSR on 17th March 1988. Its orbital period is 103 minutes and its inclination angle is 99°. The sensors carried were the LISS-I and LISS-II (Linear Imaging Self-Scanning) sensors. The spatial resolution of LISS-I is 72.5 m, LISS-II is 36.25 m, LISS-III is 23.5 m, and LISS-IV is 5.8 m.
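From the 103-minute orbital period quoted above, the number of orbits completed per day follows directly (and is consistent with the roughly 14 orbits per day noted for Cartosat-1):

```python
# Orbits completed per day for a 103-minute orbital period.
period_min = 103
orbits_per_day = 24 * 60 / period_min
print(f"Orbits per day: {orbits_per_day:.1f}")  # → 14.0
```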

13. CHANDRAYAAN-1: India's first unmanned lunar probe, launched by ISRO in Oct 2008 and operated until Aug 2009. The mission was a major boost to India's space programme, as India researched and developed its own technology to explore the Moon, and it made India the fourth country to place its flag on the Moon. The estimated cost of the project was 3.86 billion Indian rupees. The satellite had a mass of 1,380 kg at launch and 675 kg in lunar orbit. Over a two-year period, it was intended to produce a complete map of the Moon's chemical characteristics and 3-D topography, and to detect the presence of lunar water ice.


PRINCIPLES/ELEMENTS OF VISUAL INTERPRETATION TECHNIQUES


A systematic study of aerial photographs and satellite imagery usually involves several characteristics of the features shown on an image, and these depend on the field of application. Most applications consider the following basic characteristics, or variations in them, which aid the visual interpretation of satellite imagery.
1. Tone:- Tone refers to the relative brightness or colour of objects in an image. Each sensor records a specific colour or range of colours that reveals particular features. Variations or shades in tone/colour aid interpretation: tone is the fundamental element for distinguishing between similar features.
2. Shape:- Shape refers to the structure or outline boundary of individual objects and can be a very distinctive clue for interpretation. Distinct shapes appear due to human activities, such as roads, railway tracks and power lines (linear features), and urban or agricultural land (rectangular features). Forest boundaries are more irregular in shape, except where roads or clear-cuts have been made. Coconut plantations show regular shapes, with shadows due to tree height.
3. Size:- The size of objects in an image is a function of scale. Recognition of familiar objects allows the size of other features to be estimated; size is an important aspect of association. It is important to assess the size of a target relative to other objects in a scene: a quick approximation of target size can direct the interpretation to an appropriate result more quickly. An area with a number of large buildings such as factories or warehouses would suggest commercial property, whereas small buildings would indicate residential use. Size and shape information is greatly influenced by image resolution.
4. Pattern:- Pattern refers to the spatial arrangement of visibly discernible objects. Typically, an orderly repetition of similar objects forms a recognizable pattern. Urban streets with regularly spaced houses and agricultural fields with rectangular shapes are good examples of pattern.
5. Texture:- Texture describes the varying degrees of 'smoothness' or 'roughness' across areas of an image. Water appears smooth, while a forest canopy gives a rough-textured appearance. Texture is one of the most important elements for distinguishing features in radar imagery.
6. Shadow:- Shadow effects change throughout the day and throughout the year. Shadows help in interpreting the relative heights and sizes of objects, and are also useful for enhancing topography and landforms, particularly in radar imagery.
7. Association:- Association identifies the relationship between two or more recognizable objects. Commercial properties may be associated with major transportation routes, whereas residential areas would be associated with schools, hospitals, parks and sports fields. Agricultural and irrigation activities will be associated with nearby perennial rivers and streams.
8. Site:- Site refers to the setting of an object: its topography, soil, vegetation and cultural features, and its physical position (topographic or geographic), e.g. hill slopes, ocean, land or mountains. An extensive transportation network is also a good clue for identifying a port.


[Figure: aerial image with six annotated features]

Example interpretation keys:
1. Race track: its characteristic shape
2. River: contrasting tone and shape
3. Roads: bright tone and linear features
4. Bridges: association with the river; they cross it!
5. Residential area: the pattern they make in conjunction with roads
6. Dam: bright tone against the dark river; shape and association with the river (where else would a dam be?!)
