unit_1_gis
Introduction
Remote sensing is defined as the science and art of obtaining information about an
object, area, or phenomenon through the analysis of data acquired by sensors
without physical contact.
Its applications span a wide range of fields. For example:
1. Environmental Monitoring:
o Detecting deforestation and desertification.
o Monitoring air and water pollution.
2. Agriculture:
o Precision farming by analyzing soil moisture, crop health, and nutrient
status.
3. Disaster Management:
o Identifying affected areas during floods, cyclones, or wildfires.
o Assessing damage and planning recovery operations.
4. Urban Planning:
o Mapping urban sprawl and infrastructure development.
o Assessing land-use patterns.
5. Climate Studies:
o Monitoring global warming by tracking sea-level rise and ice sheet
changes.
6. Military and Defense:
o Conducting reconnaissance and surveillance.
The remote sensing process involves a series of systematic stages that facilitate
data collection, transmission, and interpretation. These stages ensure the accurate
acquisition and analysis of information from Earth's surface.
A. Emission of electromagnetic radiation
B. Transmission of energy through the atmosphere
C. Interaction of EMR with the object and subsequent reflection and emission
Energy is essential for remote sensing to detect, record, and analyze target
characteristics. The type of energy used determines whether the remote sensing
system is passive or active.
Scattering:
o Rayleigh Scattering: By small particles (e.g., molecules); dominant in
the blue sky phenomenon.
o Mie Scattering: By larger particles like dust or smoke.
o Non-selective Scattering: By very large particles like water droplets
(causing white clouds).
Absorption:
o Gases such as ozone, water vapor, and carbon dioxide absorb energy in
specific wavelength bands; the remaining spectral regions, known as
atmospheric "windows," allow energy to pass through and are where
sensors are designed to operate.
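The "blue sky" effect of Rayleigh scattering follows from its 1/wavelength^4 dependence; a quick illustrative calculation (a sketch, valid when particles are much smaller than the wavelength):

```python
# Rayleigh scattering intensity varies as 1/wavelength^4, so shorter (blue)
# wavelengths scatter far more strongly than longer (red) ones -- the reason
# the sky appears blue. Wavelengths below are typical values, not sensor bands.
blue_nm = 450
red_nm = 700

scatter_ratio = (red_nm / blue_nm) ** 4
print(round(scatter_ratio, 1))  # blue scatters ~5.9x more than red
```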
Types of Interactions
1. Reflection:
o Specular Reflection: Energy reflects in a single direction (e.g., calm
water or a smooth surface).
o Diffuse Reflection: Energy scatters in multiple directions (e.g., rough
surfaces like soil or vegetation).
o The amount of reflected energy depends on:
Material properties of the target (e.g., vegetation, water,
buildings).
Wavelength of the incoming radiation.
Angle of incidence.
2. Absorption:
o Some materials absorb specific wavelengths of EMR, converting it
into heat or storing it internally.
o For example:
Vegetation absorbs red and blue light for photosynthesis.
Asphalt absorbs most visible and infrared light, heating up
rapidly.
3. Transmission:
o Certain materials allow EMR to pass through (e.g., glass or water).
o The degree of transmission varies based on material thickness and
composition.
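The three interactions above partition the incident energy, so at any wavelength reflectance, absorptance, and transmittance sum to one. A tiny sketch with illustrative (not measured) leaf values:

```python
# Conservation of incident EMR at a surface: at each wavelength,
# reflectance + absorptance + transmittance = 1.
def transmittance(reflectance, absorptance):
    """Fraction of incident EMR transmitted through the material."""
    return 1.0 - reflectance - absorptance

# Healthy leaf in red light: strong absorption for photosynthesis
t = transmittance(reflectance=0.08, absorptance=0.85)
print(round(t, 2))  # 0.07
```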
Spectral Signatures
Each material reflects and emits energy in a characteristic pattern across
wavelengths. This pattern, its spectral signature, is what allows vegetation,
water, soil, and built surfaces to be told apart in remotely sensed data.
After interacting with the target, the reflected or emitted energy is captured by
sensors mounted on remote sensing platforms.
Types of Sensors
1. Photographic Sensors:
o Capture visible light using traditional film or digital cameras.
o Often used for aerial photography.
2. Non-Photographic Sensors:
o Capture energy outside the visible spectrum, such as infrared,
microwave, or ultraviolet.
o Examples:
Multispectral Sensors: Record data in multiple wavelengths
simultaneously.
Thermal Sensors: Detect heat emissions (used for temperature
mapping).
Radar Sensors: Use microwaves to capture surface features,
even through clouds.
Sensor Placement
Recording Mechanism
Output Data
Examples
1. Satellite Sensors:
o Landsat satellites use multispectral sensors to monitor vegetation,
water, and urban growth.
o MODIS captures data for climate studies, including ocean color and
atmospheric properties.
2. Thermal Imaging:
o Sensors detect emitted infrared energy to map surface temperatures,
aiding in volcanic monitoring or urban heat island studies.
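Inverting emitted energy to temperature, as thermal sensors do, can be sketched with the Stefan-Boltzmann law (a blackbody assumption; real retrievals also correct for emissivity and atmospheric effects):

```python
# Stefan-Boltzmann law: a blackbody at temperature T emits M = sigma * T^4
# (W/m^2), so a measured thermal exitance can be inverted to an apparent
# surface temperature.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def brightness_temperature(exitance):
    """Apparent blackbody temperature (K) for a given exitance in W/m^2."""
    return (exitance / SIGMA) ** 0.25

t = brightness_temperature(459.3)  # roughly 300 K, a warm land surface
```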
Transmission and Reception
1. Transmission:
o Data captured by the sensor is sent to ground stations via
communication links.
o In the case of spaceborne sensors, satellite signals are transmitted to
Earth using radio waves.
o For aerial or UAV platforms, data might be stored onboard and
downloaded later.
2. Reception:
o Ground stations receive the transmitted data.
o Signal strength, noise, and atmospheric interference can impact the
quality of the received data.
Processing
Raw data from sensors is not directly usable; it must be processed into a readable
and analyzable format. This involves several steps:
1. Pre-Processing:
o Radiometric Corrections: Adjusts for variations in sensor sensitivity
and atmospheric conditions to ensure accurate energy readings.
o Geometric Corrections: Aligns the captured data with real-world
coordinates to correct distortions caused by the sensor's position or
Earth's curvature.
o Cloud Removal: Filters out cloud-covered areas, especially in optical
imagery.
2. Data Conversion:
o Converts raw digital signals into image formats or numerical datasets
(e.g., raster files).
o Spectral data is organized into bands for multispectral or
hyperspectral analysis.
3. Enhancement:
o Enhances data for better visual interpretation, such as adjusting
contrast or applying false-color composites.
o Example: Displaying vegetation in infrared instead of visible light.
4. Data Integration:
o Combines data from multiple sensors or sources for comprehensive
analysis.
o Example: Combining optical and thermal imagery for agricultural
monitoring.
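The radiometric-correction and enhancement steps above can be sketched as follows (a minimal illustration; the gain/offset coefficients and pixel values are made up, real ones come from the sensor's metadata):

```python
import numpy as np

# Hypothetical calibration coefficients for this sketch
GAIN, OFFSET = 0.05, 1.2

def dn_to_radiance(dn):
    """Radiometric correction: convert raw digital numbers (DN) to radiance."""
    return GAIN * dn.astype(float) + OFFSET

def linear_stretch(band, low=2, high=98):
    """Contrast enhancement: map the 2nd-98th percentile range onto 0-255."""
    lo, hi = np.percentile(band, [low, high])
    scaled = (band - lo) / (hi - lo)
    return np.clip(scaled * 255, 0, 255).astype(np.uint8)

dn = np.array([[10, 50], [120, 240]], dtype=np.uint8)  # tiny 2x2 "image"
radiance = dn_to_radiance(dn)
display = linear_stretch(radiance)
```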
Interpretation
1. Visual Interpretation:
o An analyst examines images to identify patterns, features, and
changes.
o This method is suitable for qualitative analysis, such as recognizing
vegetation cover or water bodies.
2. Automated Interpretation:
o Algorithms and software classify data based on spectral signatures
and spatial patterns.
o Techniques include:
Supervised Classification: Training the algorithm on known
data points.
Unsupervised Classification: Allowing the algorithm to group
data into clusters based on similarities.
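A minimal sketch of the supervised case, using a minimum-distance-to-mean rule (the [red, NIR] training spectra below are illustrative, not taken from real imagery):

```python
import numpy as np

# Hypothetical training pixels: [red, NIR] reflectance for two known classes
training = {
    "water":      np.array([[0.05, 0.03], [0.06, 0.02]]),
    "vegetation": np.array([[0.04, 0.50], [0.05, 0.45]]),
}

# Each class is summarized by the mean spectrum of its training pixels.
centroids = {cls: px.mean(axis=0) for cls, px in training.items()}

def classify(pixel):
    """Assign a pixel to the class whose mean spectrum is nearest."""
    return min(centroids, key=lambda c: np.linalg.norm(pixel - centroids[c]))

print(classify(np.array([0.05, 0.48])))  # vegetation-like spectrum
```

Unsupervised classification would instead cluster the pixels without labeled training data, for example with k-means.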
Analysis
1. Spatial Analysis:
o Examines the location, extent, and spatial relationships of features.
o Example: Mapping how far urban growth extends from existing road
networks.
2. Temporal Analysis:
o Studies changes over time using time-series data.
o Example: Monitoring deforestation rates annually.
3. Spectral Analysis:
o Uses the reflectance or emission characteristics of objects to identify
their properties.
o Example: Differentiating healthy vegetation from stressed plants
using near-infrared bands.
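The near-infrared example above is the basis of NDVI (Normalized Difference Vegetation Index); a small sketch with illustrative reflectance values:

```python
import numpy as np

# NDVI exploits the spectral signature of vegetation: high NIR reflectance,
# low red reflectance. Values below are illustrative pixels, not real imagery.
red = np.array([0.08, 0.10, 0.30])
nir = np.array([0.50, 0.45, 0.32])

# NDVI ranges from -1 to 1; dense, healthy vegetation scores high
ndvi = (nir - red) / (nir + red)
```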
Introduction
Remote sensing is classified based on the source of energy and the type of sensors
used. Understanding the characteristics of the images captured by sensors is crucial
for interpreting and analyzing data effectively.
1. Based on Energy Source
o Passive systems record naturally available energy, such as reflected
sunlight or emitted heat.
o Active systems illuminate the target with their own energy and record
the return signal (e.g., radar).
2. Based on Platform
o Sensors are carried on ground-based, airborne (aircraft and UAV), or
spaceborne (satellite) platforms.
Characteristics of Images
1. Spatial Resolution
2. Spectral Resolution
3. Temporal Resolution
4. Radiometric Resolution
Represents the sensitivity of a sensor to detect variations in energy
intensity.
Higher radiometric resolution captures more subtle differences (e.g., 12-bit
data distinguishes 4,096 intensity levels).
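The bit-depth arithmetic is simply powers of two:

```python
# An n-bit sensor can record 2**n distinct intensity levels per pixel.
def intensity_levels(bits):
    return 2 ** bits

for bits in (8, 10, 12):
    print(f"{bits}-bit: {intensity_levels(bits)} levels")  # 256, 1024, 4096
```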
Remote sensing satellites are platforms equipped with sensors to collect data about
the Earth's surface from space. They orbit the Earth in specific patterns, enabling
consistent and large-scale observations.
1. Geostationary Satellites:
o Orbit at ~36,000 km above the equator and remain stationary relative
to a point on Earth.
o Provide continuous monitoring of a specific area.
o Commonly used for weather forecasting and communication (e.g.,
INSAT, GOES).
2. Polar-Orbiting Satellites:
o Orbit at lower altitudes (~700–800 km) and pass over the poles.
o Cover the entire Earth as the planet rotates beneath them.
o Examples: Landsat, NOAA satellites.
o Applications: Environmental monitoring, agriculture, and disaster
assessment.
3. Sun-Synchronous Satellites:
o Maintain a consistent angle with the Sun, ensuring uniform lighting
conditions for images.
o Ideal for comparing data over time (e.g., Sentinel-2, MODIS).
Examples of Remote Sensing Satellites
Landsat: Multispectral and thermal imaging for land cover and vegetation
analysis.
Sentinel-2: High-resolution optical imaging for agricultural and forestry
studies.
RADARSAT: Active radar systems for topography and flood mapping.
MODIS (on Terra/Aqua): Moderate resolution imaging for global
environmental monitoring.
Sensor Resolutions
Sensor resolution determines the quality and applicability of remote sensing data.
It is categorized into four types:
1. Spatial Resolution
Refers to the size of the smallest object that a sensor can detect.
High Spatial Resolution: Small pixels (e.g., 1–5 m), suitable for detailed
mapping (e.g., urban areas).
o Examples: IKONOS, WorldView.
Low Spatial Resolution: Large pixels (e.g., 250–1,000 m), ideal for large-
scale phenomena like weather patterns.
o Examples: MODIS, AVHRR.
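The practical consequence of pixel size can be seen by counting pixels per unit area, which illustrates the trade-off between detail and data volume:

```python
# Pixels needed to cover one square kilometre at a given pixel size (metres)
def pixels_per_km2(pixel_size_m):
    return (1000 / pixel_size_m) ** 2

high = pixels_per_km2(5)    # high resolution: 40,000 pixels per km^2
low = pixels_per_km2(250)   # low resolution: 16 pixels per km^2
```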
2. Spectral Resolution
Refers to the number and width of wavelength bands a sensor records;
hyperspectral sensors, with hundreds of narrow bands, have finer spectral
resolution than multispectral sensors.
3. Temporal Resolution
Refers to how often a sensor revisits the same area, which governs its
usefulness for monitoring change over time.
4. Radiometric Resolution
Refers to the number of intensity levels the sensor can distinguish, as
described above (e.g., 4,096 levels for 12-bit data).
Advantages of UAVs
3. Cost-Effective:
o UAVs are generally more affordable than aircraft or satellites, making
them suitable for small- to medium-scale projects.
o They do not require complex infrastructure and can be operated by a
single individual, reducing overall operational costs.
Types of UAV Sensors
1. RGB Cameras:
o Standard cameras that capture high-resolution visible light images.
o Commonly used for photogrammetry and creating orthomosaic
maps, useful in land use/land cover mapping and 3D modeling.
2. Multispectral Sensors:
o Capture data in multiple spectral bands, including near-infrared (NIR)
and red-edge bands.
o Vital for precision agriculture, where they help monitor plant
health, vegetation stress, and soil properties.
3. Thermal Cameras:
o Capture infrared radiation emitted by objects, allowing for
temperature mapping.
o Useful in applications like fire monitoring, energy audits, and
ecological studies.
4. LiDAR Sensors:
o Emit laser pulses and measure their return time to produce precise
elevation and structure data.
o Commonly used for terrain modeling and forest canopy mapping.
5. Hyperspectral Sensors:
o Capture hundreds of narrow spectral bands across the
electromagnetic spectrum.
o Used for more detailed material identification and classification, such
as mineral exploration, vegetation type mapping, and environmental
monitoring.
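For UAV photogrammetry with cameras like those above, the ground footprint of one image pixel (the ground sample distance, GSD) follows from simple camera geometry. The camera parameters below are hypothetical examples:

```python
# GSD (cm/pixel) = sensor width x flying height / (focal length x image width)
def gsd_cm(sensor_width_mm, image_width_px, focal_length_mm, altitude_m):
    """Ground footprint of one pixel, in centimetres."""
    return (sensor_width_mm * altitude_m * 100) / (focal_length_mm * image_width_px)

# e.g., a 13.2 mm-wide sensor, 5472 px-wide images, 8.8 mm lens, flown at 100 m
gsd = gsd_cm(13.2, 5472, 8.8, 100)  # about 2.7 cm per pixel
```

Flying lower or using a longer focal length shrinks the GSD, trading coverage for detail.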
Applications of UAVs
1. Agriculture:
o Precision Agriculture: UAVs equipped with multispectral sensors help
monitor crop health, detect nutrient deficiencies, assess irrigation
needs, and optimize pesticide application.
o Field Mapping: Regular monitoring of crop progress, including early
detection of diseases or pests.
2. Forestry:
o Tree Inventory and Health Assessment: UAVs can generate 3D maps
of forests, helping in biomass estimation, tree species identification,
and monitoring forest health.
o Deforestation Monitoring: UAVs can track changes in forest cover
over time, providing real-time data for conservation efforts.
3. Disaster Management:
o Search and Rescue: UAVs can quickly assess disaster sites, such as
flood or earthquake zones, providing real-time imagery for rescue
operations.
o Damage Assessment: UAVs capture high-resolution imagery post-
disaster, helping officials assess the extent of damage and prioritize
recovery efforts.
5. Environmental Monitoring:
o Habitat Mapping: UAVs can map ecosystems in fine detail, identifying
habitats of endangered species and assessing biodiversity.
o Pollution Monitoring: UAVs can detect air or water pollutants,
providing data for environmental protection and remediation efforts.
6. Urban Planning:
o Urban Growth Monitoring: UAVs can help urban planners track land
development, zoning changes, and urban sprawl over time.
o 3D Modeling: UAVs create detailed 3D models of urban landscapes,
which can be used for infrastructure planning, traffic management,
and city development.