Short Notes

Geometric Correction

Geometric correction is undertaken to remove geometric distortions from a distorted image. It is achieved by establishing the relationship between the image coordinate system and the geographic coordinate system using sensor calibration data, measured position and attitude data, ground control points, atmospheric conditions, etc.

The steps of geometric correction are as follows:

(1) Selection of method:

After consideration of the characteristics of the geometric distortion as well as the available reference data, a proper method should be selected.

(2) Determination of parameters:

Unknown parameters which define the mathematical equation between the image coordinate system and the geographic coordinate system should be determined with calibration data and/or ground control points.

(3) Accuracy check:

The accuracy of the geometric correction should be checked and verified. If the accuracy does not meet the criteria, the method or the data used should be checked and corrected in order to remove the errors.

(4) Interpolation and resampling:

A geo-coded image should be produced by resampling and interpolation.

There are three methods of geometric correction, as described below.

a. Systematic correction:

When the geometric reference data or the geometry of the sensor are given or measured, the geometric distortion can be removed theoretically or systematically. For example, the geometry of a lens camera is given by the collinearity equation with the calibrated focal length, lens distortion parameters, coordinates of fiducial marks, etc. The tangent correction for an optical mechanical scanner is a type of systematic correction. Generally, systematic correction is sufficient to remove all errors.

b. Non-systematic correction:

Polynomials to transform from the geographic coordinate system to the image coordinate system, or vice versa, are determined with given coordinates of ground control points using the least-squares method (see the sketch after these three methods). The accuracy depends on the order of the polynomials and on the number and distribution of the ground control points.

c. Combined method:

Firstly the systematic correction is applied, then the residual errors are reduced using lower-order polynomials. Usually the goal of geometric correction is an error within plus or minus one pixel of the true position.
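
As an illustration of steps (2) and (4) and of the non-systematic method, here is a minimal Python/NumPy sketch: a first-order (affine) polynomial between map and image coordinates is fitted by least squares from hypothetical ground control points, the RMS residual is checked, and a geo-coded image is produced by nearest-neighbour resampling. All coordinate values and the pixel size are invented for the example.

    import numpy as np

    # Hypothetical ground control points: map coordinates (x, y) and the
    # image coordinates (col, row) where each point appears in the raw image.
    map_xy = np.array([[0., 0.], [100., 0.], [0., 100.], [100., 100.], [50., 50.]])
    img_cr = np.array([[10., 12.], [95., 18.], [5., 98.], [90., 104.], [50., 58.]])

    # First-order polynomial (affine): col = a0 + a1*x + a2*y, row = b0 + b1*x + b2*y.
    # Least squares needs at least three well-distributed GCPs for the six unknowns.
    A = np.column_stack([np.ones(len(map_xy)), map_xy[:, 0], map_xy[:, 1]])
    coef_col, *_ = np.linalg.lstsq(A, img_cr[:, 0], rcond=None)
    coef_row, *_ = np.linalg.lstsq(A, img_cr[:, 1], rcond=None)

    # Accuracy check: RMS residual at the GCPs, in pixels (aim: within +/- 1 pixel).
    pred = np.column_stack([A @ coef_col, A @ coef_row])
    rmse = np.sqrt(np.mean(np.sum((pred - img_cr) ** 2, axis=1)))
    print("GCP RMSE [pixels]:", rmse)

    def resample_nearest(raw, out_shape, pixel_size=1.0):
        """Produce a geo-coded image: map each output (map-grid) pixel through
        the fitted polynomial and copy its nearest source pixel."""
        rows, cols = np.indices(out_shape)
        x, y = cols * pixel_size, rows * pixel_size   # map coords of output grid
        src_c = np.rint(coef_col[0] + coef_col[1] * x + coef_col[2] * y).astype(int)
        src_r = np.rint(coef_row[0] + coef_row[1] * x + coef_row[2] * y).astype(int)
        ok = (0 <= src_r) & (src_r < raw.shape[0]) & (0 <= src_c) & (src_c < raw.shape[1])
        out = np.zeros(out_shape, dtype=raw.dtype)
        out[ok] = raw[src_r[ok], src_c[ok]]
        return out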

Radiometric Correction
As any image involves radiometric errors as well as geometric errors, both should be corrected. Radiometric correction removes radiometric errors or distortions, while geometric correction removes geometric distortion.

When the emitted or reflected electro-magnetic energy is observed by a sensor on board an aircraft or spacecraft, the observed energy does not coincide with the energy emitted or reflected from the same object observed from a short distance. This is because the sun's azimuth and elevation, atmospheric conditions such as fog or aerosols, the sensor's response, etc. influence the observed energy. Therefore, in order to obtain the real irradiance or reflectance, those radiometric distortions must be corrected.

Radiometric correction is classified into the following three types:

(1) Radiometric correction of effects due to sensor sensitivity

In the case of optical sensors with a lens, the fringe area in the corners will be darker as compared with the central area. This is called vignetting. Vignetting can be expressed by cosⁿθ, where θ is the angle of a ray with respect to the optical axis and n depends on the lens characteristics, though n is usually taken as 4. In the case of electro-optical sensors, measured calibration data between irradiance and the sensor output signal can be used for radiometric correction.
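
A minimal sketch of the cosⁿθ correction, assuming n = 4 as in the text, a hypothetical focal length expressed in pixels, and an optical axis passing through the image centre:

    import numpy as np

    def devignette(image, focal_px, n=4):
        """Divide out the cos**n(theta) fall-off. theta is each pixel's angle
        from the optical axis, assumed to pass through the image centre;
        focal_px is a hypothetical calibrated focal length in pixels."""
        rows, cols = np.indices(image.shape)
        dy = rows - (image.shape[0] - 1) / 2.0
        dx = cols - (image.shape[1] - 1) / 2.0
        theta = np.arctan(np.hypot(dx, dy) / focal_px)  # off-axis angle per pixel
        return image / np.cos(theta) ** n               # brighten the darker fringe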

(2) Radiometric correction for sun angle and topography

➢ Sun spot: The solar radiation is reflected diffusely from the
ground surface, which results in lighter areas in an image,
called a sun spot. The sun spot, together with vignetting
effects, can be corrected by estimating a shading curve, which
is determined by Fourier analysis to extract a low-frequency
component.

➢ Shading: The shading effect due to topographic relief can be
corrected using the angle between the solar radiation direction
and the normal vector to the ground surface.
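
Neither bullet prescribes an exact algorithm; below is a minimal Python/NumPy sketch of both ideas under stated assumptions. The shading curve (sun spot plus vignetting) is estimated as the low-frequency Fourier component of the image, and the topographic effect is removed with a simple cosine correction using the angle between the sun direction and the terrain normal (slope and aspect would come from a DEM; all angles in radians). The `keep` fraction and the shade floor of 0.1 are invented safeguards.

    import numpy as np

    def remove_shading(image, keep=0.01):
        """Estimate the shading pattern (sun spot + vignetting) as the low
        frequency component of a 2-D Fourier transform, then divide it out."""
        F = np.fft.fft2(image.astype(float))
        ry = int(image.shape[0] * keep) + 1
        rx = int(image.shape[1] * keep) + 1
        mask = np.zeros(F.shape)
        mask[:ry, :rx] = mask[:ry, -rx:] = mask[-ry:, :rx] = mask[-ry:, -rx:] = 1.0
        shading = np.real(np.fft.ifft2(F * mask))       # smooth brightness trend
        return image / np.maximum(shading / shading.mean(), 1e-3)

    def cosine_topographic_correction(band, slope, aspect, sun_zenith, sun_azimuth):
        """Scale each pixel by cos(sun_zenith)/cos(i), where i is the local
        solar incidence angle between sun direction and terrain normal."""
        cos_i = (np.cos(sun_zenith) * np.cos(slope) +
                 np.sin(sun_zenith) * np.sin(slope) * np.cos(sun_azimuth - aspect))
        cos_i = np.clip(cos_i, 0.1, 1.0)   # guard against blow-up in deep shade
        return band * np.cos(sun_zenith) / cos_i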

(3) Atmospheric correction: Various atmospheric effects cause absorption and scattering of the solar radiation. Reflected or emitted radiation from an object and path radiance (atmospheric scattering) should be corrected for.
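
One simple, widely used first-order technique for removing path radiance is dark-object subtraction; the sketch below illustrates the idea and is not the method prescribed by the text. The percentile parameter is an assumption standing in for a true dark object.

    import numpy as np

    def dark_object_subtraction(band, percentile=0.01):
        """Remove a first-order estimate of path radiance: assume the darkest
        pixels (deep water, shadow) should be near zero and subtract their
        value from the whole band."""
        dark = np.percentile(band, percentile)
        return np.clip(band - dark, 0, None)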

Image Classification

Unsupervised classification is where the outcomes (groupings of pixels with common characteristics) are based on software analysis of an image without the user providing sample classes. The computer uses clustering techniques to determine which pixels are related and groups them into classes. The user can specify which algorithm the software will use and the desired number of output classes, but otherwise does not aid in the classification process. However, the user must have knowledge of the area being classified when the groupings of pixels with common characteristics produced by the computer have to be related to actual features on the ground (such as wetlands, developed areas, coniferous forests, etc.).
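
As a sketch of the idea, the snippet below clusters pixels with k-means, one common choice (ISODATA is also typical in remote sensing packages), with the user supplying only the number of classes. It assumes scikit-learn is available; the image shape is hypothetical.

    import numpy as np
    from sklearn.cluster import KMeans

    def unsupervised_classify(image, n_classes=5):
        """Cluster pixels by spectral similarity alone; the user supplies only
        the desired number of classes. image has shape (rows, cols, bands)."""
        pixels = image.reshape(-1, image.shape[-1]).astype(float)
        labels = KMeans(n_clusters=n_classes, n_init=10, random_state=0).fit_predict(pixels)
        # One class id per pixel; the analyst must still relate the ids
        # to actual ground features (wetlands, forest, etc.).
        return labels.reshape(image.shape[:2])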

Supervised classification is based on the idea that a user can select sample
pixels in an image that are representative of specific classes and then direct the image
processing software to use these training sites as references for the classification of
all other pixels in the image. Training sites (also known as testing sets or input
classes) are selected based on the knowledge of the user. The user also sets the
bounds for how similar other pixels must be to group them together. These bounds
are often set based on the spectral characteristics of the training area, plus or minus
a certain increment (often based on "brightness" or strength of reflection in specific
spectral bands). The user also designates the number of classes that the image is
classified into. Many analysts use a combination of supervised and unsupervised
classification processes to develop final output analysis and classified maps.
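
The "bounds set from the training-area statistics plus or minus an increment" described above corresponds to a parallelepiped classifier; a minimal sketch follows, with the training-site dictionary and the box-width factor k as assumed inputs.

    import numpy as np

    def parallelepiped_classify(image, training_sites, k=2.0):
        """For each class, bound every band by (mean - k*std, mean + k*std) of
        its training pixels; a pixel joins the first class whose box contains
        it in all bands, otherwise it stays unclassified (0). training_sites
        maps a nonzero class id to an array of shape (n_samples, bands)."""
        rows, cols, bands = image.shape
        result = np.zeros((rows, cols), dtype=int)
        for class_id, samples in training_sites.items():
            lo = samples.mean(axis=0) - k * samples.std(axis=0)
            hi = samples.mean(axis=0) + k * samples.std(axis=0)
            inside = np.all((image >= lo) & (image <= hi), axis=-1)
            result[(result == 0) & inside] = class_id
        return result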

LANDSAT
The LANDSAT system of remote sensing satellites is currently operated by the
EROS Data Center of the United States Geological Survey. This is a new
arrangement following a period of commercial distribution under the Earth
Observation Satellite Company (EOSAT) which was recently acquired by Space
Imaging Corporation. As a result, the cost of imagery has dramatically dropped, to
the benefit of all. Full or quarter scenes are available on a variety of distribution
media, as well as photographic products of MSS and TM scenes in false color and
black and white. There have been seven LANDSAT satellites, the first of which was
launched in 1972. The LANDSAT 6 satellite was lost on launch; however, as of this
writing, LANDSAT 5 is still operational. LANDSAT 7 was launched in April 1999.
LANDSAT carries two multispectral sensors. The first is the Multi-Spectral Scanner
(MSS), which acquires imagery in four spectral bands: green, red and two near-infrared
bands. The second is the Thematic Mapper (TM), which collects seven bands:
blue, green, red, near-infrared, two mid-infrared and one thermal infrared. The MSS
has a spatial resolution of 80 meters, while that of the TM is 30 meters. Both sensors
image a 185 km wide swath, passing over at 09:45 local time and returning
every 16 days. With LANDSAT 7, support for TM imagery continues, with
the addition of a co-registered 15 m panchromatic band.

IRS
The Indian Space Research Organization currently has 5 satellites in the IRS system,
with at least 7 planned by 2004. These data are distributed by ANTRIX Corp. Ltd
(the commercial arm of the Indian Space Research Organization), and also by Space
Imaging Corporation in the United States. The most sophisticated capabilities are
offered by the IRS-1C and IRS-1D satellites that together provide continuing global
coverage with the following sensors: IRS-Pan: 5.8 m panchromatic.
Fiducial Marks: Index marks, rigidly connected to the camera body at the centres or corners of its edges. When the film is exposed, these marks appear on the film negative.

Principal Point: The foot of the perpendicular drawn from the camera lens
centre to the photo plane.

Principal Distance: The perpendicular distance from the perspective centre to the plane of the photograph.

Digital Image Processing: The numerical manipulation of DN values for the purpose of extracting information about the phenomena of the surface they represent.

Band: The specific wavelength interval in the electromagnetic spectrum.

Digital Image: An array of digital numbers (DN) arranged in rows and columns, each having an intensity value and a location.

False Colour Composite (FCC): An artificially generated colour image in which blue, green and red colours are assigned to wavelength regions to which they do not belong in nature. For example, in a standard False Colour Composite, blue is assigned to green radiation (0.5 to 0.6 μm), green is assigned to red radiation (0.6 to 0.7 μm) and red is assigned to Near Infrared radiation (0.7 to 0.8 μm).
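
A minimal sketch of building a standard FCC from three hypothetical band arrays: the NIR band is displayed in the red channel, the red band in the green channel, and the green band in the blue channel.

    import numpy as np

    def standard_fcc(green, red, nir):
        """Standard false colour composite: NIR -> red channel, red band ->
        green channel, green band -> blue channel, each min-max stretched."""
        def stretch(band):
            b = band.astype(float)
            return (b - b.min()) / (b.max() - b.min() + 1e-9)
        return np.dstack([stretch(nir), stretch(red), stretch(green)])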

Spectral Band: A range of wavelengths in the continuous spectrum; for example, the green band ranges from 0.5 to 0.6 μm and the NIR band from 0.7 to 1.1 μm.

Spatial Resolution: You must have seen some people using
spectacles while reading a book or newspaper. Have you ever
wondered why they do so? It is because the resolving power of
their eyes is unable to distinguish two closely spaced letters
in a word as two different letters. By using positive
spectacles, they improve their vision as well as their
resolving power. In remote sensing, the spatial resolution of a
sensor refers to the same phenomenon: it is the capability of
the sensor to distinguish two closely spaced object surfaces as
two different object surfaces. As a rule, with increasing
resolution the identification of even smaller object surfaces
becomes possible.

What is Remote Sensing?

Remote sensing is the science and art of acquiring information about the Earth's surface without actually being in contact with it. This is done by sensing and recording reflected or emitted energy from the Earth's surface, and by processing, analyzing and applying that data.

“Remote sensing is the practice of deriving information about the earth’s land and water surface using images acquired from an overhead perspective, using electromagnetic radiation in one or more regions of the electromagnetic spectrum, reflected or emitted from the earth’s surface.” (Campbell et al., 2011)

Satellite technology is an example of remote sensing. Of our five senses, sight, hearing and smell may be considered forms of remote sensing.

Spectral resolution

Spectral resolution is defined by the number of spectral bands and their width. The purpose of the bands is to capture the differences in the reflection characteristics of different surfaces.

While the human eye only recognizes the visible spectrum of light, a satellite can, depending on its type, record radiation in many different spectral regions. The majority of passive earth observation satellites have between three and eight bands and are therefore called multispectral, for example the American LANDSAT and the French SPOT.

The higher the spectral resolution, the narrower the wavelength range of each band and, therefore, the more bands there are. With a higher spectral resolution, individual objects can be perceived and spectrally distinguished better.

Visible light: In the area of visible light, passive satellite sensors are about as sensitive as the human eye. Satellites "see" roughly what a person would see when looking at the earth from an altitude of about 1,000 km. The satellites only capture what is lit by the sun.

Infrared sensors measure radiation in the near, middle and far (thermal) infrared. The data can be converted to temperatures of the land and ocean surface under cloud-free conditions, and to the temperature at the top of the clouds under overcast conditions.

Panchromatic sensors detect broadband light in the entire visible range, and signal
intensities are displayed as grey levels, i.e., black and white imagery.

Radiometric resolution

Radiometric resolution specifies how well differences in brightness in an image can be perceived; this is measured through the number of grey value levels. The maximum number of values is defined by the number of bits: an n-bit representation provides 2^n grey values. An 8-bit representation has 256 grey values; a 16-bit representation (ERS satellites) has 65,536 grey values.

The finer, or higher, the radiometric resolution, the better small differences in reflected or emitted radiation can be measured, and the larger the volume of measured data will be.
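
A small sketch of the relationship between bit depth and grey values: quantizing the same radiance band with more bits preserves finer brightness differences and produces larger data volumes. The min-max scaling is an assumption made for illustration.

    import numpy as np

    def quantize(radiance, bits):
        """Map a continuous radiance band onto 2**bits grey values (DNs):
        6 bits -> 64, 8 bits -> 256, 16 bits -> 65,536 levels."""
        levels = 2 ** bits
        r = np.asarray(radiance, dtype=float)
        scaled = (r - r.min()) / (r.max() - r.min() + 1e-9)   # stretch to 0..1
        return np.minimum((scaled * levels).astype(int), levels - 1)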

The advantage of a higher radiometric resolution is rather small: when comparing LANDSAT-MSS (6 bits) and TM (8 bits), the improvement is on the order of 2-3%.

Radiometric resolution depends on the wavelengths and the type of the spectrometer:

• LANDSAT-MSS (LANDSAT 1-3): 6 bits (64 grey values)
• IRS-LISS I-III: 7 bits (128 grey values)
• LANDSAT-TM (LANDSAT 4-5) & SPOT-HRV: 8 bits (256 grey values)
• LANDSAT-ETM & ETM+ (LANDSAT 6-7): 9 bits (only 8 bits are transmitted)
• IRS-LISS IV: 10 bits (only 7 bits are transmitted)
• IKONOS & QuickBird: 11 bits

Thermal Sensor
Many multispectral (MSS) systems sense radiation in the thermal
infrared as well as the visible and reflected infrared portions of the
spectrum. However, remote sensing of energy emitted from the Earth's
surface in the thermal infrared (3 μm to 15 μm) is different from the
sensing of reflected energy. Thermal sensors use photo detectors
sensitive to the direct contact of photons on their surface, to detect
emitted thermal radiation. The detectors are cooled to temperatures
close to absolute zero in order to limit their own thermal emissions.
Thermal sensors essentially measure the surface temperature and
thermal properties of targets. Thermal imagers are typically across-track scanners that detect emitted radiation in only the thermal portion of the spectrum. Thermal sensors employ one or more internal temperature references for comparison with the detected radiation, so the measurements can be related to
absolute radiant temperature. The data are generally recorded on film
and/or magnetic tape and the temperature resolution of current sensors
can reach 0.1 °C. For analysis, an image of relative radiant
temperatures (a thermogram) is depicted in grey levels, with warmer
temperatures shown in light tones, and cooler temperatures in dark
tones.

Map Layout is the assembling of the various elements of a map into a single whole, including the map itself, its legend, title, scale bars, and other elements.
