RS T9 Classification
Uploaded by Shahwaiz Munir

Remote Sensing

Classification

Image from NASA:


https://commons.wikimedia.org/wiki/File:Earth_Western_Hemisphere_transparent_background.png#filelinks
Introduction

2
Image Classification

Image Bands as Predictor Variables → Thematic Classes

3
Classification Tasks

Land cover classification


Forest type classification
Vegetation classification

4
National Land Cover Database (NLCD)

http://www.mrlc.gov/

5
National Land Cover Database (NLCD)

1992, 2001, 2006, 2011, 2016, 2019


Land cover, land cover change,
impervious surface, canopy cover
http://www.mrlc.gov/

6
High Spatial Resolution Land Cover

Chesapeake Bay Land Cover Data Project


http://chesapeakeconservancy.org/

https://wvgis.wvu.edu/ 7
Dynamic World
Brown, C.F., Brumby, S.P., Guzder-Williams, B., Birch, T., Hyde, S.B., Mazzariello, J., Czerwinski, W., Pasquarella, V.J., Haertel, R., Ilyushchenko, S. and Schwehr, K., 2022. Dynamic World, near real-time global 10 m land use land cover mapping. Scientific Data, 9(1), pp.1-17.

8
Supervised vs. Unsupervised

Both require user input
Supervised requires training data upfront
Unsupervised requires the analyst to label the generated clusters

9
Implementations

Erdas Imagine
ENVI
ArcGIS Pro
eCognition
R
Python
QGIS/Orfeo Toolbox

10
Spectral Classes vs. Informational Classes

[Figure: spectral examples of fresh blacktop, old blacktop, and concrete]

Examples from Dr. Tim Warner 11


Unsupervised Classification

12
Image Bands → Spectral Classes → Informational Classes

13
k-Means

The analyst defines the number of clusters
The algorithm arbitrarily "seeds" or locates the cluster centers in the multidimensional measurement space
Pixels are assigned to clusters
Mean vectors for the clusters are determined
The process is re-run
The process continues until the mean vectors do not change significantly

14
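The iteration described above can be sketched in Python with NumPy. This is a minimal illustration, not a production classifier; the function name and the stopping tolerance are illustrative choices.

```python
import numpy as np

def kmeans(pixels, k, max_iter=20, seed=0):
    # pixels: (n_samples, n_bands) array of band values
    rng = np.random.default_rng(seed)
    # Arbitrarily "seed" the cluster centers at k random pixels
    centers = pixels[rng.choice(len(pixels), size=k, replace=False)].astype(float)
    for _ in range(max_iter):
        # Assign each pixel to its nearest cluster center
        dists = np.linalg.norm(pixels[:, None, :] - centers[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        # Recompute the mean vector of each cluster (keep old center if empty)
        new_centers = np.array([pixels[labels == i].mean(axis=0)
                                if np.any(labels == i) else centers[i]
                                for i in range(k)])
        # Stop when the mean vectors no longer change significantly
        if np.allclose(new_centers, centers):
            break
        centers = new_centers
    return labels, centers
```

With two well-separated groups of pixels, the mean vectors settle on the group centers within a few iterations.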
k-Means vs. ISODATA
Iterative Self-Organizing Data
Analysis Technique
 k-means: number of clusters remains the same
throughout the iterations, although it may turn
out later that more or fewer clusters would fit
the data better.
 ISODATA: allows the number of clusters to be
adjusted automatically during the iterations by
merging similar clusters and splitting clusters
with large standard deviations.
 Parameters
 Desired number of clusters
 Minimum number of samples in each cluster (for
discarding clusters)
 Maximum variance (for splitting clusters)
 Minimum pairwise distance (for merging clusters) 15
k-Means

[Figure: k-means in a two-dimensional feature space (Predictor 1 vs. Predictor 2) — cluster centers (+) are randomly seeded, then samples (x) are reassigned and the centers recomputed over Iterations 1-3]

16
k-Means

[Figure: cluster assignments updating over Iterations 1-5]

17
Video: Unsupervised Classification in ArcGIS Pro

18
Supervised Classification

19
Supervised Learning Process

Training Samples and Predictor Variables (drawn from the entire population) → Spatial Model Creation → Spatial Model → Predict to Unknowns
Validation Samples → Model Validation

20
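The workflow above separates training samples (used for model creation) from validation samples (used for model validation). A minimal sketch of that split, assuming a simple random hold-out; the function name and the 30% fraction are illustrative:

```python
import numpy as np

def split_samples(X, y, val_frac=0.3, seed=0):
    # Randomly hold out a share of the labeled samples for model validation;
    # the remainder becomes the training set used for model creation.
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(X))
    n_val = int(len(X) * val_frac)
    return X[idx[n_val:]], y[idx[n_val:]], X[idx[:n_val]], y[idx[:n_val]]
```

In practice the validation samples should be collected independently of the training samples where possible, rather than split from one pool.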
Training Samples

Spatially explicit
Based on available data, photo interpretation, and/or field work
Representative of the classes
Provide an adequate number of samples
Provide an adequate number of samples per class
Accurate

21
Minimum-Distance-to-Means

Relies on class means
The mean or average spectral value in each band is found
The band means comprise the mean vector
A new sample is assigned to the class whose mean vector it is closest to in the feature space
Can specify a maximum distance to create an "unknown" class
Does not take into account class variability
Generally not good if classes are close together or overlapping in the spectral space

22
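The minimum-distance rule is simple enough to sketch directly in NumPy. The -1 code for "unknown" and the function name are illustrative choices:

```python
import numpy as np

def min_distance_classify(pixels, class_means, max_dist=None):
    # pixels: (n_pixels, n_bands); class_means: (n_classes, n_bands)
    # Euclidean distance from every pixel to every class mean vector
    d = np.linalg.norm(pixels[:, None, :] - class_means[None, :, :], axis=2)
    labels = d.argmin(axis=1)           # nearest class mean wins
    if max_dist is not None:
        # Pixels farther than max_dist from every mean become "unknown" (-1)
        labels = np.where(d.min(axis=1) > max_dist, -1, labels)
    return labels
```

Note that only the mean vectors enter the rule, which is exactly why class variability is ignored.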
Parallelepiped

Relies on class range/variance
New samples that fall within the range of a class are assigned to that class
If a new sample is not within any class's data range, it is classified as "unknown"
If a new sample falls in an overlapping area, it is classified as "unsure" or arbitrarily placed in one of the classes
Correlation between bands can be an issue
Can incorporate stepped boundaries
25
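A minimal NumPy sketch of the parallelepiped rule: each class is a per-band min/max box, with -1 for "unknown" and -2 for "unsure" as illustrative codes.

```python
import numpy as np

def parallelepiped_classify(pixels, class_mins, class_maxs):
    # class_mins/class_maxs: (n_classes, n_bands) per-band training ranges
    inside = ((pixels[:, None, :] >= class_mins[None]) &
              (pixels[:, None, :] <= class_maxs[None])).all(axis=2)
    hits = inside.sum(axis=1)
    labels = inside.argmax(axis=1)              # first box the pixel falls in
    labels = np.where(hits == 0, -1, labels)    # outside every box: "unknown"
    labels = np.where(hits > 1, -2, labels)     # overlapping boxes: "unsure"
    return labels
```

The boxes are axis-aligned, which is why correlated bands are a problem: an elongated diagonal cluster is forced into a large rectangular box that overlaps its neighbors.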
Gaussian Maximum Likelihood
 Takes into account class mean and covariance
 Assumes a normal distribution
 Uses the mean vector and covariance matrix
 Creates a probability density function for each class
 Calculates the statistical probability of a given pixel being a member of a particular category
 Assigns the pixel to the class of highest probability
 Can assign to an "unknown" class if all probabilities are below a threshold
Examples from Dr. Tim Warner 28
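A minimal NumPy sketch of the rule: each class contributes a Gaussian density built from its mean vector and covariance matrix, and the pixel goes to the class of highest density. Real implementations usually work with log-likelihoods for numerical stability; this version keeps the raw density for clarity, and the threshold value is illustrative.

```python
import numpy as np

def gaussian_density(x, mean, cov):
    # Multivariate normal probability density for one pixel vector
    d = len(mean)
    diff = x - mean
    inv = np.linalg.inv(cov)
    norm = 1.0 / np.sqrt((2 * np.pi) ** d * np.linalg.det(cov))
    return norm * np.exp(-0.5 * diff @ inv @ diff)

def gml_classify(pixel, class_means, class_covs, threshold=None):
    # Density of the pixel under each class's mean vector and covariance matrix
    p = np.array([gaussian_density(pixel, m, c)
                  for m, c in zip(class_means, class_covs)])
    if threshold is not None and p.max() < threshold:
        return -1                  # all probabilities below threshold: "unknown"
    return int(p.argmax())         # class of highest probability
```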



Comparisons

30
Examples from Dr. Tim Warner
Machine Learning

31
Machine Learning Process

Thing you want to predict + things you think might help you predict it → Machine Learning Algorithm → Trained Model
Trained Model + new things to predict → Predictions

Machine Learning = Learning from Examples 32


Machine Learning

Nonparametric
Supervised learning
Learn from training samples
Generalize to unknown samples
Generally robust
Can model complex class signatures
Can use a variety of predictor variables

33
Machine Learning

Artificial Neural Networks (ANN)


k-Nearest Neighbor (kNN)
Classification and Regression Trees (CART)
Boosted CART
Random Forests (RF)
Support Vector Machines (SVM)

34
k-Nearest Neighbor (k-NN)

Compare a new sample to the nearest training sample(s) in the feature space
k = the number of neighbors compared against
Assign the new sample to the majority class of its neighbors (classification)
Use the average or distance-weighted average of the neighbors (regression)
Use the proportion of neighbors by class (probability)

35
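The classification case above fits in a few lines of NumPy. A minimal sketch; the function name and k = 3 are illustrative:

```python
import numpy as np

def knn_classify(sample, train_X, train_y, k=3):
    # Distances from the new sample to every training sample in feature space
    d = np.linalg.norm(train_X - sample, axis=1)
    nearest = train_y[np.argsort(d)[:k]]   # labels of the k nearest neighbors
    vals, counts = np.unique(nearest, return_counts=True)
    return vals[counts.argmax()]           # majority vote among the neighbors
```

Swapping the majority vote for `nearest.mean()` would give the regression variant, and `counts / k` the class-proportion (probability) variant.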
Random Forests (RF)

Uses decision trees
Ensemble decision tree method
Uses the Gini Index of Impurity
Uses a random subset of predictor variables for splitting at each node
Uses a random subset of training data in each tree
Attempts to reduce correlation between trees
An ensemble of weak classifiers

[Diagram: data subsets 1-4 each grow a tree; take the majority vote]

36
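A minimal sketch using scikit-learn's RandomForestClassifier, one common implementation; the parameters shown map directly to the bullets above, and the two-class "spectral" data are hypothetical:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Two spectrally distinct, hypothetical classes with four "bands" each
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.2, 0.05, (50, 4)),   # e.g., water-like signatures
               rng.normal(0.6, 0.05, (50, 4))])  # e.g., vegetation-like signatures
y = np.array([0] * 50 + [1] * 50)

rf = RandomForestClassifier(
    n_estimators=100,     # number of trees in the ensemble
    max_features="sqrt",  # random subset of predictors tried at each split
    bootstrap=True,       # random subset of training data for each tree
    random_state=0,
).fit(X, y)
pred = rf.predict(X)      # each tree votes; the majority vote wins
```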
Support Vector Machines (SVM)
Works with boundary conditions
Finds the linear boundary that gives the best separation between classes (the hyperplane)
The boundary is defined by support vectors, the training samples nearest to the boundary
Projects the data to a higher dimension
A two-class problem: multiple classifications are combined to separate more than two classes
Non-separable data: a positive slack variable (Cortes and Vapnik, 1995) with a cost parameter (C)
http://en.wikipedia.org/wiki/Support_vector_machine

37
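A minimal sketch using scikit-learn's SVC, one common implementation; the two separable clusters are hypothetical data chosen so a linear hyperplane works:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.3, (40, 2)),
               rng.normal(3.0, 0.3, (40, 2))])
y = np.array([0] * 40 + [1] * 40)

svm = SVC(kernel="linear", C=1.0)  # C is the cost parameter on the slack variables
svm.fit(X, y)
# The boundary is defined only by the support vectors, the training
# samples nearest to the separating hyperplane:
n_sv = len(svm.support_vectors_)
pred = svm.predict(X)
```

For more than two classes, scikit-learn combines multiple two-class SVMs internally (one-vs-one), matching the bullet above.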
The Kernel Trick

https://upload.wikimedia.org/wikipedia/commons/b/bc/Wiki_gauss.png

Use the kernel trick to transform the feature space into a higher
dimensional space where the features are more linearly separable
38
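The idea can be shown with an explicit quadratic lift: two concentric rings are not linearly separable in 2-D, but adding x² + y² as a third feature makes a flat threshold separate them. Real kernels (such as the Gaussian/RBF kernel in the figure) apply this kind of mapping implicitly; the data here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
theta = rng.uniform(0.0, 2.0 * np.pi, 100)
inner = np.c_[0.5 * np.cos(theta), 0.5 * np.sin(theta)]  # class A: small ring
outer = np.c_[2.0 * np.cos(theta), 2.0 * np.sin(theta)]  # class B: large ring

def lift(X):
    # Append x^2 + y^2 as an extra feature (an explicit feature-space mapping)
    return np.c_[X, (X ** 2).sum(axis=1)]

z_inner = lift(inner)[:, 2]   # all equal 0.25
z_outer = lift(outer)[:, 2]   # all equal 4.0
```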
Machine Learning in Remote Sensing

Maxwell, A.E., Warner, T.A. and Fang, F., 2018. Implementation of machine-learning classification in remote sensing: An applied review. International Journal of Remote Sensing, 39(9), pp.2784-2817.

https://www.tandfonline.com/doi/full/10.1080/01431161.2018.1433343

39
Deep Learning (DL)

[Figure: scene labeling (scene label = cat), image semantic segmentation, and instance segmentation]

40
Video: Collect Training Data in ArcGIS Pro

41
Video: Supervised Classification in ArcGIS Pro

42
Additional Topics

43
Object-Based Classification (GEOBIA)

Geographic Object-Based Image Analysis
Commonly used for high spatial
resolution data
Segment image into objects or
polygons
Classify objects as opposed to pixels
Help remove salt-and-pepper effect
Potentially a more cartographically
pleasing result
Incorporate ancillary data, textural
measures, and geometric measures
44
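The "classify objects, not pixels" step can be sketched in NumPy. This assumes a segment map already exists from a prior segmentation step, and uses nearest-class-mean as a stand-in for whatever classifier is applied to the objects:

```python
import numpy as np

def classify_objects(image, segments, class_means):
    # image: (rows, cols, bands); segments: (rows, cols) integer object ids
    # from a segmentation step; class_means: (n_classes, n_bands)
    labels = np.zeros(segments.shape, dtype=int)
    for seg_id in np.unique(segments):
        mask = segments == seg_id
        mean_spec = image[mask].mean(axis=0)           # object-level feature
        d = np.linalg.norm(class_means - mean_spec, axis=1)
        labels[mask] = d.argmin()                      # one label per object
    return labels
```

Because every pixel in an object receives the same label, the salt-and-pepper effect of per-pixel classification disappears by construction.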
GEOBIA Example

45
GEOBIA Example

46
Some Complexities

Not all classes are spectrally separable


Data can be suboptimal
Training data can be difficult to collect
It is important to consider the impact of scale
Classes may have complex signatures
Classes may have fuzzy definitions
Pixels can be “mixed”
There can be positional errors in your data

47
Improving your results

Re-execute with more training data
Incorporate spectral enhancements/ratios
Combine confused classes
Incorporate measures of texture
Incorporate ancillary data
Use multi-temporal data
Use a more powerful algorithm
Use multi-seasonal data
Smooth or generalize results
Manually clean up results
Make use of object-based classification
Consider defining a Minimal Mapping Unit (MMU)
Use a different image

48
Sieving

Remove areas smaller than a certain size or area
Good for generalizing results
Good for mapping using a minimum mapping unit (MMU)

49
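A minimal sieve sketch, assuming SciPy is available for connected-component labeling; filling removed patches with a constant is a simplification (real sieve tools usually fill with the neighboring majority class), and the function name is illustrative:

```python
import numpy as np
from scipy import ndimage

def sieve(class_map, min_pixels, fill_value=0):
    # Remove connected patches smaller than min_pixels, replacing them
    # with fill_value -- a simple MMU-style generalization.
    out = class_map.copy()
    for cls in np.unique(class_map):
        labeled, n_regions = ndimage.label(class_map == cls)
        for region_id in range(1, n_regions + 1):
            region = labeled == region_id
            if region.sum() < min_pixels:
                out[region] = fill_value
    return out
```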
Example of Class Separability

50
Example 1

Landsat 5 TM October 18, 2003

51
Class Comparison in Red and NIR Bands

52
Class Comparison in All Bands

53
Example 2

Landsat 5 TM August 23, 2011


54
Class Comparison in Red and NIR Bands

55
Class Comparison in All Bands

56
Example 3

QuickBird Image of Morgantown

57
Class Comparison in Red and NIR Bands

58
Class Comparison in All Bands

59
Result

60
This is the end of this lecture module.

Please return to the West Virginia View webpage for additional content.

61
