
SPATIAL DATA QUALITY


by
William Rasdorf
March 15, 2000

WHAT IS DATA QUALITY?


Data quality is an important factor in effectively and accurately conveying
information. The term data quality can be misleading because it is open to
interpretation. The US Spatial Data Transfer Standard defines quality as:
Quality is an essential or distinguishing characteristic necessary
for cartographic data to be fit for use.
Data quality can vary from person to person, organization to organization, or
from application to application. It becomes the responsibility of the user to
decide if a data set is sufficient to meet given quality requirements, and the
standard may differ by application or use. Data quality may be appropriate for
one project but not necessarily suitable for another.
The US Spatial Data Transfer Standard lists five components of a data set that
provide the information necessary to assess data suitability for use:
Data lineage
Positional accuracy
Attribute accuracy
Logical consistency
Completeness
Data lineage: describes the source of the derived data, derivation methods, and
all transformations employed in producing the final data. It must also include the
specific control points used.
Positional accuracy: compares spatial data to an independent and more
accurate source. This should include the positional accuracy of geodetic control
points and the accuracy of data after all transformations.
Attribute accuracy: may include deductive estimates or may be based on
independent samples from polygon overlay. It should differentiate between
original and derived attributes.
Logical consistency: determines the faithfulness of data structures embedded
in a transfer file. It should verify the topological and network coverage.
Completeness: addresses aspects such as the minimum area employed in polygon
construction and gaps in either the data element set or attribute values.
Completeness also refers to aspects of the data set that characterize it as a
whole rather than as a specific or individual element.

ERRORS IN GIS - HOW THEY AFFECT DATA QUALITY


Errors may creep in at any stage of data acquisition and transformation. It is
therefore imperative to control the errors at every transition from
observation to presentation, and to see to it that quality is maintained.
Inaccuracy due to errors may be encountered during primary methods of data
collection (data collected directly by aerial and terrestrial surveying and
satellite imagery) or secondary methods (data collected indirectly from
charts, maps, graphs, etc.).
Errors encountered in primary methods of data collection may be classified as
Personal errors
Instrumental errors
Environmental errors
Errors encountered in secondary methods of data collection may be classified
as
Errors in plotting control
Compilation error
Error in drawing
Error in map generalization
Error in map reproduction
Deformation of the material
Error introduced due to the use of the wrong scale
Uncertainty in the definition of a feature
Error due to feature exaggeration
Error in digitizing or scanning
Additionally, all the errors encountered in primary methods are also encountered
in secondary methods.
Due to the errors that creep into the data at every stage of data acquisition
or transformation, inaccuracy (deviation from the true value) increases and
leads to the deterioration of data quality. Though errors are an inherent part
of the data in GIS, care should be taken at every step to minimize data error
and keep the data quality of the highest order.

GENERAL SPATIAL DATA INFORMATION


According to [Ries 1993] spatial information can be divided into three primary
architectures: data, function, and organization. Data architecture describes
the objects used as input to, or created by, a function. Functional
architecture describes the activities performed between two types of data.
Organization architecture describes the mission, policies, and rules which
determine and shape the former two. By exploring each of these architectures,
one can develop a framework to establish spatial design requirements.
Spatial data can be displayed in different ways: point, line, polygon, surface,
volume, and pixel. Each of these display mechanisms has three components:
location, shape, and topology. These components help define the scope of the
design requirements for each Location Control Management level. Along with
data, its properties are equally important. Data properties describe the quality,
or condition of the spatial data.
Traditional definitions for GIS incorporate several functions that encompass the
spatial feature's life cycle. Retrieval, integration, analysis, and display/report
implement this life cycle approach.
Organization provides the key support to any data and functional design
needed for data integration.

FACTORS AFFECTING DATA QUALITY


1) Currency
a) Are data up to date?
b) Time series
2) Completeness
a) Feature or Entity completeness
i) Data completeness
ii) Model completeness
b) Attribute completeness
i) Value completeness
3) Consistency
a) Map scale
b) Standard descriptions
c) Relevance
4) Accessibility
a) Format
b) Copyright
c) Cost
5) Accuracy and precision
a) Lineage: when collected, by whom, how?
b) Density of observations
c) Positional accuracy
d) Attribute accuracy: qualitative and quantitative
e) Temporal accuracy
6) Sources of errors in data
a) Data entry or output faults
b) Choice of original data model
c) Natural variation and uncertainty in boundary location and topology; temporal error
d) Observer bias
e) Processing
i) Numerical errors in the computer
ii) Limitations of computer representations of numbers
7) Sources of errors in derived data and in the results of modelling and analysis
a) Problems associated with map overlay
b) Classification and generalization problems
c) Choice of analysis model
d) Misuse of logic
e) Error propagation
f) Method used for interpolation
g) Lack of consistency in different analyses of the same data.

Completeness
According to the Spatial Data Transfer Standard, completeness must describe
the relationship between the objects represented and the abstract universe of all
such objects (SDTS 1997). The selection criteria, definitions, and all the
objects collectively describe completeness. The abstract universe and its
relationship with the database must be precisely described. The abstract
universe can be defined in terms of a desired degree of abstraction and
generalization, i.e., a concrete description or specification for a database (Data
Quality Parameters, H. Veregin).
Completeness can be defined in terms of errors of commission or omission, and
divided into two broad types: Feature (or Entity) completeness and Attribute
completeness. Feature completeness can be further divided into data and
model completeness and can be defined over space, time, or theme.
Data completeness acts as a check on data quality. It is defined as the
measurable error or offset observed between the database and the specification.
If the database contains all the objects with their specifications, the data set is
considered complete. Model completeness refers to the agreement between the
database and the abstract universe required for a particular database
application. Because this varies from application to application, it is an
aspect of fitness for use.
Attribute completeness is the degree to which all relevant attributes of a feature
have been encoded. Value completeness is the degree to which all the values
of attributes are present.
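
To make these definitions concrete, the following minimal sketch (in Python)
checks a small feature set against a specification; the attribute names,
expected feature IDs, and in-memory layout are hypothetical, invented for
illustration.

    # A minimal sketch of completeness checks against a specification; the
    # attribute names, feature IDs, and in-memory layout are hypothetical.

    REQUIRED_ATTRIBUTES = {"id", "type", "elevation"}  # attribute completeness spec
    EXPECTED_FEATURE_IDS = {1, 2, 3, 4, 5}             # data completeness spec

    features = [
        {"id": 1, "type": "road", "elevation": 120.0},
        {"id": 2, "type": "road", "elevation": None},  # value missing
        {"id": 4, "type": "river"},                    # attribute missing
    ]

    # Data completeness: does the database contain all specified objects?
    present_ids = {f["id"] for f in features}
    omissions = EXPECTED_FEATURE_IDS - present_ids      # errors of omission
    commissions = present_ids - EXPECTED_FEATURE_IDS    # errors of commission

    # Attribute completeness: are all relevant attributes encoded?
    missing_attrs = [(f["id"], REQUIRED_ATTRIBUTES - f.keys())
                     for f in features if REQUIRED_ATTRIBUTES - f.keys()]

    # Value completeness: are all attribute values actually present?
    missing_values = [(f["id"], k) for f in features
                      for k in REQUIRED_ATTRIBUTES & f.keys() if f[k] is None]

    print("omitted features:", omissions)
    print("unexpected features:", commissions)
    print("features missing attributes:", missing_attrs)
    print("features missing values:", missing_values)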

Consistency
Consistency refers to the harmonious uniformity or agreement among parts of a
database. It is the lack of apparent contradiction in the data. For geospatial data
the term is especially used to specify conformance with certain topological rules
(Kainz 1995). Before consumers use any database it is important to ensure that
it is spatially uniform in itself: points coincide where they should, lines
intersect at nodes, and polygons are exactly bounded by lines.
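
The sketch below illustrates one such topological rule as a concrete check:
every line endpoint must meet a registered node. The node coordinates, line
segments, and tolerance are hypothetical.

    # A minimal sketch of a topological consistency check, assuming lines are
    # stored as coordinate pairs; the node and line data are hypothetical.

    nodes = {(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)}

    lines = [
        ((0.0, 0.0), (1.0, 0.0)),   # consistent: both endpoints are nodes
        ((1.0, 0.0), (1.0, 1.0)),   # consistent
        ((1.0, 1.0), (0.0, 0.1)),   # inconsistent: dangling endpoint near (0, 0)
    ]

    TOLERANCE = 1e-6  # how close an endpoint must be to a node to "meet" it

    def meets_a_node(point):
        """Return True if the point coincides with some registered node."""
        return any(abs(point[0] - nx) <= TOLERANCE and abs(point[1] - ny) <= TOLERANCE
                   for nx, ny in nodes)

    # Flag every line endpoint that fails to intersect at a node.
    dangles = [(i, end) for i, line in enumerate(lines)
               for end in line if not meets_a_node(end)]

    for i, end in dangles:
        print(f"line {i} has a dangling endpoint at {end}")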
Consistency problems arise in spatial data when we have to overlay maps, or
when trying to combine the data of two or more maps into one. The chances of
losing a certain degree of consistency also increase when we try to combine
data from two maps of different scales.
Inconsistency also arises from poorly defined GIS processes. The entire
process - from the conceptual stage, to the collection of data, to the analysis, to
the actual manipulation and interpretation of data - needs to be standardized
and described in detail. The human factors and the limitations of the
instruments used to achieve these transformations result in a certain degree of
inconsistency.
Care should be taken to avoid spatial inconsistency. This can usually be
identified through redundancies in spatial attributes. For example, an entity
might have the attribute California as a state and Washington as a county.
Non-redundancy implies independence between two attributes, in which case no
meaningful consistency check exists.
Consistency should also be maintained with regard to the temporal nature of
data, e.g., a place or location shown in different positions on two different
maps.

Accuracy and Precision


Accuracy is the first consideration when data is analyzed and evaluated.
Questions such as the acceptable error limit, or the extent to which the data
can be considered accurate, must also be addressed. Accuracy, precision, and
error all have distinct meanings, each with its own significance.
Error is the difference between the true and estimated values.
Accuracy, in terms of error, is the extent to which an estimated value
approaches the true value. It is generally given by a true value +/- some error,
e.g., 23.5 +/- .05.
Precision is the dispersion of observations about their mean, estimated in
terms of the standard deviation of the observations. It also refers to the
ability to display numbers to a certain number of decimal digits, e.g., data
reported to 3 decimal places has a precision of +/- 0.001.
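
The following sketch illustrates the distinction for repeated measurements of a
single quantity; the observed values are hypothetical.

    # A minimal sketch distinguishing error, accuracy, and precision for
    # repeated measurements of one quantity; the values are hypothetical.
    import statistics

    true_value = 23.5
    observations = [23.46, 23.52, 23.49, 23.55, 23.48]  # repeated measurements

    estimate = statistics.mean(observations)

    # Error: difference between the estimated and true values.
    error = estimate - true_value

    # Precision: dispersion of the observations, as a standard deviation.
    precision = statistics.stdev(observations)

    print(f"estimate  = {estimate:.3f}")
    print(f"error     = {error:+.3f}  (accuracy: {estimate:.2f} +/- {abs(error):.2f})")
    print(f"precision = {precision:.3f}")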
Maintaining data accuracy and eliminating error are essential prerequisites to
providing sound information for a variety of uses. Error can appear at any stage:
conceptual (perception), in the field (faulty data acquisition due to theodolite
limitations, GPS satellites, etc.), and in attribute measurement (due to variation
in the environment, observer bias, poor handling and manipulation of data,
misinterpretation of acquired data, etc.).
Error is inherent and inevitable in GIS data. A GIS dealing with different layers
of data collected from multiple sources, scales, dates, and map projections will
have its complex error propagated even further. The challenge lies in
maintaining the required accuracy and precision.
Data accuracy is a broad term divided into lineage, positional accuracy,
thematic accuracy, and temporal accuracy.
The following factors should be considered for the accuracy and precision of any
spatial data:

Lineage: when was it collected, by whom, and how?
Lineage in terms of GIS refers to the history of the data, i.e., the description
of the source material from which the data was derived and the methods of
derivation, including all transformations involved in producing the final data.
It is important to know the lineage of the data because most errors are
the result of errors in the field. To verify previous results it is imperative to
establish a set of control points while collecting or describing data. These
need to be documented in sufficient detail to allow recovery. Also, the entire
path of transformation should be described completely, so as to avoid
confusion and round-off (or approximation) error in later stages.
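
A lineage record of this kind can be kept alongside the data itself. The sketch
below shows one possible structure; the fields and example values are
hypothetical and do not follow any particular metadata standard.

    # A minimal sketch of a lineage record for a derived data set; the
    # dataclass fields and example values are hypothetical.
    from dataclasses import dataclass, field

    @dataclass
    class LineageRecord:
        source: str                      # where the data came from
        collected_by: str                # who collected it
        collected_on: str                # when it was collected
        control_points: list = field(default_factory=list)   # control used
        transformations: list = field(default_factory=list)  # ordered steps

    record = LineageRecord(
        source="1:24,000 quad sheet, scanned",
        collected_by="field crew A",
        collected_on="1999-06-14",
        control_points=[(431200.0, 4138750.0), (433900.0, 4140020.0)],
    )

    # Append every transformation as it is applied, preserving the full path
    # from observation to presentation.
    record.transformations.append("digitized at 0.5 m tolerance")
    record.transformations.append("projected from NAD27 to NAD83")

    print(record)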

Density of observations
In order to have sufficient proof of spatial data accuracy, a number of
observations should be taken by different persons so that accurate data,
relatively free from errors, is collected. When this data is analyzed, the
observations should converge, nearly coinciding at a point.

Positional accuracy
Positional accuracy is defined in terms of the accuracy of geodetic control
points and the accuracy of the data after all transformations, i.e., a set of
permanent control points used to achieve sufficient accuracy. Positional
accuracy must be determined by comparing the spatial data to an
independent source of higher accuracy. The test for positional accuracy
must be conducted in accordance with the prevalent rules of the ASPRS or
NCDCDS.
Position and Accuracy [Ries 1993]
Position and accuracy determine the way in which the geographic
database can ultimately be used. They represent statements of reliability,
confidence, and risk, and can be determined by analytical results. Spatial
analysis requires consistent precision and accuracy within each
geographic layer, because the current statistical sampling method
assumes normality.
Position and Accuracy [Chong et al. 1993]
Positional accuracy can be expressed as two components: absolute
positional accuracy and relative positional accuracy. Absolute positional
accuracy addresses how closely all positions on a map or data layer
match the corresponding positions of the features represented on the
ground, in a desired map projection system. Relative positional accuracy
of a map considers how closely the positions on a map or data layer
represent their corresponding geometrical relationships on the ground.
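
A common way to conduct such a test is to compute the root mean square error
(RMSE) over a set of checkpoints, comparing the tested data set against the
independent higher-accuracy source. The checkpoint coordinates below are
hypothetical.

    # A minimal sketch of a positional accuracy test: comparing dataset
    # coordinates against an independent, higher-accuracy source and
    # reporting RMSE. The checkpoint coordinates are hypothetical.
    import math

    # (x, y) positions of the same checkpoints in the data set under test
    # and in the independent higher-accuracy source.
    tested    = [(100.2, 200.1), (150.4, 249.7), (199.9, 300.6)]
    reference = [(100.0, 200.0), (150.0, 250.0), (200.0, 300.0)]

    sq_errors = [(xt - xr) ** 2 + (yt - yr) ** 2
                 for (xt, yt), (xr, yr) in zip(tested, reference)]

    rmse = math.sqrt(sum(sq_errors) / len(sq_errors))
    print(f"horizontal RMSE = {rmse:.3f} map units")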

Attribute accuracy: qualitative and quantitative
Attribute or thematic accuracy is defined as the accuracy of the attribute
values of the data. If rice fields are marked as wheat fields on a map, the
result is a thematic or attribute error. Attribute or thematic accuracy tests
can be made either by deductive estimates or by independent samples from
a polygon overlay.
The resolution of quantitative data is set by the precision of the
measurement device. Quantitative attribute accuracy is lacking when, for
example, a region with an average rainfall of 100 inches is plotted from an
instrument that measures with 0.1 inch precision but errs by 0.1 inch. For
qualitative or categorical data, resolution is determined by the fineness
of the categorical definition. A pine tree region marked as conifer trees is an
example of a qualitative attribute error.
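
One simple form of an independent-sample test is to compare mapped categories
against field-observed categories at sample points and report the proportion
correctly classified. The sample data below are hypothetical.

    # A minimal sketch of a categorical (thematic) accuracy test using
    # independent sample points; the sample data are hypothetical.
    from collections import Counter

    # (mapped category, true category observed in the field) per sample point
    samples = [
        ("wheat", "wheat"), ("wheat", "rice"), ("rice", "rice"),
        ("rice", "rice"), ("wheat", "wheat"), ("rice", "wheat"),
    ]

    correct = sum(1 for mapped, true in samples if mapped == true)
    print(f"overall thematic accuracy: {correct / len(samples):.0%}")

    # Per-category accuracy highlights where misclassification concentrates.
    by_true = Counter(true for _, true in samples)
    hits = Counter(true for mapped, true in samples if mapped == true)
    for category, total in by_true.items():
        print(f"  {category}: {hits[category]}/{total} correct")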

Temporal Accuracy
Temporal accuracy is defined as the part of the data's error that arises from
the temporal nature of the data. It is affected by the interaction between
the duration of the recording interval and the rate of change in the event
(Data Quality Parameters, H. Veregin). The border of a country may not be
the same today as it was 50 years ago if a region has been granted
independence or merged with another country. Temporal accuracy is very
important when collecting data, yet until now it has been ignored in the
data collection and data testing phases.

The entity-attribute-value model can describe the accuracy of a database of a
real-world phenomenon (Data Quality Parameters, H. Veregin). In this model
entities are the real-world phenomena, attributes are the relevant properties of
those objects, and the values are the quantitative or qualitative
measurements. Error is defined as the discrepancy between the actual and the
true attribute values of the entity. This model can describe spatial, temporal,
and thematic data.
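
The sketch below illustrates the entity-attribute-value model and its
definition of error; the entities, attributes, and "true" values are
hypothetical.

    # A minimal sketch of the entity-attribute-value model with an error
    # check; the entities, attributes, and values are hypothetical.

    # Each row is one (entity, attribute, value) triple.
    eav = [
        ("parcel-17", "land_use", "residential"),
        ("parcel-17", "area_ha", 2.4),
        ("gauge-3",   "rainfall_in", 99.9),
    ]

    # Independently observed "true" values for the same entity/attribute pairs.
    truth = {
        ("parcel-17", "land_use"): "residential",
        ("parcel-17", "area_ha"): 2.5,
        ("gauge-3",   "rainfall_in"): 100.0,
    }

    # Error: discrepancy between stored and true attribute values.
    for entity, attribute, value in eav:
        true_value = truth[(entity, attribute)]
        if isinstance(value, (int, float)):
            print(f"{entity}.{attribute}: error = {value - true_value:+.2f}")
        else:
            status = "match" if value == true_value else "mismatch"
            print(f"{entity}.{attribute}: {status}")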
Data within a GIS is typically treated as having deterministic values, ignoring
spatial, temporal, geometric, and thematic uncertainties. Accuracy should be
perceived as an added dimension in an integrated environment rather than as a
simple attribute of data or metadata. This signals a move towards "objective"
GIS, where biases caused by unaccounted uncertainties in data are avoided
and the subjectivity of data accuracies is removed.

Locational Accuracy
[Ries 1993] reports that the Wisconsin Department of Transportation Division of
Highways initiated a project on location control management for strategic
information and business planning. The Location Control Management geographic
area encompasses data and functions regarding the shape and absolute topology of
spatial features. They found that locational control could be obtained from three
subject areas: geodetic, geographic, and linear.
The geodetic area defines and manages the locational surface. The earth's
geoid, latitude-longitude, datums, monuments, coordinate systems, and
projections are examples of geodetic entities.
The geographic area defines and manages the "where" of parts, or areas, of the
surface defined by the geodetic level. Examples of geographic entities are
linear features such as roads, rivers, and rail, and include municipal, parcel,
hydrographic, wetland, and soil boundaries.
The linear area defines and manages the "where along" geographic parts. Linear
entities include routes, mileposts, reference points, photo log miles, and street
addresses.
The geodetic area controls the geographic, which in turn controls the linear.
Each level has its own internal management and utility, where transformations
between internal location schemes are addressed. To pass between two levels,
a transformation would be made from one location area to the other. Each level
can be managed and used independently of the others, but a location can also
be transformed to exploit the management and utility of another level.

Sources of Error in Data Collection


Error in data is inherent and cannot be avoided. In order to minimize error,
information about the origin and the properties of errors should be analyzed. By
the end of the analysis process the original error (which is easy to identify and
eliminate at the first step) has propagated and become more complex, and is nearly
impossible to remove. Bad data should, thus, be identified early.
The following are some of the sources where error can originate.
1. Measurement errors
Small errors in instrument calibration, biased instruments, variation in
instruments due to external factors such as sun, wind, etc.
2. Data Acquisition error
Field error: due to the inaccuracy of the surveying team.
Lab error: misinterpretation of data from the map (because of improper
map overlay) or inconsistencies in data acquired in labs.
3. Error due to selection of the data model
4. Attribute error
Inability to distinguish characteristics of one part of the data from another.
Incorrectly interpreting data.
5. Natural variation
Error due to the temporal nature of data (because of variables beyond
human control, such as the sun, wind, etc.).
6. Manipulation of data
Manipulated data may require much higher accuracy than the computer can
provide; errors develop while changing the data format and during data
exchange.
7. Numerical Processing
Error induced during the entry of raw digital field observation data due to
redundant data, and error injected while subjecting the redundant data to
least squares analysis, resulting in a coordinate file with all non-numerical
(attribute) information. A brief sketch of such an adjustment follows this list.
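
As a minimal illustration of the least squares step, the sketch below adjusts
three redundant distance observations that constrain two unknown segment
lengths; it assumes NumPy is available, and the observation values are
hypothetical.

    # A minimal sketch of least squares adjustment of redundant observations,
    # assuming NumPy is available: three distance measurements constrain two
    # unknown segment lengths. The observation values are hypothetical.
    import numpy as np

    # Unknowns: lengths a and b of two adjacent segments.
    # Observations: a measured, b measured, and the total a + b measured.
    A = np.array([[1.0, 0.0],
                  [0.0, 1.0],
                  [1.0, 1.0]])
    obs = np.array([100.02, 49.97, 150.05])  # redundant, slightly contradictory

    # The least squares estimate resolves the contradiction by minimizing
    # the sum of squared residuals.
    x, _, _, _ = np.linalg.lstsq(A, obs, rcond=None)
    residuals = obs - A @ x

    print(f"adjusted lengths: a = {x[0]:.3f}, b = {x[1]:.3f}")
    print("residuals distributed over the observations:", residuals)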

Sources of Errors in Derived Data and in the Results of Modeling and Analysis

Errors can develop in numerous ways, and at any stage in GIS data, particularly
during data analysis and modeling.
Care should especially be taken when overlaying maps having different
characteristics. Due to the generalization of map attributes, accuracy is easily
lost. While analyzing data (and due to certain types of ambiguous data) it can be
difficult to determine which classification category data falls into. Since no
model is perfect, and since choosing the analysis model is at the discretion of
the analyst, the error rate rises if an inappropriate model is selected.
Error can also develop when interpolation is used to approximate values.
Infinitely sharp boundaries would need to be plotted to identify true
boundaries, but because this is nearly impossible, values are interpolated and
boundaries are rounded off. The result can be a serious error when the scale of
the map is large. (The figure on page 235 of Principles of GIS illustrates this.)
Analysis reports of the same material in different labs have been found to vary
widely, with up to 11% soil sample errors recorded.
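
One way to quantify how such errors propagate is Monte Carlo simulation, one of
the techniques reviewed in [Heuvelink 1999]. The sketch below propagates
positional uncertainty in two boundary points into uncertainty in the derived
distance between them; the coordinates and standard deviation are hypothetical.

    # A minimal sketch of Monte Carlo error propagation: how positional
    # uncertainty in two boundary points propagates into the derived
    # distance between them. The inputs are hypothetical.
    import random, math, statistics

    p1, p2 = (100.0, 200.0), (160.0, 280.0)   # nominal boundary points
    sigma = 2.0                                # positional std. dev., map units
    N = 10_000                                 # simulation runs

    def perturb(point):
        """Add zero-mean Gaussian noise to each coordinate."""
        return (random.gauss(point[0], sigma), random.gauss(point[1], sigma))

    distances = []
    for _ in range(N):
        (x1, y1), (x2, y2) = perturb(p1), perturb(p2)
        distances.append(math.hypot(x2 - x1, y2 - y1))

    print(f"nominal distance : {math.hypot(p2[0]-p1[0], p2[1]-p1[1]):.2f}")
    print(f"mean of simulated: {statistics.mean(distances):.2f}")
    print(f"std of simulated : {statistics.stdev(distances):.2f}")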

ATTRIBUTE INFORMATION
The distances and angles of points are not enough information to define a point.
Because of this, attribute information should also be collected with the locational
data on the site. This attribute information should include items like the
following.

Point feature code: a symbol should appear at the surveyed point. The
feature code table should also contain the feature code name, type of
symbol, size of symbol, and color (a sketch of such a table follows this list).
Lines or chains: The line should have the feature code, which is user
definable by the type of line (dotted, full, etc.), its thickness and its color.
Ground or non-ground points: above-ground points should be shown
differently than ground points; e.g., the overhanging part of a roof
should not be shown as part of the ground.
Stationing and offset position: care should be taken to define the
stationing and the left/right offset to increase the accuracy of the fixed
point.
Point name and description: name, description and elevation should be
provided for each point.
Layer or zone: split layers according to zones or layers to avoid gathering
voluminous amounts of different data attributes.
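
The sketch below shows one possible shape for a feature code table and a
surveyed point record carrying these items; the codes, fields, and values are
hypothetical.

    # A minimal sketch of a feature code table and a surveyed point record;
    # the codes, symbol names, and fields are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class FeatureCode:
        name: str          # feature code name
        symbol: str        # type of symbol drawn at the point
        size: float        # symbol size
        color: str         # symbol color

    FEATURE_CODES = {
        "TREE": FeatureCode("tree", "circle", 2.0, "green"),
        "HYD":  FeatureCode("hydrant", "triangle", 1.5, "red"),
    }

    @dataclass
    class SurveyPoint:
        name: str
        description: str
        x: float
        y: float
        elevation: float
        code: str          # key into FEATURE_CODES
        ground: bool       # False for above-ground points (e.g., roof overhang)
        layer: str         # layer or zone assignment

    pt = SurveyPoint("P101", "oak at NE corner", 431250.2, 4138760.8,
                     212.4, "TREE", True, "vegetation")
    print(pt, FEATURE_CODES[pt.code])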

INFORMATION FOR METADATA


Metadata is valuable because it provides users with specific information about
the lineage of a data set, so that it can be used with full knowledge of its
source, quality, and contents. Adequate documentation is required for the
appropriate, responsible, and defensible use of any geographic data set.
Metadata, which is data about data, helps provide this.

DATA INTEGRATION AND ANALYSIS


Integration of data often involves combining multiple geographic layers for
analytical purposes. The resulting absolute topology provides the necessary
information to perform spatial analysis functions.

DYNAMIC SEGMENTATION
Dynamic segmentation is a two-step process performed on a spatial data set
composed of linear features. A route system is first created by associating
adjacent line segments into one or more groups that have a definite linear
sequence. Descriptive information is then associated with the route system by
referencing distances from the starting point of each route. For example, a
stream route system is created by grouping stream segments into routes that
represent the main stream, tributaries, and headwaters. Spawning habitat areas
are then mapped by their locations along the routes.


The advantage of using dynamic segmentation is that small areas along a line
feature can be referenced without actually breaking the line into pieces. Linear
distances, such as river miles, can also be calculated directly from the routes
and their associated attributes.
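
The sketch below illustrates both steps: a route built from adjacent segments,
and events located by distance along it without breaking any segment. The
segment geometry and event measures are hypothetical.

    # A minimal sketch of dynamic segmentation: build a route from adjacent
    # line segments, then locate events by distance along the route without
    # splitting any segment. Geometry and measures are hypothetical.
    import math

    def length(seg):
        (x1, y1), (x2, y2) = seg
        return math.hypot(x2 - x1, y2 - y1)

    # Step 1: a route is an ordered group of adjacent line segments.
    route = [((0, 0), (0, 100)), ((0, 100), (80, 160)), ((80, 160), (80, 260))]

    # Step 2: descriptive information is referenced by distance from the start.
    events = [("spawning habitat", 50.0), ("gauge station", 150.0)]

    def locate(route, measure):
        """Return the (x, y) point at the given distance along the route."""
        run = 0.0
        for seg in route:
            seg_len = length(seg)
            if run + seg_len >= measure:
                t = (measure - run) / seg_len    # fraction along this segment
                (x1, y1), (x2, y2) = seg
                return (x1 + t * (x2 - x1), y1 + t * (y2 - y1))
            run += seg_len
        raise ValueError("measure exceeds route length")

    for name, measure in events:
        print(name, "at", locate(route, measure))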


Annotated Bibliography
This section contains an Annotated Bibliography of papers related to the area of
spatial and locational data quality. Presented below are details of these various
papers including: the title of the paper, journal article, or book chapter; the name
of the author(s); a categorization of what the source document is; a complete
citation for the item; and a brief summary description of the item.

Quick Reference Table for Annotated Bibliography


[Amrhein & Schut 1990]  Data Quality Standards and Geographic Information Systems
[Backe 1996]  Formal Spatial Data Standards: What are They and Who Does Them
[Beard & Buttenfield 1999]  Detecting and Evaluating Errors by Graphical Methods
[Bissex et al. 1990]  Quality Assurance for Geographic Information Systems
[Burrough & McDonnell 1998]  Errors and Quality Control
[Chong et al. 1993]  A Field Check Sampling Procedure to Evaluate the Positional Accuracy of Digital Landbases
[Chrisman 1995]  Living With Error in Geographic Data: Truth and Responsibility
[Dobson 1993]  Commentary: A Conceptual Framework for Integrating Remote Sensing, GIS and Geography
[Donohoo 1990]  Cartographic Quality Control: No Longer Optional for Today's GIS Programs
[Elmes & Cai 1994]  Structural Reasoning for Spatial Database Accuracy Assessment
[Fegeas 1992]  An Overview of FIPS 173, the Spatial Data Transfer Standard
[Fisher 1999]  Models of Uncertainty in Spatial Data
[Garza & Foresman 1991]  Embedding Quality into Countrywide Data Conversion
[Godden 1996]  Quality Control for GIS Conversion Projects
[Grady 1990]  The Lineage of Data in Land and Geographic Information Systems (LIS/GIS)
[Greve et al. 1993]  Investigating US Geological Survey Needs for the Management of Temporal GIS Data
[Heuvelink 1999]  Propagation of Error in Spatial Modelling with GIS
[Hintz et al. 1996]  Trends in Next Generation Electronic Survey Data Collection
[Hunter 1999]  New Tools for Handling Spatial Data Quality: Moving from Academic Concepts to Practical Reality
[Hunter & Williamson 1990]  The Need for a Better Understanding of Spatial Databases
[Kuehlthau 1990]  Data Structures for Data Integration
[Lundin 1989]  Data Quality Reporting Methods for Digital Geographical Products at Statistics Canada
[Mark et al. 1993]  Data Requirements for Route Guidance
[Matson et al. 1996]  Development of Mapping Grade Global Positioning System Data Collection System and Documentation Standards in North Carolina
[Newcomb et al. 1993]  Data Requirements for Route Guidance
[Ng & Shi 1993]  Integration of Qualitative and Quantitative Information for Spatial Query
[Ngan 1995]  Digital Quality Control for Manual Digitizing Operations
[Ostman 1996]  Quality Systems for Spatial Data
[Paradis & Beard 1994]  Visualization of Spatial Data Quality for the Decision-Maker: A Data Quality Filter
[Peng & Dueker 1993]  Error and Accuracy in Spatial Data Allocation
[Ries 1993]  Design Requirements for Location as a Foundation for Transportation Information Systems
[Stefandis & Agouris 1996]  Integrated Photogeographic Databases
[Thapa & Bossler 1992]  Accuracy of Spatial Data Used in Geographic Information Systems
[Veregin 1999]  Data Quality Parameters
[Wellar 1972]  Standardization: Issues and -----------?
[Wong & Wu 1996]  Spatial Metadata and GIS for Decision Support
[Worboys 1998]  Computation with Imprecise Geospatial Data
[Wu & Buttenfield 1994]  Spatial Data Quality and its Evaluation
[Zhao 1997]  Temporal GIS Potentials and Challenges

Title:
Uncertainty in Geographic Data and GIS-Based Analyses
Author:
Category: Web Page (http://www.ncgia.ucsb.edu/other/ucgis/research_priorities/paper9.html )
Complete Citation:
"Uncertainty in Geographic Data and GIS-Based Analyses," research paper of UC
Santa Barbara.
Description:
This page contains an article about "Uncertainty in Geographic Data and
GIS-Based Analyses," which describes how uncertainty propagates through data
analyses based on GIS. It further presents strategies for identifying,
quantifying, tracking, reducing, and reporting uncertainty in geographic data
and GIS-based analyses. A standardized means by which uncertainty can be
addressed in daily GIS applications is also proposed.

Title:
Data Quality Standards and Geographic Information Systems
Author: Amrhein, C. G. and Schut, P.
Category: Conference Proceedings
Complete Citation:
Amrhein, C. G. and Schut, P., "Data Quality Standards and Geographic
Information Systems," Proceedings of National Conference GIS for the 1990s,
Canadian Institute of Surveying and Mapping, pp. 918-930, March 5-8, 1990.
Description:
This paper discusses the range of errors that can accompany any data set, and
the comprehensive statements of data quality that users need.
Reference: [Amrhein & Schut 1990]

Title:
Formal Spatial Data Standards: What are They and Who Does Them
Author: Backe, K.
Category: Conference Proceedings
Complete Citation:
Backe, K., "Formal Spatial Data Standards: What are They and Who Does Them,"
Proceedings of ASPRS/ACSM, Volume 1, Remote Sensing and Photogrammetry,
Baltimore, Maryland, pp. 111, April 22-25, 1996.
Description:
A standard is defined by the International Standards Organization (ISO) to be
an agreement containing technical specifications. Spatial data standards are
agreements that precisely specify how real world things are captured,
represented, and encoded as digital spatial data; how this data is described
for use; how it is processed; and how it is exchanged.
Spatial data consumers and producers need standards for spatial data to avoid
the costs associated with duplicative data collection and exploitation software
development. Hundreds of standards now exist for spatial data because, until
recently, there were no formally recognized spatial data standards bodies.
Producers and consumers developed their own standards to satisfy their
application or community's requirements.
The availability of computers and software will expand exponentially the
already growing appetite for spatial data. The good news is that a spatial data
standards infrastructure has emerged in recent years through the efforts of a
number of professional societies, states, and regional organizations. This
process promotes more robust standards that support a number of communities
and applications.
Reference: [Backe 1996]

Title:
Detecting and Evaluating Errors by Graphical Methods
Author: Beard, M. K. and Buttenfield, B. P.
Category: Book Chapter
Complete Citation:
Beard, M. K. and Buttenfield, B. P., "Detecting and Evaluating Errors by
Graphical Methods," Geographic Information Systems: Principles and Technical
Issues, Vol. 1, John Wiley & Sons, Chapter 15, pp. 219-233, 1999.
Description:
This chapter covers detecting and evaluating errors by graphical methods. It
states that since errors are inherent in spatial databases, the process of
observing, measuring, interpreting, classifying, and analyzing data gives rise
to systematic and random errors. Casual users of GIS are often unaware of
these errors. The authors outline a rationale for the use of graphical methods,
highlight several historical and recent examples, develop a framework linking
error analysis and graphical methods, and point to research challenges for the
future and the potential for new techniques arising from technical innovations.
Reference: [Beard & Buttenfield 1999]

Title:
Quality Assurance for Geographic Information Systems
Author: Bissex, D., Franks, C. and Heitkamp, A.

Category: Conference Proceedings


Complete Citation:
Bissex, D., Franks, C. and Heitkamp, A., "Quality Assurance for Geographic
Information Systems," Proceedings of the Urban and Regional Information
Systems Association Conference, Volume 2, 1990.
Description:
This paper reports how a U.S. Environmental Protection Agency project team
instituted QA/QC standards for the development of GIS products while studying
the impact of waste facilities on the environment of New England. A complete
QA/QC plan was developed and broken out into essential components that address
many issues, ranging from system documentation to quality control procedures.
The paper can serve as a reference when developing GIS QA/QC guidelines.
Reference: [Bissex et al. 1990]

Title:
Errors and Quality Control
Author: Burrough, P. A. and McDonnell, R. A.
Category: Book Chapter
Complete Citation:
Burrough, P. A. and McDonnell, R. A., "Errors and Quality Control," Principles
of Geographic Information Systems, Oxford University Press, Chapter 9,
pp. 220-240, 1998.
Description:
One chapter of the book concentrates on the errors that occur in spatial data
and the effects they may have on spatial data analysis and modeling. These
errors are not merely blunders and gaffes; they are intrinsic
parts of data and computational models. Sources of errors in spatial
data, the factors affecting the reliability of spatial data, and various
methods for estimating errors for quality control purposes are also
presented.
Reference: [Burrough & McDonnell 1998]

Title:
A Field Check Sampling Procedure to Evaluate the Positional Accuracy of
Digital Landbases
Author: Chong, A. K.
Category: Conference Proceedings
Complete Citation:
Chong, A. K., "A Field Check Sampling Procedure to Evaluate the Positional
Accuracy of Digital Landbases," Proceedings of ASPRS/ACSM, Volume 2, Annual
Convention & Exposition of Technical Papers, Seattle, Washington, pp. 1,
April 7-10, 1997.


Description:
Different ways a traditional cartographic land base can be captured are
discussed in this paper. It also reports the continuing popularity of
electronic sensors and photographic systems because they offer
superior image resolution and predictable systematic errors. It sheds
light on the fact that when two or more types of imagery are used for
a land base image, a significant variation in the positional accuracy
can occur. To overcome this, a method is described which would
help in determining the checkpoints by using error propagation
theory. The locations of these checkpoints are randomly generated
to obtain a non-biased evaluation of the overall image land base.
Reference: [Chong et al. 1993]

Title:
Living With Error in Geographic Data: Truth and Responsibility
Author: Chrisman, N.
Category: Conference Proceedings
Complete Citation:
Chrisman, N., "Living With Error in Geographic Data: Truth and
Responsibility," Annual Symposium on Geographic Information Systems in Natural
Resources Management, Vancouver, British Columbia, Canada, pp. 12-17,
March 27-30, 1995.
Description:
This paper states that error in GIS cannot be avoided, but we can try
to minimize it within allowable limits. Chrisman explains that the user
must take responsibility for judging the components of data quality in
terms of their fitness for a particular use. He explains that the
measurement process in GIS involves choices between attribute and
spatial components. He illustrates this using a very good parable in
which the data is easily misinterpreted.
Reference: [Chrisman 1995]

Title:
Commentary: A Conceptual Framework for Integrating Remote Sensing, GIS and
Geography
Author: Dobson, J. E.
Category: Journal Paper
Complete Citation:
Dobson, J. E., "Commentary: A Conceptual Framework for Integrating Remote
Sensing, GIS and Geography," Photogrammetric Engineering and Remote Sensing,
Vol. 59, No. 10, pp. 1491, October 1993.
Description:
The author discusses the integration of GIS, remote sensing, and geography,
and the technical issues that arise when such an integration is carried out.
The paper describes the elements that are affected by the integration,
including factors such as cultural aspects, temporal change, and spatial
interaction.
Reference: [Dobson 1993]

Title:
Cartographic Quality Control: No Longer Optional for Today's GIS Programs
Author: Donohoo, M. S.
Category: Conference Proceedings
Complete Citation:
Donohoo, M. S., "Cartographic Quality Control: No Longer Optional for Today's
GIS Programs," Proceedings of AM/FM Conference XIII, pp. 78-87, April 1990.
Description:
This report emphasizes that cartographic quality control cannot be
considered an option in the creation of a GIS. A highly skilled
cartographic editor should be assigned whose sole responsibility is
executing a strategic plan to ensure that accuracy, completeness,
consistency and aesthetics are monitored continuously throughout
GIS development. Successful quality control programs encompass
editing aerial photography; gathering quality control materials, editing
compiled information, checking aesthetics, ensuring that map sheets
match, editing contours, reviewing corrected data, generating client
review plots, and reviewing and submitting the final GIS products.
Tools for quality control include check plots, photographic enlargements,
existing source documents, score sheets, data layer validation programs, and
quality control processes. More and more municipalities, utility companies,
and other organizations select only GIS contractors with proven quality
control programs.
Reference: [Donohoo 1990]

Title:
Structural Reasoning for Spatial Database Accuracy Assessment
Author: Elmes, G. and Cai, G.
Category: Conference Proceedings
Complete Citation:
Elmes, G. and Cai, G., "Structural Reasoning for Spatial Database Accuracy
Assessment," International Symposium on Spatial Accuracy of Natural Resource
Databases, pp. 141, May 16-20, 1994.
Description:
Estimation of uncertainty is a product of GIS information, according to these
authors. A three-phase error handling process is proposed, comprised of error
structure learning, priority scheduling, and detailed modeling. Pair-wise
comparison of error generating paths is used to determine priorities for
detailed modeling.
Reference: [Elmes & Cai 1994]

Title:
An Overview of FIPS 173, the Spatial Data Transfer Standard
Author: Fegeas, R.; Cascio, J.; and Lazar, R.
Category: Conference Proceedings
Complete Citation:
Fegeas, R., Cascio, J., and Lazar, R., "An Overview of FIPS 173, the Spatial
Data Transfer Standard," Proceedings of National Conference 'Challenge for the
1990s,' Canadian Institute of Surveying and Mapping, pp. 381-390,
Feb 27-Mar 3, 1992.
Description:
Following nine years of development, the Spatial Data Transfer Standard (SDTS)
was approved on July 29, 1992 as FIPS Publication 173. The SDTS consists of
three parts. Part one is concerned with the logical specifications required for
spatial data transfer and has three main components: a conceptual model of
spatial data, data quality report specifications, and detailed logical transfer
format specifications for SDTS data sets. Part two provides a model for the
definition of real world spatial features, attributes, and attribute values,
and includes a standard but working and expandable list with definitions. Part
three specifies the byte-level format implementation of the logical
specifications in SDTS Part 1 using ISO/ANSI 8211 (FIPS 123), a general data
exchange standard.
Reference: [Fegeas et al. 1992]

Title:
Models of Uncertainty in Spatial Data
Author: Fisher, P. F.
Category: Book Chapter
Complete Citation:
Fisher, P. F., "Models of Uncertainty in Spatial Data," Geographic Information
Systems: Principles and Technical Issues, Vol. 1, John Wiley & Sons,
Chapter 13, pp. 191-205, 1999.
Description:
This chapter talks primarily about uncertainty in the spatial data in
terms of accuracy. It documents error, vagueness and ambiguity to
define uncertainty. The author endeavors to make the picture clear
by giving illustrations of different classes in which errors may arise. It
tries to show ways to control uncertainty and to distinguish between
vagueness and errors. It concludes that spatial data inherently contains
uncertainty, and that data must be used carefully to minimize it.
Reference: [Fisher 1999]

Title:
Embedding Quality into Countrywide Data Conversion
Author: Garza, R. J. and Foresman, T.
Category: Conference Proceedings
Complete Citation:
Garza, R. J. and Foresman, T., "Embedding Quality into Countrywide Data
Conversion," GIS/LIS Conference Proceedings, pp. 130, 1991.
Description:
A standard development methodology of quality planning and
implementation for a local government GIS network is introduced
here. The basis for this methodology is Dr. W. Edwards Deming's
philosophy for the improvement of quality, productivity, and
competitive position. Deming's components of quality control are
described as they relate to the conversion of the Clark County parcel
layer. The Clark County QA value system is illustrated as a tool for
the improvement and future enhancement of the parcel layer. This
metadata component also serves to promote awareness of reliability
issues and varying quality for layers which are not of homogenous
origin.
Reference: [Garza & Foresman 1991]

Title:
Quality Control for GIS Conversion Projects
Author: Godden, R.
Category: Conference Proceedings
Complete Citation:
Godden, R., "Quality Control for GIS Conversion Projects," Proceedings of
ASPRS/ACSM, Volume 1, Remote Sensing and Photogrammetry, Baltimore, Maryland,
pp. 674, April 22-25, 1996.
Description:
A high quality, reliable, and comprehensive database is important for any
successful implementation of GIS technology. A great deal of effort and
expertise is required to create superior, large-scale databases. To achieve
that, most companies look to the GIS mapping and conversion industry for
assistance. Without an organized conversion management/quality control
program, the end user runs a significant risk of failure.
The purpose of this paper is to identify the main points of a successful GIS
project for end users. The key elements:

Design: The importance of the physical database design, including
characteristics, relationship to applications, and establishment of item
definitions.
Specification: A detailed specification should require accuracy standards for
spatial and attribute data as well as for the conversion process. One should
also consider obtaining a Procedures Manual.
Schedule: How to develop a realistic project schedule based upon the rate at
which deliverables can be produced and reviewed. Hard copy and digital
techniques should be used.
Resources: A thorough QC program requires significant personnel, time, and
equipment resources.
Pilot Projects: Designing and managing a successful pilot. The purpose
includes setting goals, selecting the best area, assessing results, and
finalizing the Procedures Manual.
Production: Monitoring quality throughout the production phase requires
consistency, documentation, tracking deliverables and source materials, and
well understood acceptance/rejection criteria.
This work addresses those persons responsible for planning and
overseeing the data conversion process.
Reference: [Godden 1996]

Title:
The Lineage of Data in Land and Geographic Information Systems (LIS/GIS)
Author: Grady, R.
Category: Journal Paper
Complete Citation:
Grady, R., "The Lineage of Data in Land and Geographic Information Systems
(LIS/GIS)," Journal of the Urban and Regional Information Systems Association,
Vol. 2, Fall 1990.
Description:
The importance of recording and tracking information about sources
and processing steps for GIS data is emphasized as a part of a data
quality report. The author cites traditional problems with accurate
recording and reporting of lineage information, and makes arguments
(both technical and institutional) for developing better standards and
procedures for managing lineage information. The article also deals with some
practical considerations in establishing better data records and with the
temporal aspects of this issue.
Reference: [Grady 1990]


Title:
Investigating US Geological Survey Needs for the Management of Temporal GIS
Data
Author: Greve, C. W., Kelmelis, J. A., Gegeas, R., Guptill, S. C. and Mouat, N.
Category: Journal Paper
Complete Citation:
Greve, C. W., Kelmelis, J. A., Gegeas, R., Guptill, S. C. and Mouat, N.,
"Investigating US Geological Survey Needs for the Management of Temporal GIS
Data," Photogrammetric Engineering and Remote Sensing, Vol. 59, No. 10,
pp. 1503, October 1993.
Description:
This paper emphasizes the need to manage temporal information in
the National Digital Cartographic Database. It suggests obtaining
updates to the digital database on a feature basis, rather than
implementing the traditional method of revising the entire map sheet.
It talks about the importance of the time tag that needs to be recorded
along with the feature. It also discusses the different times - logical or
event time, physical or base time. Finally it provides a preliminary
assessment of US geological survey needs for temporal GIS data.
Reference: [Greve et al. 1993]

Title:
Propagation of Error in Spatial Modelling with GIS
Author: Heuvelink, G. B. M.
Category: Book Chapter
Complete Citation:
Heuvelink, G. B. M., "Propagation of Error in Spatial Modelling with GIS,"
Geographic Information Systems: Principles and Technical Issues, Vol. 1,
John Wiley & Sons, Chapter 14, pp. 207-217, 1999.
Description:
This chapter describes the development, application, and implementation of
error propagation techniques for quantitative spatial data. It discusses the
different stages where errors can develop, from data acquisition in the field
through classification, generalization, and interpretation. It also reviews
different techniques (Taylor series approximation, Monte Carlo
Simulation etc.) to explain the propagation of error in different
phases.
Reference: [Heuvelink 1999]

Title:
Trends in Next Generation Electronic Survey Data Collection
Author: Hintz, R., Roy, K. and Wahl, J.

Category: Conference Proceedings
Complete Citation:
Hintz, R., Roy, K. and Wahl, J., "Trends in Next Generation Electronic Survey
Data Collection," Proceedings of ASPRS/ACSM, Volume 1, Remote Sensing and
Photogrammetry, Baltimore, Maryland, pp. 155, April 22-25, 1996.
Description:
This paper discusses past and present generations of survey data collection.
It goes on to describe the value of computer software in data collection
through numerical analysis and attribute information such as location of
ground points, stationing and offset positioning, and point features. The
paper ends with an example of an ideal data collector.
Reference: [Hintz et al. 1996]

Title:

New Tools for Handling Spatial Data Quality: Moving from Academic
Concepts to Practical Reality
Author: Hunter, G. J.
Category: Journal Paper
Complete Citation:
Hunter, G. J., "New Tools for Handling Spatial Data Quality: Moving from
Academic Concepts to Practical Reality," URISA Journal, Vol. 11, No. 2,
pp. 25-34, Summer 1999.
Description:
The author reports the availability of tools developed by himself and
his colleagues for implementation by users of spatial data. Examples
include:
Tracking feature coordinate edits and reporting them in visual data quality
statements.
Testing and reporting the positional accuracy of linear features of
unknown lineage.
Simulating uncertainty in products derived from Digital Elevation Models.
Incorporating uncertainty modeling in vector, point, line, and polygon files.
Reporting data quality information at different levels of database
structure.
Reference: [Hunter 1999]

Title:
The Need for a Better Understanding of Spatial Databases
Author: Hunter and Williamson
Category: Conference Proceedings
Complete Citation:
Hunter and Williamson, "The Need for a Better Understanding of Spatial
Databases," URISA Proceedings, Annual Conference of the Urban and Regional
Information Systems Association, pp. 121-128, August 12-16, 1990.
Description:
This paper discusses the meaning of words like "quality" and "accuracy" in the
field of GIS, and states the implications of the
spatial data transfer standard in terms of lineage, positional accuracy,
attribute accuracy, consistency, and completeness. The paper ends
with examples justifying the importance of data quality standards in
GIS.
Reference: [Hunter & Williamson 1990]

Title:
Data Structures for Data Integration
Author: Kuehlthau, S. W. and Herring, J. R.
Category: Conference Proceedings
Complete Citation:
Kuehlthau, S. W. and Herring, J. R., "Data Structures for Data Integration,"
Proceedings of National Conference GIS for the 1990s, Canadian Institute of
Surveying and Mapping, pp. 73-86, Mar 5-8, 1990.
Description:
This paper describes the data structures required to integrate various
types, accuracies, and scales of data in order to maintain internal
consistency and consistency between data types.
Reference: [Kuehlthau 1990]

Title:
Data Quality Reporting Methods for Digital Geographical Products at Statistics
Canada
Author: Lundin, B., Yan, J., and Parker, J-P.
Category: Conference Proceedings
Complete Citation:
Lundin, B., Yan, J., and Parker, J-P., "Data Quality Reporting Methods for
Digital Geographical Products at Statistics Canada," Proceedings of National
Conference GIS for the 1990s, Canadian Institute of Surveying and Mapping,
pp. 236-251, Feb 27-Mar 3, 1989.
Description:
A positional accuracy standard, called the Circular Map Accuracy
Standard, and the U.S. NCDS have been adopted.
Reference: [Lundin et al. 1989]


Title:
Development of Mapping Grade Global Positioning System Data Collection System
and Documentation Standards in North Carolina
Author: Matson, K., Thompson, G., Shaffer, K., Campbell, R. and Clapp, L.
Category: Conference Proceedings
Complete Citation:
Matson, K., Thompson, G., Shaffer, K., Campbell, R. and Clapp, L.,
"Development of Mapping Grade Global Positioning System Data Collection System
and Documentation Standards in North Carolina," Proceedings of ASPRS/ACSM,
Volume 1, Remote Sensing and Photogrammetry, Baltimore, Maryland, pp. 122,
April 22-25, 1996.
Description:
This report stresses the importance of GPS technology in pinpointing
a location on earth. It points out that if GPS is to be used effectively
to collect data for a wide number of users, standards are important for
data collection, post-processing, and documentation. It also
discusses the GPS Data Collection and Documentation Standards
that North Carolina has developed to increase its data collection
accuracy which, in turn, contributes valuable information to the multi-user
North Carolina Corporate Geographic Database.
Reference: [Matson et al. 1996]

Title:
Data Requirements for Route Guidance
Author: Newcomb, M., Medan, J. and Smartt, B.
Category: Conference Proceedings
Complete Citation:
Newcomb, M., Medan, J. and Smartt, B., "Data Requirements for Route Guidance,"
GIS-T 93 Geographic Information Systems for Transportation Symposium,
Albuquerque, New Mexico, pp. 209, March 29-31, 1993.
Description:
The importance of route guidance for creating a system that can
intelligently route traffic is discussed in this paper. The paper shows
that the shortest route produced by computer software is not always
the most efficient path to a database containing all the necessary
attributes. The paper goes on to identify the data requirements for
route guidance and describes their effects upon routing algorithms. It
focuses on the components of an accurate road network and tries to
make the reliable route guidance a reality.
Reference: [Newcomb et. al. 1993]

Title:
Integration of Qualitative and Quantitative Information for Spatial Query
Author: Ng, C. and Shi, W.
Category: Conference Proceedings
Complete Citation:
Ng, C. and Shi, W., Integration of Qualitative and Quantitative
Information for Spatial Query," Proceedings of ASPRS/ACSM,
Volume 2, Annual Convention and Exposition of Technical Papers,
Seattle, Washington, pp. 135, April 7-10, 1993.
Description:
This paper tries to integrate both qualitative and quantitative spatial data
to build a bridge between natural language query and geographic spatial
reasoning. The authors argue that employing integration will result in the
development of a more effective and realistic decision support tool for GIS.
Proximity, the qualitative and quantitative nature of spatial data,
multi-context issues, and travel time are also discussed.
Reference: [Ng & Shi 1993]

Title:
Digital Quality Control for Manual Digitizing Operations
Author: Ngan, S.
Category: Conference Proceedings
Complete Citation:
Ngan, S., "Digital Quality Control for Manual Digitizing Operations,"
Proceedings of the Ninth Annual Symposium on Geographic Information Systems,
Vancouver, British Columbia, Canada, pp. 739, March 27-30, 1995.
Description:
This paper attempts to address the issues of digital cartographic data
accuracy, and explores the implementation of a data input system for
the control of errors that may be introduced by the manual capture of
utility data. It states that current manual efforts are slow and error
prone, particularly in the areas of positional and attribute inaccuracy,
logical inconsistency, and incompleteness. The paper presents a database
catalog system that can serve as an effective mechanism for the control of
errors in manual digitizing operations.
Reference: [Ngan 1995]

Title:
Quality Systems for Spatial Data
Author: Ostman, A.
Category: Conference Proceedings
Complete Citation:

Ostman, A., "Quality Systems for Spatial Data," Proceedings of the Second
Joint European Conference & Exhibition, Vol. 1, pp. 268-276, March 27-29, 1996.
Description:
The quality of a product may be defined as its "fitness for use." A
product may be a set of data, a method, or a combination of both.
One goal for a quality system for spatial data is to provide the tools
an end user requires to evaluate the reliability of the achieved result.
Several data quality standardization proposals have been made
during the last few years. The goal is to define easy-to-understand
and implement quality components, common for many different types
of spatial databases.
Due to the generality of the approaches, the quality descriptors proposed here
only provide simple answers to simple questions. For complex questions, where
several different data sets are used in complex analyses, other approaches
have to be taken. Quality systems based on distributed and harmonized
uncertainties are also proposed. It is assumed that a major portion of
uncertainties can be expressed as probabilities. To address this, Monte Carlo
simulators are proposed as a foundation when error propagation studies are
needed. Other general quality services may also be required in the future.
Reference: [Ostman 1996]

Title:
Visualization of Spatial Data Quality for the Decision-Maker: A Data Quality
Filter
Author: Paradis, J. and Beard, K.
Category: Journal Paper
Complete Citation:
Paradis, J. and Beard, K., "Visualization of Spatial Data Quality for the
Decision-Maker: A Data Quality Filter," URISA Journal, Vol. 6, No. 2,
pp. 25-34, March 1994.
Description:
This paper defines a data quality filter that efficiently organizes and
communicates data quality to the decision-maker. The filter relates the data
quality information directly to the visualization of data, providing an
implicit yet precise portrayal of the data's fitness for use. To apply the
filter, the user defines a set of quality requirements with respect to
accuracy, resolution, consistency, and lineage.
Reference: [Paradis & Beard 1994]


Title:
Error and Accuracy in Spatial Data Allocation
Author: Peng, Z. and Dueker, K.
Category: Conference Proceedings
Complete Citation:
Peng, Z. and Dueker, K., "Error and Accuracy in Spatial Data Allocation,"
GIS/LIS Proceedings, Minneapolis, pp. 592-603, November 2-4, 1993.
Description:
This paper describes spatial data allocation in GIS and its application
in spatial data integration. It also describes various methods of spatial data
allocation and compares the errors associated with different spatial data
allocation methods. Finally, it discusses factors affecting errors,
and develops an index of population density distribution, which is an
important factor affecting accuracy.
Reference: [Peng & Dueker 1993]

Title:
Design Requirements for Location as a Foundation for Transportation
Information Systems
Author: Ries, T.
Category: Conference Proceedings
Complete Citation:
Ries, T., "Design Requirements for Location as a Foundation for Transportation
Information Systems," GIS-T 93 Geographic Information Systems for
Transportation Symposium, Albuquerque, New Mexico, pp. 48, March 29-31, 1993.
Description:
The Wisconsin Department of Transportation (WiDOT) Division of
Highways conducted an analysis of an information strategy plan
called Location Control Management (LCM). It concluded that
location can be divided into three categories: geodetic, geographical
and linear. The paper describes the design requirements and
proposed WiDOT solutions for the LCM linear level. It suggests that
the GIS-T community should consider these requirements when
developing the GIS-T standards.
Reference: [Ries 1993]

Title:
Integrated Photogeographic Databases
Author: Stefandis, A. and Agouris, P.
Category: Conference Proceedings
Complete Citation:
Stefandis, A. and Agouris, P., "Integrated Photogeographic
Databases," Proceedings of ASPRS/ACSM, Volume 1, Remote Sensing and
Photogrammetry, Baltimore, Maryland, p. 32, April 22-25, 1996.
Description:
This paper addresses the role of digital photogrammetry within the
current trend toward integrated photogeographic databases, which
consist of photos and maps in digital format combined with relevant
information in raster or vector form. State-of-the-art digital
photogrammetric research issues are discussed, with a focus on
automatic orientation, aerotriangulation, and man-made object
extraction. Current research activities in terms of accuracy,
efficiency, and productivity are also covered.
Reference: [Stefandis & Agouris 1996]

Title:
Accuracy of Spatial Data Used in Geographic Information Systems
Author: Thapa, K. and Bossler, J.
Category: Journal Paper
Complete Citation:
Thapa, K. and Bossler, J., "Accuracy of Spatial Data Used in
Geographic Information Systems," Photogrammetric Engineering and
Remote Sensing, American Society for Photogrammetry and Remote
Sensing, Vol. 58, No. 6, pp. 835-841, June 1992.
Description:
The authors first discuss the phases of GIS, which consist of the
collection, management, display, and analysis of spatial data. They
also note that data quality and accuracy requirements differ from
application to application, and then discuss the different types of
errors encountered in the primary and secondary methods of data
collection. Different standards and specifications used in the primary
methods of data collection are also explained. Finally, a comparison
between the primary and secondary methods of data collection is made.
Reference: [Thapa & Bossler 1992]

Title:
Data Quality Parameters
Author: Veregin, H.
Category: Book Chapter
Complete Citation:
Veregin, H., "Data Quality Parameters," Chapter 12 in Geographic
Information Systems: Principles and Technical Issues, Vol. 1, John
Wiley & Sons, pp. 177-189, 1999.
Description:
The chapter starts by answering questions such as: What is data
quality? What are its components? And how do you define those
components? It then discusses the components of data quality:
accuracy, precision, consistency, and completeness. The treatment of
quality components in data standards and the implications of
cartographic bias in geospatial data models are briefly addressed. The
chapter ends with a discussion of the ways in which institutional
values are embedded in geospatial databases and the ways in which data
quality documentation can help to articulate these values.
Reference: [Veregin 1999]

Title:
Standardization: Issues and -----------?
Author: Wellar, B.
Category: Conference Proceedings
Complete Citation:
Wellar, B., "Standardization: Issues and -----------?," Proceedings of
the Urban and Regional Information Systems Association Conference,
pp. 429-444, 1972.
Description:
This paper discusses important philosophical and practical tenets of
standardization within the context of statistical data generated and
used by government organizations. The author presents arguments
illustrating why data standards are important and how they can
increase efficiency in the administration of government programs.
Institutional obstacles working against the adoption of data standards
are also discussed. Standards issues are presented within the context
of five phases of system development: specification, acquisition,
storage-retrieval-manipulation, dissemination, and applications.
Reference: [Wellar 1972]

Title:
Spatial Metadata and GIS for Decision Support
Author: Wong, D. W. S. and Wu, C. V.
Category: Conference Proceedings
Complete Citation:
Wong, D. W. S. and Wu, C. V., "Spatial Metadata and GIS for
Decision Support," Proceedings of the Twenty-ninth Hawaii
International Conference on System Sciences (HICSS 29), Vol. 3,
pp. 557-566, March 1996.
Description:
This paper argues that current GIS data quality standards are not
adequate to document the spatial variation of data quality over a
geographical area, which can be regarded as spatial metadata. Spatial
metadata should be derived and reported to help users of spatial data
make intelligent spatial decisions and formulate policy. The paper
proposes GIS as a logical tool for assessing certain types of error in
spatial databases, because GIS is widely used to gather, manipulate,
analyze, and display spatial data. A framework is proposed for
deriving several types of data quality information using GIS,
including positional accuracy, completeness, attribute accuracy, and,
to some extent, logical consistency. It emphasizes that not all types
of data quality information can be derived using GIS.
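One way such spatially varying quality information might be derived is
sketched below with hypothetical data; this is an illustration, not
the authors' framework. Positional accuracy is computed as an RMSE per
region from check points, rather than as a single global figure:

    import math
    from collections import defaultdict

    def rmse_by_region(checkpoints):
        """Spatially varying positional accuracy: RMSE of distances
        between digitized points and higher-accuracy reference points,
        reported per region rather than as one global value."""
        sq = defaultdict(list)
        for region, (x, y), (rx, ry) in checkpoints:
            sq[region].append((x - rx) ** 2 + (y - ry) ** 2)
        return {r: math.sqrt(sum(d) / len(d)) for r, d in sq.items()}

    # (region, digitized coordinate, reference coordinate) triples:
    pts = [
        ("north", (100.0, 200.0), (101.2, 199.5)),
        ("north", (150.0, 250.0), (149.0, 250.8)),
        ("south", (300.0, 120.0), (300.2, 120.1)),
    ]
    print(rmse_by_region(pts))  # per-region accuracy metadata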
Reference: [Wong & Wu 1996]

Title:
Computation with Imprecise Geospatial Data
Author: Worboys, M.
Category: Journal Paper
Complete Citation:
Worboys, M., "Computation with Imprecise Geospatial Data,"
Computers, Environment and Urban Systems, pp. 85-106, March
1998.
Description:
Imprecision in spatial data arises from the granularity or resolution
at which observations of phenomena are made, and from the limitations
imposed by computational representation, processing, and presentation
media. Precision is an important component of spatial data quality and
a key to the appropriate integration of collections of data sets. The
author's previous work provides a theoretical foundation for the
imprecision of spatial data resulting from finite granularities, and
gives the beginnings of an approach to reasoning with such data using
methods similar to rough set theory. This paper further develops the
theory and extends the work to a model that includes both spatial and
semantic components. Notions such as observation, schema, the frame of
discernment, and vagueness are examined and formalized.
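The rough-set flavor of this approach can be suggested with a small,
purely illustrative sketch. At a finite grid granularity a region is
represented by a lower approximation (cells certainly inside) and an
upper approximation (cells possibly inside); the gap between the two
is the zone of imprecision, and it shrinks as granularity is refined:

    def approximations(region_contains, cell_size, extent):
        """Classify grid cells into a lower approximation (all sampled
        corners inside the region) and an upper approximation (at
        least one corner inside) at the given granularity."""
        xmin, ymin, xmax, ymax = extent
        lower, upper = set(), set()
        x = xmin
        while x < xmax:
            y = ymin
            while y < ymax:
                corners = [(x, y), (x + cell_size, y),
                           (x, y + cell_size),
                           (x + cell_size, y + cell_size)]
                inside = [region_contains(px, py) for px, py in corners]
                if all(inside):
                    lower.add((x, y))
                if any(inside):
                    upper.add((x, y))
                y += cell_size
            x += cell_size
        return lower, upper

    # A disc of radius 5 observed at two granularities; the coarser
    # the grid, the larger the total area of imprecision.
    disc = lambda px, py: px * px + py * py <= 25.0
    for size in (2.0, 0.5):
        lo, up = approximations(disc, size, (-6.0, -6.0, 6.0, 6.0))
        print(size, "imprecise area:", len(up - lo) * size * size)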
Reference: [Worboys 1998]

Title:
Spatial Data Quality and its Evaluation
Author: Wu, C. V. and Buttenfield, B.
Category: Journal Paper
Complete Citation:
Wu, C. V. and Buttenfield, B., "Spatial Data Quality and its
Evaluation," Vol. 18, pp. 153-165, 1994.
Description:
This paper reviews recent concepts of data quality assessment and
presents a model for data quality evaluation. It argues that data
quality may be assessed in a static manner, by quantitative or
qualitative testing, or in an operational manner, by tracking the
processing steps that have been applied. Data quality evaluation
informs the user's decision regarding the data's fitness for use. The
paper breaks the evaluation process into four steps, with specific
tasks assigned to each step.
Reference: [Wu & Buttenfield 1994]

Title:
Temporal GIS Potentials and Challenges
Author: Zhao, F.
Category: Conference Proceedings
Complete Citation:
Zhao, F., "Temporal GIS Potentials and Challenges," Proceedings of
GIS-T, Greensboro, NC, p. 155, 1997.
Description:
This paper discusses the applications and challenges of temporal GIS.
It notes that conventional GIS analyses must be performed at a fixed
time point, so longitudinal analysis is laborious and cannot be easily
accomplished. If GIS takes both the spatial and the temporal nature of
data into consideration, it will greatly expand current GIS
applications and allow new information to be obtained or derived.
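A minimal illustration of the temporal-GIS idea; the data structure
and queries are hypothetical, not from the paper. Attaching valid-time
intervals to attribute values supports both fixed-time-point snapshots
and the longitudinal queries that conventional GIS makes difficult:

    import datetime

    # Each record: (value, valid-from, valid-to). A temporal GIS
    # stores such histories instead of a single current value.
    speed_limit_history = [
        (55, datetime.date(1990, 1, 1), datetime.date(1996, 6, 30)),
        (65, datetime.date(1996, 7, 1), datetime.date.max),
    ]

    def value_at(history, when):
        """Snapshot query: the value valid at a fixed time point."""
        for value, start, end in history:
            if start <= when <= end:
                return value
        return None

    def values_between(history, start, end):
        """Longitudinal query: every value valid at any time in
        the interval [start, end]."""
        return [v for v, s, e in history if s <= end and e >= start]

    print(value_at(speed_limit_history, datetime.date(1995, 5, 1)))
    # 55
    print(values_between(speed_limit_history,
                         datetime.date(1995, 1, 1),
                         datetime.date(2000, 1, 1)))
    # [55, 65]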
Reference: [Zhao 1997]
