Sensors 2018, 18, 586
Article
Using Unmanned Aerial Vehicles in Postfire
Vegetation Survey Campaigns through Large and
Heterogeneous Areas: Opportunities and Challenges
José Manuel Fernández-Guisuraga, Enoc Sanz-Ablanedo, Susana Suárez-Seoane and Leonor Calvo
Abstract: This study evaluated the opportunities and challenges of using drones to obtain
multispectral orthomosaics at ultra-high resolution that could be useful for monitoring large and
heterogeneous burned areas. We conducted a survey using an octocopter equipped with a Parrot
SEQUOIA multispectral camera in a 3000 ha framework located within the perimeter of a megafire in
Spain. We assessed the quality of both the camera raw imagery and the multispectral orthomosaic
obtained, as well as the required processing capability. Additionally, we compared the spatial
information provided by the drone orthomosaic at ultra-high spatial resolution with another image
provided by the WorldView-2 satellite at high spatial resolution. The drone raw imagery presented
some anomalies, such as horizontal banding noise and non-homogeneous radiometry. Camera
locations showed a lack of synchrony of the single frequency GPS receiver. The georeferencing process
based on ground control points achieved an error lower than 30 cm in X-Y and lower than 55 cm in Z.
The drone orthomosaic provided more information in terms of spatial variability in heterogeneous
burned areas in comparison with the WorldView-2 satellite imagery. The drone orthomosaic could
constitute a viable alternative for the evaluation of post-fire vegetation regeneration in large and
heterogeneous burned areas.
1. Introduction
Natural hazards, such as wildfires, constitute a serious global concern that is expected to increase
in the future [1] mainly due to global warming predictions and changes in land use [2,3]. In particular,
the increasing severity and recurrence of large forest fires in Mediterranean Basin ecosystems [4] can
lead to severe long-term land degradation, including desertification [5,6]. Thus, post-fire monitoring
of these systems through different tools should be a priority for management purposes [7].
Advances in geospatial technologies have led to an increase in the utilization of remote sensing
techniques [3], which represent a major opportunity for conducting post-fire surveys in large and
heterogeneous burned ecosystems [8]. High spatial resolution satellite imagery, such as that provided
by Deimos-2, GeoEye-2, QuickBird or WorldView-2 on-board sensors, among others, has been used
to assess post-fire regeneration in terms of fractional vegetation cover [8], species richness [9] or the
basal area of tree species [10]. Nevertheless, satellite imagery shows certain weaknesses that could
limit its applicability in the post-fire monitoring of highly heterogeneous and dynamic areas. First,
the revisit periods of satellite platforms cannot be user-controlled for short-term time series monitoring
of plant communities with a fast regeneration rate, such as shrublands, after the occurrence of a
disturbance [11]. Second, satellite imagery may be seriously affected by cloud cover and its projected
shadows [12]. Third, even though the spatial resolution of multispectral satellite imagery has been
improved, resolutions below one meter have not been achieved, which could become a problem when
monitoring certain biophysical properties in spatially heterogeneous ecosystems [11]. For their part,
sensors aboard manned platforms, such as aircraft, can also be used to conduct post-fire surveys on
demand, but regular monitoring is constrained because of the high economic costs [13].
The use of lightweight unmanned aerial vehicles (UAVs) usually implies lower economic costs
than other remote sensing techniques when surveying relatively small areas [11,13–16] and their low
flight speed and flight altitude enable ultra-high spatial resolution (better than 20 cm) imagery [11]
to be taken. In burned areas, vegetation recovery is not homogeneous due to the spatial variation in
fire regime, pre-fire plant community composition and environmental characteristics [17]. Therefore,
the use of ultra-high spatial resolution imagery would allow for conducting studies at the population
level [18] to assess the effectiveness of post-fire management actions such as seedling plantation
strategies or seedling recruitment. Moreover, UAVs are flexible in terms of attaching different kinds of
sensors (e.g., RGB, multispectral or LiDAR), also allowing the operator to schedule the exact flight
time to gather data over target areas [19]. Nevertheless, UAV imagery may be difficult to manage
because of its ultra-high spatial resolution [20] and the platform does not allow for a simultaneous
image acquisition of large areas [19] and has a limited flight time [12].
Several research projects have used UAV on-board sensors for wildlife population monitoring [21–26],
estimation of forest structural parameters [27–32], individual tree or species mapping [33–36], estimation
of post-fire vegetation recovery from digital surface models [37], estimation of fire severity [18,38] and
forest fire detection [39]. However, to our knowledge, the operational and processing challenges in
the generation of multispectral mosaics derived from rotor-based UAV imagery that allow very large
burned areas to be monitored have not been assessed yet. In addition, it would be necessary to know
the pros and cons of this tool on large burned surfaces in relation to fine-grained satellite imagery.
In this context, the comparison between the spatial and spectral information provided by UAVs and
satellites on heterogeneous surfaces would be essential to determine their suitability for representing
fine scale ground variability. Some authors have compared the spatial information provided by
ultra-high spatial resolution imagery captured by UAVs and high spatial resolution satellite imagery,
for instance that provided by WorldView-2 satellite on agricultural systems such as vineyards [12] or
crops [40], but not to our knowledge in heterogeneous burned areas.
Most common cameras employed in UAV surveys are digital RGB cameras [11,40] or digital
cameras where one of the visible bands has been adapted for NIR imagery acquisition [16,32]. Also,
multispectral cameras, such as Tetracam ADC Lite [41] or MicaSense RedEdge [42], have been chosen
to perform aerial surveys. For its part, the Parrot SEQUOIA (Parrot SA, Paris, France) is a novel and
affordable multispectral camera released on the market in early 2016, whose imagery quality has not
been evaluated in the scientific literature.
The main objective of this study is to evaluate the feasibility of using a rotor-based UAV with an
on-board multispectral sensor (Parrot SEQUOIA) to obtain a multispectral orthomosaic at ultra-high
spatial resolution, which could be useful for forestry management purposes in a heterogeneous
and large burned area (3000 ha). Specifically, we intend to: (1) evaluate the quality of the raw
imagery dataset captured with the Parrot SEQUOIA multispectral camera; (2) discuss the challenges
encountered when dealing with the volume of data at ultra-high spatial resolution generated in the
UAV survey carried out in a large area, and assess both the required processing capability and the
quality of the obtained multispectral mosaic; and (3) compare the spatial information provided by
the UAV ultra-high resolution multispectral mosaic with high spatial resolution satellite imagery
(WorldView-2) in a heterogeneous burned landscape.
2. Materials and Methods

2.1. Study Area

The study area (Figure 1) is a 3000 ha framework located in the central section of a megafire of about 10,000 ha which occurred in a Pinus pinaster stand in Sierra del Teleno (León Province, northwest Spain) in August 2012. The survey framework was representative of the heterogeneity of the fire regime within the perimeter.

Figure 1. UAV survey framework within the megafire perimeter of Sierra del Teleno.

The study area is dominated by an Appalachian relief with prominent quartzite crests, wide valleys with moderate slopes on the upper two thirds of the study area, and sedimentary terraces on the lower third. The mean annual temperature in the area is 10 °C, with an average rainfall of 650 mm. The understory plant community after the occurrence of the megafire is composed of species such as Halimium alyssoides, Pterospartum tridentatum and Erica australis [43], with a great regeneration of Pinus pinaster seedlings.
2.2. UAV Platform and Multispectral Camera

A FV8 octocopter (ATyges, Málaga, Spain, Figure 2) was chosen to perform the aerial survey of a large burned surface of 3000 ha. This UAV is manufactured entirely from carbon fiber and titanium and it weighs 3.5 kg, with a maximum payload mass of 1.5 kg. The eight brushless motors (AXI-ATYGES 2814/22 260 W, with a maximum efficiency of 85%) are powered by two lithium-ion polymer batteries (rated capacity and voltage of 8200 mAh and 14.8 V, respectively). The UAV has a cruising speed of 7 m·s−1 (10 m·s−1 max), with an ascent/descent rate of 5.4 km·h−1 (10.8 km·h−1 max). The maximum interference-free flight range is 3 km, with a flight duration of 10–25 min depending on the payload and weather conditions. The maximum flight height is 500 m above ground level (AGL). The platform is remotely controlled by a 12-channel MZ-24 HoTT radio transmitter (Graupner, Kirchheim unter Teck, Germany) operating at 2.4 GHz. The UAV is equipped with a micro FPV camera with real-time video transmission at 5.8 GHz to a Flysight monitor. The core component of the UAV electronics is an ATmega 1284P flight controller (Microchip Technology Inc., Chandler, AZ, USA) with an integrated pressure sensor, gyroscopes and accelerometers. The navigation control board is based on an Atmel ARM9 microcontroller and it has a MicroSD card socket for waypoint data storage. The GPS module with integrated antenna is a LEA-6 (u-blox, Thalwil, Switzerland). This system allows for autonomous, semi-autonomous and manual takeoffs, landings and flight.
The aerial survey campaign was conducted for 100 h between June and July 2016. All flights (383)
were performed within a 6-h window around the solar zenith to maintain relatively constant lighting
conditions. Though small variations in environmental conditions were rectified with the irradiance
sensor, severe wind or cloud cover were avoided.
Mikrokopter Tools software was used to plan flights, which allowed the operator to generate an
automatic flight route with waypoints depending on the camera’s field of view (FOV), the chosen
forward and side overlap between images and the required ground sample distance (GSD) [45]. A digital elevation model (DEM) was used to keep the same distance AGL in all flight tracks owing to the large difference in altitude (410 m) in the study framework. Flight tracks were uploaded to the UAV for each complete day.
The flight height was fixed at 120 m AGL, providing an average ground resolution of 14.8 cm·pixel−1
given the specific camera characteristics. Each flight had an effective duration of 5–6 min (without
including the takeoff and landing), with an average speed of 10 m·s−1. Battery change time and time
needed to reach each takeoff site were not computed. However, both time lapses were included in the
total flight time of 100 h. The camera trigger interval was set to a platform advance distance of 22.4 m in
order to achieve an 80% forward image overlap at the fixed flight altitude. The planned waypoint route allowed an 80% side image overlap. The image overlap between adjacent flights was at least one flight
line. The quality of the raw imagery dataset acquired during the UAV survey was evaluated to search for
potentially undesired anomalies, such as: (1) horizontal banding noise (HBN) [46]; (2) non-homogeneous
radiometry and issues related with hot-spot or opposition effect [47] or (3) blurring effects [48].
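As an illustration of the trigger-interval geometry described above, the following sketch reproduces the forward-overlap calculation; the camera constants (pixel pitch, focal length and along-track image dimension) are assumed from Parrot's nominal SEQUOIA specifications rather than reported in this paper:

PIXEL_PITCH_M = 3.75e-6    # assumed SEQUOIA pixel pitch (m)
FOCAL_LENGTH_M = 3.98e-3   # assumed SEQUOIA focal length (m)
ROWS_ALONG_TRACK = 960     # assumed image dimension in the flight direction

def ground_sample_distance(height_agl_m):
    # Nadir GSD (m/pixel) from the pinhole-camera scale H/f.
    return PIXEL_PITCH_M * height_agl_m / FOCAL_LENGTH_M

def trigger_distance(height_agl_m, forward_overlap):
    # Platform advance between shots that yields the requested forward overlap.
    footprint = ROWS_ALONG_TRACK * ground_sample_distance(height_agl_m)
    return footprint * (1.0 - forward_overlap)

print(ground_sample_distance(120.0))  # ~0.113 m/pixel at 120 m AGL
print(trigger_distance(120.0, 0.80))  # ~21.7 m, close to the 22.4 m used here

Under these assumed constants the nominal nadir GSD at 120 m AGL is about 11.3 cm; the coarser 14.8 cm·pixel−1 average reported above presumably reflects variation in the effective height above the terrain.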
Figure 3. Processing workflow of the UAV imagery with Pix4Dmapper Pro.
2.5. WorldView-2 High Spatial Resolution Satellite Imagery and Image Comparison Statistical Analysis

A WorldView-2 image acquired on 23 June 2016 for the study framework was used to compare the spatial information provided by the UAV platform with high resolution satellite imagery in a heterogeneous burned landscape. The spatial resolution of the multispectral sensor on-board the WorldView-2 satellite at nadir is 1.84 m, but the image was delivered by DigitalGlobe resampled to 2 m. This sensor has eight bands in the visible and NIR region of the spectrum [55]: coastal blue (400–450 nm), blue (450–510 nm), green (510–580 nm), yellow (585–625 nm), red (630–690 nm), red edge (705–745 nm), NIR1 (770–895 nm) and NIR2 (860–1040 nm). The raw image was orthorectified with a DEM (accuracy better than 20 cm in terms of RMSEZ) and GCPs extracted from PNOA orthophotos. The image atmospheric correction was conducted by the Fast Line-of-sight Atmospheric Analysis of Spectral Hypercubes (FLAASH) algorithm [56]. The HyPARE algorithm implemented in ENVI 5.3 [57] was used to geometrically align the UAV multispectral orthomosaic and the WorldView-2 image, achieving a subpixel RMSE relative to the UAV GSD (<20 cm).

The image comparison was performed on the basis of the reflectance values and the Normalized Difference Vegetation Index (NDVI) of the UAV multispectral orthomosaic and the WorldView-2 image. The UAV multispectral mosaic at original resolution (20 cm) was resampled to a GSD of 1 m (half of the WorldView-2 spatial resolution) and 2 m (the WorldView-2 spatial resolution) with a block average function for the input pixels within a set of non-overlapping windows with the required size (5 × 5 and 10 × 10 pixels). The function was computed with ArcGIS 10.3.1. Pearson bivariate correlations between the UAV multispectral mosaic (GSD of 20 cm, 1 m and 2 m) and the WorldView-2 image (GSD of 2 m) were calculated on each comparable band to assess the spatial information provided by each sensor in our survey framework. To determine the reflectance variability between sensors, we computed the variance in the reflectance values in each band of the UAV (native spatial resolution and resampled) and WorldView-2 images. For the most heterogeneous surface within the survey framework, which covers 1.5 ha, a basic statistic package was calculated on the UAV (at native resolution and 2 m) and WorldView-2 NDVI maps to compare the potentiality of these products in post-fire vegetation monitoring.
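A minimal sketch of this comparison, assuming hypothetical co-registered reflectance arrays rather than the ArcGIS/ENVI products used here, illustrates the NDVI calculation, the block-average resampling and the per-band Pearson correlation:

import numpy as np
from scipy.stats import pearsonr

def ndvi(nir, red):
    # NDVI = (NIR - Red) / (NIR + Red), computed on reflectance bands.
    return (nir - red) / (nir + red + 1e-12)

def block_average(band, factor):
    # Mean of non-overlapping factor x factor windows (factor = 10: 20 cm -> 2 m).
    h = (band.shape[0] // factor) * factor
    w = (band.shape[1] // factor) * factor
    blocks = band[:h, :w].reshape(h // factor, factor, w // factor, factor)
    return blocks.mean(axis=(1, 3))

# Hypothetical co-registered reflectance bands: UAV at 20 cm, WorldView-2 at 2 m.
uav_red = np.random.rand(1000, 1000)
uav_nir = np.random.rand(1000, 1000)
wv2_red = np.random.rand(100, 100)

uav_red_2m = block_average(uav_red, 10)               # 20 cm -> 2 m GSD
r, _ = pearsonr(uav_red_2m.ravel(), wv2_red.ravel())  # per-band correlation
print(r, ndvi(uav_nir, uav_red).var())                # r and UAV NDVI variance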
3. Results

3.1. Raw Imagery Dataset Quality
From 383 UAV flights, we acquired 45,875 images for each band, which made a total of 183,500 raw images that represented approximately 430 GB of information. The normalized UAV images had a balanced contrast. However, the red channel showed some saturation over highly reflective surfaces at this wavelength, such as forest tracks in our study area (Figure 4).
Figure 4. Example normalized UAV images from the dataset corresponding to green (A); red (B); red edge (C) and NIR (D) bands, as well as the image histograms.

A slight horizontal banding noise (HBN) was observed within the four channels of the camera, especially in the green channel (Figure 5). The banding effect was more noticeable at the top and bottom of the image, where differences in the digital levels of alternate rows representing the same object were higher than 10%.
Another undesired effect observed across the imagery was non-homogeneous radiometry across the image related with the Bidirectional Reflectance Distribution Function (BRDF) [47]. In particular, a specific area of the imagery had systematically higher reflectance values than the remaining areas (Figure 6). This radiometric anomaly is commonly denominated the hot-spot or opposition effect [58,59] and it appears as a consequence of the camera and sun position alignment [60]. For its part, the image dataset did not exhibit blurring effects, which are usually associated with camera shaking [15].
Figure 6. Hot-spot effect on green (A); red (B); red edge (C) and NIR (D) bands. Note that the upper left corner (framed in red) has higher reflectance values regardless of the terrain characteristics.
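Masking the hot-spot on a per-image basis, as discussed in Section 4, requires the solar geometry at each exposure. The rough sketch below uses standard declination/hour-angle approximations to estimate solar elevation and azimuth; it is an illustrative assumption, not the photogrammetric procedure of [59]:

import math

def solar_position(day_of_year, solar_hour, lat_deg):
    # Approximate solar elevation and azimuth (degrees, azimuth from north);
    # solar_hour is local solar time (12.0 = solar noon).
    decl = math.radians(23.44) * math.sin(2 * math.pi * (284 + day_of_year) / 365)
    hour_angle = math.radians(15.0 * (solar_hour - 12.0))
    lat = math.radians(lat_deg)
    sin_el = (math.sin(lat) * math.sin(decl)
              + math.cos(lat) * math.cos(decl) * math.cos(hour_angle))
    el = math.asin(sin_el)
    cos_az = (math.sin(decl) - math.sin(lat) * sin_el) / (math.cos(lat) * math.cos(el))
    az = math.degrees(math.acos(max(-1.0, min(1.0, cos_az))))
    if hour_angle > 0:  # afternoon: sun west of the meridian
        az = 360.0 - az
    return math.degrees(el), az

# Example: a midday June flight at the approximate latitude of the study area.
print(solar_position(day_of_year=172, solar_hour=12.5, lat_deg=42.2))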
Figure 7. NDVI mosaic of the UAV survey framework. Blank areas are those masked due to the malfunction of the irradiance sensor.
Initial georeferencing was achieved by introducing the UAV's GPS positions taken at each camera shot in the bundle-block adjustment process within the Pix4D workflow. The precision reported by Pix4D, calculated as the root mean square error (RMSE), was between 1.5–3 m in X-Y and between 2–4 m in Z.
The final georeferencing of the subprojects, based on ground control points (GCPs) extracted from PNOA orthophotos, achieved an RMSE lower than 30 cm in X-Y and lower than 55 cm in Z. Horizontal and vertical accuracy improved over the initial georeferencing by at least 80% and 73%, respectively, after providing evenly distributed GCPs throughout the UAV survey framework.
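These accuracies correspond to the usual check-point RMSE computation, sketched below with hypothetical residuals (observed GCP coordinates minus their positions in the generated products):

import numpy as np

def rmse_xy_z(residuals):
    # residuals: (n, 3) array of dX, dY, dZ (m) at n ground control points.
    rmse_xy = np.sqrt(np.mean(residuals[:, 0] ** 2 + residuals[:, 1] ** 2))
    rmse_z = np.sqrt(np.mean(residuals[:, 2] ** 2))
    return float(rmse_xy), float(rmse_z)

res = np.array([[0.21, -0.10, 0.40],   # hypothetical per-point residuals (m)
                [-0.15, 0.18, -0.35],
                [0.05, -0.22, 0.51]])
print(rmse_xy_z(res))  # (RMSE X-Y, RMSE Z) in metres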
3.3. Comparison of the Spatial Information Provided by UAV and WorldView-2 Imagery
Higher Pearson r values were obtained when the UAV mosaic resolution approached the resolution
of the WorldView-2 image (2 m) (Table 1) for each band of the spectrum. The correlation between the
two remote sensing platforms for each resolution was stronger for the visible region of the spectrum.
Table 1. Pearson correlation results between native and resampled UAV multispectral mosaics and
WorldView-2 multispectral image.
The largest variance in the reflectance values of each band was found for the UAV orthomosaic at
20 cm spatial resolution (Table 2). The variance of the UAV orthomosaic at 2 m spatial resolution
was similar to that of the WorldView-2 image.
Table 2. Variance in reflectance values computed for each band of the original and resampled UAV
multispectral mosaics and the WorldView-2 multispectral image.
The comparison between UAV and WorldView-2 NDVI maps derived from the imagery datasets
at the original resolution of each sensor, corresponding to a heterogeneous surface of 1.5 ha within the
survey framework, revealed greater variability in the UAV pixel values (Figure 8A,B). The horizontal
structure of the vegetation observed in this area (Figure 9A) can be identified in the UAV mosaic
(Figure 9B), but not in the WorldView-2 image (Figure 9C). The UAV NDVI map resampled to 2 m
presented similar variability to the WorldView-2 image (Figure 8B,C).
Figure 8. Histogram and statistics of the UAV (A) and WorldView-2 (B) NDVI map at native resolutions, and of the UAV NDVI map resampled to 2 m (C), corresponding to a 1.5 ha portion within the pine plantation area.
Figure 9. PNOA orthophoto (A), UAV (B) and WorldView-2 (C) NDVI maps at original resolutions, and UAV NDVI map resampled to 2 m (D), corresponding to a heterogeneous surface of 1.5 ha within the survey framework.
4. Discussion
This study evaluated the strengths and limitations of using a rotor-based UAV equipped with a
novel multispectral camera (Parrot SEQUOIA) to conduct a field survey of a large (3000 ha) and
heterogeneous burned surface. Our results indicate that the ultra-high spatial resolution UAV
multispectral orthomosaic represents a valuable tool for post-fire management applications at fine
spatial scales [18]. However, due to the ultra-high spatial resolution of the data and the large size of
the surveyed area, data processing was highly time consuming.
Multispectral cameras onboard UAVs provide countless opportunities for remote sensing
applications, but the technological limitations of these sensors [46] would require evaluation of
the quality of the captured raw imagery data, particularly in novel sensors. In this study, we found that
the raw imagery captured by the Parrot SEQUOIA multispectral camera presented some undesired
radiometric anomalies. In the red channel we observed sensor saturation over highly reflective
surfaces. This effect was not induced by the radiometric downsampling from 10-bit to 8-bit performed by Pix4D during processing, because it was present both in raw (10-bit) and in normalized (8-bit) images.
The horizontal banding noise observed within the four channels of the camera is a common artifact of
CMOS (complementary metal oxide semiconductor) rolling shutter sensors [46]. However, the Parrot
SEQUOIA uses a global shutter system and this effect should not be significant in this multispectral
sensor. To our knowledge, this camera has not been used in previous scientific studies and, therefore,
this issue has not been reported so far. The issues related to the Bidirectional Reflectance Distribution Function (BRDF) effect are magnified in sensors with a wide field of view [61,62] such as the Parrot
SEQUOIA. For its part, the hot-spot or opposition effect was more apparent at shorter wavelengths, as
also highlighted by [47]. Some corrections to mask this effect have been proposed [59] that must be
made individually for each image taking into account the time and position of the image acquisition,
image orientation and solar positioning (azimuth and elevation), following some photogrammetric
steps. Thus, the correction of this radiometric anomaly as well as the BRDF effect is very challenging
and time consuming, becoming an unapproachable task when dealing with large imagery datasets [58].
The absence of a blurring effect in our dataset could be explained by the increased flight stability that
the rotor-based UAVs offer over fixed-wing UAV platforms, also exhibiting fewer vibrations [13,29].
Moreover, the Parrot SEQUOIA camera was attached to the platform with a rubber damper to minimize
vibrations, and the camera acquired imagery with the focal length set to infinity and fast shutter
speed [15], preventing the occurrence of this effect. USB disconnections between the camera and the irradiance sensor could be associated with a poor connection. However, the disconnections did not imply a major problem with the irradiance sensor, considering that it provided complete records for more than 90% of the survey framework under varying atmospheric conditions between adjacent flights, even those performed on different days, since the data acquisition from a rotor-based UAV platform could not be carried out in a single run over large areas due to restrictions in the flight range [12].
In relation to delivery times of on-demand imagery of commercial satellites and the usual times
needed to implement post-fire management strategies within large burned areas [63], the length of
the flight campaign (17 days) and the laboratory processing tasks (20 days) required a reasonable
time. The computational demand of the project was very high due to the large amount of raw image
data collected (183,500 raw images) and its ultra-high spatial resolution. The size of this dataset
caused management difficulties in the laboratory in terms of data storage, backup and processing
capability. This circumstance has already been reported by [20], with data transfer between research teams restricted to physical storage units or processing options such as cloud computing. This computational demand may limit the execution of this type of project to users who have access
to high-end computers to process raw imagery. However, recent advances in computational capacity
would allow a large-scale implementation of this type of workflow [64]. Other remote sensing products
with reduced processing requirements such as on-demand satellite imagery offer a resolution from
pan-sharpening techniques that is increasingly closer to what can be obtained with multispectral
sensors on board UAVs. However, according to [65–67], the use of pan-sharpening techniques presents
several problems such as the appearance of spatial impurities or radiometric distortions in the merged
product. This type of anomaly could represent a serious problem for providing the highest radiometric
and spatial accuracy for fine scale applications. On the other hand, we consider that for this type
of study, a UAV is more versatile than other types of remote sensing platforms, allowing flights to
be carried out in the immediate post-fire situation given the provided control of the revisit time [18].
Another possible alternative to this highly demanding processing framework could be the performance
of flights in small non-adjacent surfaces within the study area to reduce the campaign effort, but it
would not be feasible to obtain a multispectral product that allows extrapolation of, for example,
recovery models to other areas within the study area where the flights have not been carried out.
The initial georeferencing precision (RMSEX,Y between 1.5–3 m and RMSEZ between 2–4 m) is not an optimum result, considering that some authors, such as [51], have classified as low accuracy an X-Y error higher than two times the GSD and a Z error higher than three times the GSD. Single-frequency GPS receivers, such as the one used in the platform, which features a light antenna and chip power limitations, typically show significant drift over time. This is particularly important in our
case since every subproject included flights carried out at different times or even on different days
due to the large size of the surveyed area. Current research on the installation of dual frequency GPS
onboard UAV platforms [68] would allow for direct georeferencing of the generated geomatic products without the need for GCPs [15]. The geospatial accuracy of the final georeferencing achieved by using
GCPs is a good result (RMSEX,Y < 30 cm and RMSEZ < 55 cm) considering the great extension of the
UAV survey framework and taking into account that some studies reported a decrease in accuracy
with large survey areas [64]. Other studies, such as that conducted by [11], obtained similar geospatial
accuracy, but in our case, the error is closer to the lower limit that approximately matches the pixel
size [69]. This accuracy was highly influenced by the even distribution of the GCPs through the UAV
survey framework [70,71].
Within the comparison framework of the spatial information provided by UAV and WorldView-2 imagery, the higher correlations obtained once the UAV orthomosaic was resampled to match the WorldView-2 image resolution confirm that, in the first successional stages of the vegetation on heterogeneous burned areas, the highest spatial resolution UAV mosaic (20 cm) does not provide redundant information [12] in relation to the satellite image. In this case, the scale of the ground variability associated with small vegetation patches is finer than the coarser pixel sizes. Moreover, the stronger correlation between
the UAV and WorldView-2 imagery found in the visible region of the spectrum was probably due to the
similar relative spectral response in that region for the two sensors [44,55]. The NDVI map comparison
between the UAV and WorldView-2 imagery conducted on a heterogeneous surface within the UAV
survey framework, revealed again that coarser resolution satellite imagery cannot represent the spatial
variability and patterns of areas characterized by very small vegetation patches [12]. The larger variance
in reflectance values for each band of the highest spatial resolution UAV orthomosaic indicates that this
product may be able to capture fine-scale ground patterns because of the greater spatial information
provided by the dataset, improving the interpretation of landscape features. Some authors such as [18]
stated that at this spatial scale, variations in sun azimuth and elevation will create variable shadow
features throughout the day. This factor may introduce reflectance variability, and therefore, distort
the calculation of spectral indices in ultra-high spatial resolution images. This effect on small targets is less significant in satellite imagery given its pixel size. However, within the NDVI map comparison framework, the sun azimuth and elevation of the UAV flight approximately matched those of the WorldView-2 capture, and the variability in reflectance values of both sensors was approximately the same as for the entire study area.
5. Conclusions
(1) The raw imagery acquired by the Parrot SEQUOIA multispectral camera presented some
undesirable anomalies such as horizontal banding noise and non-homogeneous radiometry
across the image. Moreover, the irradiance sensor disconnections induced some radiometric
anomalies across a small area of the multispectral mosaic that had to be discarded.
(2) The 16-bit imagery acquired on the UAV flights of the 3000 ha survey framework represents a
large volume of data before processing it into a multispectral orthomosaic due to its ultra-high
spatial resolution and the large size of the surveyed area. Nevertheless, this spatial resolution,
which cannot be achieved with satellite platforms, could be crucial for developing spatial products
to be used in post-fire management decision-making.
(3) Data processing was very labor-intensive, taking about 320 h to obtain the final multispectral
orthomosaic. Due to the large imagery dataset generated on a UAV survey of a large area,
the dataset processing must be subdivided regardless of the available processing capability.
The obtained geospatial accuracy of the UAV multispectral orthomosaic was high (RMSEX,Y <
30 cm and RMSEZ < 55 cm) considering the large extension of the surveyed area and the spatial
resolution of the dataset.
(4) The spatial information provided by the ultra-high spatial resolution UAV multispectral
orthomosaic was not redundant in these large and heterogeneous burned areas in comparison
with high spatial resolution satellite imagery such as that provided by WorldView-2. The UAV
orthomosaic could therefore improve the analysis and interpretation of fine-scale ground patterns.
Acknowledgments: This research was financially supported by the Spanish Ministry of Economy and
Competitiveness, and the European Regional Development Fund (ERDF), within the framework of the GESFIRE
project (grant number AGL2013-48189-C2-1-R); and by the Regional Government of Castilla y León within the
framework of the FIRECYL project (grant number LE033U14). José Manuel Fernández-Guisuraga was supported
by the European Social Fund and Youth Employment Initiative through the Spanish Ministry of Economy and
Competitiveness (grant number PEJ2014-A-47268) and by a Predoctoral Fellowship of the Spanish Ministry of
Education, Culture and Sport (FPU16/03070).
Author Contributions: S.S.-S. and L.C. conceived and designed the experiments; J.M.F.-G., E.S.-A., S.S.-S. and
L.C. performed the experiments; J.M.F.-G. and E.S.-A. analyzed the data; E.S.-A. contributed materials/analysis
tools; J.M.F.-G., E.S.-A., S.S.-S. and L.C. wrote the paper.
Conflicts of Interest: The authors declare no conflict of interest.
References
1. Seneviratne, S.I.; Nicholls, N.; Easterling, D.; Goodess, C.M.; Kanae, S.; Kossin, J.; Luo, Y.; Marengo, J.;
McInnes, K.; Rahimi, M.; et al. Changes in climate extremes and their impacts on the natural physical
environment. In Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation,
1st ed.; Field, C.B., Barros, V., Stocker, T.F., Qin, D., Dokken, D.J., Ebi, K.L., Mastrandrea, M.D., Mach, K.J.,
Plattner, G.K., Allen, S.K., et al., Eds.; Cambridge University Press: Cambridge, UK; New York, NY, USA,
2012; pp. 109–230. ISBN 978-11-0-702506-6.
2. Quintano, C.; Fernández-Manso, A.; Calvo, L.; Marcos, E.; Valbuena, L. Land surface temperature as potential
indicator of burn severity in forest Mediterranean ecosystems. Int. J. Appl. Earth Obs. 2015, 36, 1–12. [CrossRef]
3. Poursanidis, D.; Chrysoulakis, N. Remote Sensing, natural hazards and the contribution of ESA Sentinels
missions. Remote Sens. Appl. Soc. Environ. 2017, 6, 25–38. [CrossRef]
4. Álvarez, A.; Gracia, M.; Vayreda, J.; Retana, J. Patterns of fuel types and crown fire potential in Pinus
halepensis forest in the Western Mediterranean Basin. For. Ecol. Manag. 2012, 270, 282–290. [CrossRef]
5. Vallejo, R.; Alloza, J.A. The restoration of burned lands: The case of eastern Spain. In Large Forest Fires, 1st ed.;
Moreno, J.M., Ed.; Backhuys Publishers: Leiden, The Netherlands, 1998; pp. 91–108.
6. Tessler, N.; Wittenberg, L.; Greenbaum, N. Vegetation cover and species richness after recurrent forest fires
in the Eastern Mediterranean ecosystem of Mount Carmel, Israel. Sci. Total Environ. 2016, 572, 1395–1402.
[CrossRef] [PubMed]
7. Ruíz-Gallardo, J.R.; Castaño, S.; Calera, A. Application of remote sensing and GIS to locate priority
intervention areas after wildland fires in Mediterranean systems: A case study from southeastern Spain.
Int. J. Wildland Fire 2004, 13, 241–252. [CrossRef]
8. Chu, T.; Guo, X.; Takeda, K. Remote sensing approach to detect post-fire vegetation regrowth in Siberian
boreal larch forest. Ecol. Indic. 2016, 62, 32–46. [CrossRef]
9. Viedma, O.; Torres, I.; Pérez, B.; Moreno, J.M. Modeling plant species richness using reflectance and texture
data derived from QuickBird in a recently burned area of Central Spain. Remote Sens. Environ. 2012, 119,
208–221. [CrossRef]
10. Jung, M.; Tautenhahn, S.; Wirth, C.; Kattge, J. Estimating Basal Area of Spruce and Fir in Post-fire Residual
Stands in Central Siberia Using Quickbird, Feature Selection, and Random Forests. Procedia Comput. Sci.
2013, 18, 2386–2395. [CrossRef]
11. Zhang, J.; Hu, J.; Lian, J.; Fan, Z.; Ouyang, X.; Ye, W. Seeing the forest from drones: Testing the potential of
lightweight drones as a tool for long-term forest monitoring. Biol. Conserv. 2016, 198, 60–69. [CrossRef]
12. Matese, A.; Toscano, P.; Di Gennaro, S.F.; Genesio, L.; Vaccari, F.P.; Primicerio, J.; Belli, C.; Zaldei, A.;
Bianconi, R.; Gioli, B. Intercomparison of UAV, Aircraft and Satellite Remote Sensing Platforms for Precision
Viticulture. Remote Sens. 2015, 7, 2971–2990. [CrossRef]
13. Anderson, K.; Gaston, K.J. Lightweight unmanned aerial vehicles will revolutionize spatial ecology.
Front. Ecol. Environ. 2013, 11, 138–146. [CrossRef]
14. Tang, L.; Shao, G. Drone Remote Sensing for Forestry Research and Practices. J. For. Res. 2015, 26, 791–797.
[CrossRef]
15. Ribeiro-Gomes, K.; Hernandez-Lopez, D.; Ballesteros, R.; Moreno, M.A. Approximate georeferencing and
automatic blurred image detection to reduce the costs of UAV use in environmental and agricultural
applications. Biosyst. Eng. 2016, 151, 308–327. [CrossRef]
16. Zhou, J.; Pavek, M.J.; Shelton, S.C.; Holden, Z.J.; Sankaran, S. Aerial multispectral imaging for crop hail
damage assessment in potato. Comput. Electron. Agric. 2016, 127, 406–412. [CrossRef]
17. Beaty, R.M.; Taylor, A.H. Spatial and Temporal Variation of Fire Regimes in a Mixed Conifer Forest Landscape,
Southern Cascades, California, USA. J. Biogeogr. 2001, 28, 955–966. [CrossRef]
18. McKenna, P.; Erskine, P.D.; Lechner, A.M.; Phinn, S. Measuring fire severity using UAV imagery in semi-arid
central Queensland, Australia. Int. J. Remote Sens. 2017, 38, 4244–4264. [CrossRef]
19. Torresan, C.; Berton, A.; Carotenuto, F.; Di Gennaro, S.F.; Gioli, B.; Matese, A.; Miglietta, F.; Vagnoli, C.;
Zaldei, A.; Wallace, L. Forestry applications of UAVs in Europe: A review. Int. J. Remote Sens. 2016, 38, 1–21.
[CrossRef]
20. Hardin, P.J.; Jensen, R.R. Small-Scale Unmanned Aerial Vehicles in Environmental Remote Sensing:
Challenges and Opportunities. GISci. Remote Sens. 2011, 48, 99–111. [CrossRef]
21. Jones, G.P.; Pearlstine, L.G.; Percival, H.F. An assessment of small unmanned aerial vehicles for wildlife
research. Wildl. Soc. Bull. 2006, 34, 750–758. [CrossRef]
22. Koski, W.R.; Allen, T.; Ireland, D.; Buck, G.; Smith, P.R.; Macrander, A.M.; Halick, M.A.; Rushing, C.;
Sliwa, D.J.; McDonald, T.L. Evaluation of an unmanned airborne system for monitoring marine mammals.
Aquat. Mamm. 2009, 35, 347–357. [CrossRef]
23. Israel, M. A UAV-based roe deer fawn detection system. In Proceedings of the International Archives of
the Photogrammetry, Remote Sensing and Spatial Information Sciences, Conference on Unmanned Aerial
Vehicle in Geomatics, Zurich, Switzerland, 14–16 September 2011.
24. Chabot, D.; Bird, D.M. Evaluation of an off-the-shelf Unmanned Aircraft System for surveying flocks of
geese. Waterbirds 2012, 35, 170–174. [CrossRef]
25. Sarda-Palomera, F.; Bota, G.; Viñolo, C.; Pallarés, O.; Sazatornil, V.; Brotons, L.; Gomáriz, S.; Sarda, F.
Fine-scale bird monitoring from light unmanned aircraft systems. IBIS 2012, 154, 177–183. [CrossRef]
26. Vermeulen, C.; Lejeune, P.; Lisein, J.; Sawadogo, P.; Bouche, P. Unmanned aerial survey of elephants.
PLoS ONE 2013, 8, e54700. [CrossRef] [PubMed]
27. Floris, A.; Clementel, F.; Colle, G.; Gubert, F.; Bertoldi, L.; De Lorenzi, G. Estimation of Wood Volume with
Photogrammetric Data Sensing from UAV on Small Surfaces: A Case Study in Trentino. In Proceedings of
the 16th ASITA National Conference, Vicenza, Italy, 6–9 November 2012.
28. Getzin, S.; Wiegand, K.; Schöning, I. Assessing biodiversity in forests using very high-resolution images and
unmanned aerial vehicles. Methods Ecol. Evol. 2012, 3, 397–404. [CrossRef]
29. Wallace, L.; Lucieer, A.; Watson, C.; Turner, D. Development of a UAV–LiDAR system with application to
forest inventory. Remote Sens. 2012, 4, 1519–1543. [CrossRef]
30. Dandois, J.P.; Ellis, E.C. High spatial resolution three-dimensional mapping of vegetation spectral dynamics
using computer vision. Remote Sens. Environ. 2013, 136, 259–276. [CrossRef]
31. Lisein, J.; Pierrot-Deseilligny, M.; Bonnet, S.; Lejeune, P. A Photogrammetric Workflow for the Creation
of a Forest Canopy Height Model from Small Unmanned Aerial System Imagery. Forests 2013, 4, 922–944.
[CrossRef]
32. Puliti, S.; Orka, H.O.; Gobakken, T.; Naesset, E. Inventory of small forest areas using an Unmanned Aerial
System. Remote Sens. 2015, 7, 9632–9654. [CrossRef]
33. Fritz, A.; Kattenborn, T.; Koch, B. UAV-based photogrammetric point clouds-tree stem mapping in open
stands in comparison to terrestrial laser scanner point clouds. In Proceedings of the UAV-g2013, Rostock,
Germany, 4–6 September 2013.
34. Gini, R.; Passoni, D.; Pinto, L.; Sona, G. Use of Unmanned Aerial Systems for Multispectral Survey and Tree
Classification: A Test in a Park Area of Northern Italy. Eur. J. Remote Sens. 2014, 47, 251–269. [CrossRef]
35. Jaakkola, A. Low-cost Mobile Laser Scanning and its Feasibility for Environmental Mapping. Ph.D. Dissertation,
Aalto University, Espoo, Finland, 2015.
36. Michez, A.; Piégay, H.; Lisein, J.; Claessens, H.; Lejeune, P. Classification of Riparian Forest Species
and Health Condition Using Multi-Temporal and Hyperspatial Imagery from Unmanned Aerial System.
Environ. Monit. Assess. 2016, 188, 1–19. [CrossRef] [PubMed]
37. Aicardi, I.; Garbarino, M.; Lingua, A.; Lingua, E.; Marzano, R.; Piras, M. Monitoring Post-Fire Forest Recovery
Using Multitemporal Digital Surface Models Generated from Different Platforms. In Proceedings of the
EARSeL Symposium, Bonn, Germany, 20–24 June 2016.
38. Fraser, R.H.; van der Sluijs, J.; Hall, R.J. Calibrating Satellite-Based Indices of Burn Severity from UAV-Derived
Metrics of a Burned Boreal Forest in NWT, Canada. Remote Sens. 2017, 9, 279. [CrossRef]
39. Cruz, H.; Eckert, M.; Meneses, J.; Martínez, J.F. Efficient Forest Fire Detection Index for Application in
Unmanned Aerial Systems (UASs). Sensors 2016, 16, 893. [CrossRef] [PubMed]
40. Pérez-Ortiz, M.; Peña, J.M.; Gutiérrez, P.A.; Torres-Sánchez, J.; Hervás-Martínez, C.; López-Granados, F.
Selecting patterns and features for between- and within- crop-row weed mapping using UAV-imagery.
Expert Syst. Appl. 2016, 47, 85–94. [CrossRef]
41. Misopolinos, L.; Zalidis, C.H.; Liakopoulos, V.; Stavridou, D.; Katsigiannis, P.; Alexandridis, T.K.; Zalidis, G.
Development of a UAV system for VNIR-TIR acquisitions in precision agriculture. In Proceedings of the
Third International Conference on Remote Sensing and Geoinformation of the Environment, Paphos, Cyprus,
16–19 March 2015.
42. Tian, J.; Wang, L.; Li, X.; Gong, H.; Shi, C.; Zhong, R.; Liu, X. Comparison of UAV and WorldView-2 imagery
for mapping leaf area index of mangrove forest. Int. J. Appl. Earth Obs. 2017, 61, 22–31. [CrossRef]
43. Calvo, L.; Santalla, S.; Valbuena, L.; Marcos, E.; Tárrega, R.; Luis-Calabuig, E. Post-fire natural regeneration
of a Pinus pinaster forest in NW Spain. Plant Ecol. 2008, 197, 81–90. [CrossRef]
44. Parrot. Available online: https://community.parrot.com/t5/Sequoia/bd-p/Sequoia (accessed on 3 June 2017).
45. Santesteban, L.G.; Di Gennaro, S.F.; Herrero-Langreo, A.; Miranda, C.; Royo, J.B.; Matese, A. High-resolution
UAV-based thermal imaging to estimate the instantaneous and seasonal variability of plant water status
within a vineyard. Agric. Water Manag. 2017, 183, 49–59. [CrossRef]
46. Kelcey, J.; Lucieer, A. Sensor Correction of a 6-Band Multispectral Imaging Sensor for UAV Remote Sensing.
Remote Sens. 2012, 4, 1462–1493. [CrossRef]
47. Burkart, A.; Aasen, H.; Alonso, L.; Menz, G.; Bareth, G.; Rascher, U. Angular Dependency of Hyperspectral
Measurements over Wheat Characterized by a Novel UAV Based Goniometer. Remote Sens. 2015, 7, 725–746.
[CrossRef]
48. Koik, B.T.; Ibrahim, H. A literature survey on blur detection algorithms for digital imaging. In Proceedings
of the 1st International Conference on Artificial Intelligence, Modelling and Simulation, Kota Kinabalu,
Malaysia, 3–5 December 2013.
49. Pix4D. Available online: https://pix4d.com/product/pix4dmapper-photogrammetry-software/ (accessed
on 8 December 2016).
50. McGlone, J.C. Manual of Photogrammetry, 6th ed.; American Society for Photogrammetry and Remote Sensing
(ASPRS): Bethesda, MD, USA, 2013; ISBN 978-15-7-083071-6.
51. Ruzgiené, B.; Berteška, T.; Gečyte, S.; Jakubauskienė, E.; Aksamitauskas, V.C. The surface modelling based
on UAV Photogrammetry and qualitative estimation. Measurement 2015, 73, 619–627. [CrossRef]
52. Zahawi, R.A.; Dandois, J.P.; Holl, K.D.; Nadwodny, D.; Reid, J.L.; Ellis, E.C. Using lightweight unmanned
aerial vehicles to monitor tropical forest recovery. Biol. Conserv. 2015, 186, 287–295. [CrossRef]
© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access
article distributed under the terms and conditions of the Creative Commons Attribution
(CC BY) license (http://creativecommons.org/licenses/by/4.0/).