

Use of Modified BM3D Filter and CNN Classifier for SAR Data to Improve Crop Classification Accuracy
Mykola Lavreniuk, Andrii Shelestov, Nataliia Kussul
Department of Space Information Technologies and Systems, Space Research Institute NAS Ukraine and SSA Ukraine, Kyiv, Ukraine

Oleksii Rubel, Vladimir Lukin
Department of Information and Communication Technologies, National Aerospace University “KhAI”, Kharkiv, Ukraine

Karen Egiazarian
Signal Processing Laboratory, Tampere University, Tampere, Finland

2019 IEEE 2nd Ukraine Conference on Electrical and Computer Engineering, Lviv, Ukraine, July 2-6, 2019

Abstract — Monitoring of agricultural regions is an important task. Recent trends to solve it are based on applying multi-temporal remote sensing in order to obtain reliable crop classification maps. If radar remote sensing is used, speckle present in the original data reduces classification accuracy. The negative impact of speckle can be reduced by an image pre-filtering procedure. Recent studies carried out for Sentinel-1 imagery have shown that more efficient pre-filtering usually results in better classification. Thus, here we propose a modification of the block-matching three-dimensional (BM3D) filter adapted to the properties of Sentinel-1 radar data. We demonstrate that its use leads to improved classification of crops in an agricultural region of Ukraine compared to earlier methods based on other filters. Additionally, a convolutional neural network approach to crop mapping is investigated in this paper for decreasing noise impact and is compared with a traditional pixel-based multi-layer perceptron.

Keywords — SAR, denoising, speckle removal, crop mapping, Sentinel-1, CNN.

I. INTRODUCTION

Remote Sensing (RS) sensors and techniques for processing the acquired data are widely used in different applications [1]. Agriculture has become one of the applications for which RS is extremely useful, e.g., crop mapping and crop monitoring, salinity, moisture and biomass control on a regular basis, water resource management, etc. [2-6]. In particular, for crop classification and crop monitoring purposes, different types of RS data, such as optical and hyperspectral data, can be utilized. However, clouds and shadows are often present over a region of interest, which diminishes the potential advantages of RS.

A negligible influence of clouds is an important advantage of radar RS. Moreover, modern radar RS sensors such as Sentinel-1 offer novel opportunities in agricultural monitoring owing to free access to the data, appropriate spatial resolution (about 10 m) and regular observation of each terrain of interest [7]. The acquired synthetic-aperture radar (SAR) satellite images are of high quality, and the considered sensor provides global coverage with rich dual-polarization data [8] every twelve days (for the European territory the coverage is denser, with a revisit time of six days). Thus, Sentinel-1 offers multi-temporal two-channel RS data for solving different application tasks, including crop classification [9, 10].

However, SAR images are known to possess a specific drawback: speckle caused by coherent processing of backscattered signals [8, 11, 12], which negatively influences classification accuracy [8-10]. Hence, it is desirable to remove speckle as effectively as possible [12, 13], taking into consideration its characteristics, which depend on the imaging mode and other factors [14].

Note that there exist numerous despeckling techniques [8, 15, 16] as well as other image filtering methods. For instance, the ESA SNAP toolbox [15] offers the following general-purpose and special despeckling filters: Boxcar, Frost, Gamma-MAP, Intensity-driven adaptive neighborhood (IDAN), Lee, Lee-Sigma, Refined Lee and Median [15-17]. All these filters improve classification for real-life data, but to different degrees. The best filter available in the ESA SNAP toolbox for crop classification purposes is the refined Lee filter [9, 10], since it copes better with the multiplicative nature of the noise and preserves edges well. Therefore, in this study, we consider the refined Lee filter as one of the best openly available filters and compare our methods with it.

Even better results have been provided by three modifications of filters based on the discrete cosine transform (DCT) [18, 19]. The reason is that these DCT-based filters efficiently utilize the spatial correlation of speckle [9, 10], which is essential in Sentinel-1 radar images. Slightly better results have been observed for the DCT-based filter [19] that, similarly to [20], processes two-polarization images jointly.

The paper [20], similarly to [21], exploits image self-similarity, a property which is inherent to different types of images [22-25]. The use of self-similarity allows preserving edges and details better, and this can be important in multichannel image classification since most misclassifications are observed in the neighborhoods of edges and fine details.

Thus, our goal is to propose a modification of the block-matching three-dimensional (BM3D) filter [22] adapted to the multiplicative nature of the noise as in [21, 26], to the multi-look properties of the speckle [27] as in [26], and to the spatial correlation of speckle as in [24, 25].


In other words, we try to incorporate recent advances in the theory of non-local denoising of radar and optical images and expect that this will lead to better crop classification using multi-temporal Sentinel-1A/B images.

II. BRIEF REVIEW OF PROPERTIES OF SENTINEL SAR IMAGES

A thorough analysis of the properties of Sentinel SAR images can be found in [9, 10, 27]; here we simply recall the main aspects. A typical initial assumption concerning speckle is that it is practically pure multiplicative and non-Gaussian [8-12]. This assumption has already been verified using special automatic means, e.g., ENVI [9, 10, 27], for both VV (Vertical-Vertical) and VH (Vertical-Horizontal) polarizations. Fig. 1 presents examples of image fragments of size 512x512 pixels. The analysis has been performed for homogeneous fragments having different mean values (one such area is shown by a red frame). The estimated relative variance σμ² is about 0.05 for both polarizations. Such estimates have been obtained both by automatic blind estimation techniques [14, 27] and in an interactive manner.

Analysis of histograms in homogeneous image regions using Gaussianity tests has demonstrated that the speckle distribution is close to, but not exactly, Gaussian. This agrees with the analysis in [26], where it is shown that, to be Gaussian, the relative variance σμ² of multi-look speckle has to be smaller than 0.03.

Preliminary studies [9, 10, 27] have clearly shown that speckle is spatially correlated. This correlation can be described in different ways. We prefer to use the normalized 2D DCT spectrum because it will be directly used further in image denoising. Such a spectrum calculated for blocks of size 8x8 pixels is most often used in despeckling [9, 19, 21, 22, 24]. If the indices of spatial frequencies (k = 1,...,8; l = 1,...,8) are small, they correspond to low spatial frequencies. Recall that the DCT is an orthogonal transform; hence, the spectrum is uniform for white noise and non-uniform for spatially correlated speckle, where the main power is concentrated at low frequencies. An example of the normalized spectrum for a VV-polarization Sentinel SAR image is presented in Fig. 2. As one can see, the speckle is spatially correlated. The normalized spectrum for the other polarization is very similar, and spectrum estimates for the other considered images are similar, too. Thus, we can state that the normalized spectrum has an almost constant shape, and this simplifies our further design.

III. DESIGN OF BM3D MODIFICATION

The BM3D filter [22] is considered to be one of the best in removing additive white Gaussian noise. Here we plan to remove non-additive, non-white and non-Gaussian noise; thus, BM3D has to be properly modified to be efficient.

Recall the principle of BM3D operation. For each image block of size 8x8 pixels, a set of similar blocks (according to the Euclidean distance) is found. These blocks form a 3D data set that is decorrelated by the Haar transform “vertically”, and then fixed-value hard-threshold DCT denoising is applied. After this, the inverse operations are performed and the obtained filtered values are aggregated.

To convert the multiplicative noise into an additive one, a standard direct homomorphic (variance-stabilizing) transform [21, 26] of logarithmic type can be applied. It is described as I_H = a·ln(I)/ln(b), where I is the noisy image, I_H is the transformed image, a = 8.9 and b = 1.2. Then, the additive noise standard deviation is σ_H = a·σμ/ln(b), where σμ is the relative noise STD.

Since the speckle is almost Gaussian, its distribution after the direct homomorphic transform remains close to Gaussian. Its normalized DCT spectrum practically does not change either (the corresponding verification has been performed).

After the homomorphic transform, the filter [24, 25] is applied. Its difference compared to the basic version of BM3D consists in the following. First, the Canberra distance is used instead of the Euclidean one, since the former performs better in similar-block search if the noise is spatially correlated. Second, after the Haar transform decorrelation, the hard thresholds at the DCT denoising stage are set frequency-dependent as

T(k, l) = β·σ_H·D_p^0.5(k, l),    (1)

where D_p(k, l) denotes the normalized DCT spatial spectrum, k = 1..8, l = 1..8, and β is a filter parameter for hard thresholding, usually set equal to 2.7. We have used the other recommended settings of BM3D [22] (number of similar blocks, distance thresholds, etc.). The use of frequency-dependent thresholds allows better suppression of spatially correlated noise.
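As a rough illustration only, the sketch below shows how the two statistics used above, the relative speckle variance σμ² and the normalized 8x8 DCT spectrum D_p, could be estimated on a manually selected homogeneous fragment. The fragment coordinates, the block step and the normalization convention are our assumptions, not the authors' exact procedure.

```python
import numpy as np

def dct_matrix(n=8):
    """Orthonormal DCT-II matrix; the 2D block DCT is D @ block @ D.T."""
    k = np.arange(n)
    D = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * k[None, :] + 1) * k[:, None] / (2 * n))
    D[0, :] /= np.sqrt(2.0)
    return D

def speckle_stats(img, top, left, size=512, bs=8):
    """Estimate the relative variance (sigma_mu^2) and a normalized 8x8 DCT
    spectrum D_p(k, l) from a manually chosen homogeneous fragment."""
    frag = np.asarray(img, dtype=np.float64)[top:top + size, left:left + size]
    rel_var = frag.var() / frag.mean() ** 2      # about 0.05 for the analyzed Sentinel-1 images (Section II)
    D = dct_matrix(bs)
    spec = np.zeros((bs, bs))
    count = 0
    for r in range(0, size - bs + 1, bs):
        for c in range(0, size - bs + 1, bs):
            block = frag[r:r + bs, c:c + bs]
            spec += (D @ (block - block.mean()) @ D.T) ** 2
            count += 1
    spec /= count
    spec /= spec.mean()                          # normalized so that white noise would give ~1 everywhere
    return rel_var, spec
```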

Fig. 1. – An example of Sentinel SAR images of VV and VH polarizations
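The variance-stabilizing step and the frequency-dependent thresholds of eq. (1) could then be sketched as follows. The constants a = 8.9, b = 1.2 and β = 2.7 are taken from the text; the function names, the epsilon guard and the mean correction are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

A, B, BETA = 8.9, 1.2, 2.7          # constants given in Section III

def homomorphic_forward(img, eps=1e-6):
    """I_H = a * ln(I) / ln(b): converts multiplicative speckle into
    (approximately) additive noise with standard deviation sigma_H."""
    return A * np.log(np.maximum(img, eps)) / np.log(B)

def homomorphic_inverse(img_h, target_mean=None):
    """Inverse transform; optionally re-correct the mean level to its
    initial value, as done after denoising."""
    img = np.power(B, img_h / A)
    if target_mean is not None:
        img *= target_mean / img.mean()
    return img

def noise_std_h(rel_var):
    """sigma_H = a * sigma_mu / ln(b), with sigma_mu the relative speckle STD."""
    return A * np.sqrt(rel_var) / np.log(B)

def hard_thresholds(rel_var, spec_dp, beta=BETA):
    """Frequency-dependent hard thresholds, eq. (1):
    T(k, l) = beta * sigma_H * D_p(k, l) ** 0.5."""
    return beta * noise_std_h(rel_var) * np.sqrt(spec_dp)
```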

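The other modification, replacing the Euclidean distance with the Canberra distance in the similar-block search, could look like the minimal sketch below. The search radius and group size are illustrative assumptions (the paper states that the recommended BM3D settings are kept).

```python
import numpy as np

def canberra(a, b, eps=1e-12):
    """Canberra distance between two equally sized blocks."""
    a, b = a.ravel(), b.ravel()
    return float(np.sum(np.abs(a - b) / (np.abs(a) + np.abs(b) + eps)))

def find_similar_blocks(img_h, ref_rc, bs=8, radius=16, group_size=16):
    """Return top-left coordinates of the group_size blocks most similar
    to the reference block within a local search window."""
    r0, c0 = ref_rc
    ref = img_h[r0:r0 + bs, c0:c0 + bs]
    rows = range(max(0, r0 - radius), min(img_h.shape[0] - bs, r0 + radius) + 1)
    cols = range(max(0, c0 - radius), min(img_h.shape[1] - bs, c0 + radius) + 1)
    scored = [(canberra(ref, img_h[r:r + bs, c:c + bs]), r, c) for r in rows for c in cols]
    scored.sort(key=lambda t: t[0])
    return [(r, c) for _, r, c in scored[:group_size]]
```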

Fig. 2. – Example of normalized DCT spectrum estimates for a VV Sentinel SAR image

After denoising, the inverse homomorphic transform is applied and the image mean level is corrected to its initial value. As can be seen, the polarization SAR images are processed separately at this stage, although we plan to design approaches for joint processing.

Examples of denoising that rely on the estimated noise variance and noise block spectrum are shown in Fig. 3. Since we deal with real-life data, it is difficult to apply quantitative criteria for comparing filter efficiency. Therefore, we propose to compare the quality of filtering through crop classification map accuracy. The filter outputs are presented in Fig. 4. Crop classification maps obtained after applying different filters can be compared by visual inspection of the data shown in Fig. 5.

IV. RESULTS

The investigation of the impact of different filtering methods on crop classification accuracy has been conducted in the southern part of Ukraine. We have used Level-1 Interferometric Wide mode Ground Range Detected (IW-GRD) Sentinel-1 products in VV and VH polarizations. A time series of ten Sentinel-1A images has been filtered using different methods and further preprocessed using the ESA SNAP toolbox (calibration, terrain correction using the SRTM 3Sec Digital Elevation Model (DEM), and conversion of values to dB). The comparison of the refined Lee filter from the SNAP toolbox, the 3D DCTF, our newly proposed method and the unfiltered image for 07 July 2016 is shown in Fig. 4. It is not easy to evaluate the quality of filtering visually (users' opinions can be subjective). Therefore, we have classified the SAR data preprocessed with the different filtering methods using the same in-situ data for training and testing. The training set consists of 153 georeferenced polygons for nine classes, and the test set consists of 146 georeferenced polygons for nine classes.

Two different types of classifiers have been utilized in this experiment. First, classification was done using an ensemble of neural networks, namely, multilayer perceptrons (MLPs) [10], [28]-[30]. Second, we propose to utilize a convolutional neural network (CNN) to deal with speckle reduction at the classification stage for crop mapping using SAR data without any filtering and using the previously proposed 3D DCTF filter. A CNN explores not only spectral but also spatial features, in contrast to pixel-based methods, in particular the MLP [31, 32]. The CNN architecture used in this paper has been described in [33].

The obtained crop classification maps after applying different filters are shown in Fig. 4. In Table I, we present the comparison of user accuracy (UA), producer accuracy (PA), overall accuracy (OA) and the kappa coefficient for all classes of the crop classification map using different filters for the SAR data of 2016. The overall accuracy of the crop classification map without any filtering, based on the MLP classifier, is 82.6%. The refined Lee filter from the SNAP toolbox with the MLP classifier provides a gain of +4.8% in overall accuracy, but this is the lowest accuracy among all investigated filters. The previously proposed 3D DCTF filter with the MLP classifier provides a classification accuracy of 88.7%. The most accurate crop map using the pixel-based MLP classifier was obtained from images pre-processed by the proposed modified BM3D filter; its overall accuracy is higher than that of the 3D DCTF filter by +0.2% and of the refined Lee filter by +1.5%. Utilizing the CNN on SAR data without any filtering provides almost the same results as the combination of the MLP classifier and the best preprocessing filter, BM3D. Nevertheless, the combination of the best preprocessing filter, BM3D, and the automatic filtering procedure at the classification step (training using a sliding window) with the CNN provides a significant gain in overall accuracy: compared to MLP + data without filtering by +7.4%, to MLP + refined Lee by +2.6%, to MLP + BM3D by +1.1%, and to CNN + data without filtering by +1%. It is important to emphasize that MLP + BM3D gains not only in overall accuracy, but also increases PA and UA for each class, excluding bare land (the user and producer accuracies for this class are low, which hinders a fair comparison), and CNN + BM3D outperforms the standard classification without any filtering in terms of UA and PA for all classes.

V. CONCLUSIONS

In this paper, multitemporal SAR images from the Sentinel-1A satellite acquired in 2016 are employed to test the applicability of a new modification of the BM3D filter for improving crop classification accuracy, and a convolutional neural network architecture is investigated as an additional step of noise reduction. The proposed methodology is compared with the best freely available filter, the refined Lee filter from the ESA SNAP toolbox, and with the previously published 3D DCTF filter. The combination of the BM3D filter and the CNN classifier has produced the best result in terms of crop classification map overall accuracy. The overall accuracy obtained with the pixel-based MLP classifier and without any filter was 82.6%; the proposed BM3D filter gains +6.3%, and the combination of the BM3D filter and the CNN classifier outperforms all other approaches with a final 90%. Moreover, the proposed methodology increases the accuracies for each individual class without exception.


Fig. 3. – Noisy fragment of image (A), filtered fragments by 3D DCTF (B) and BM3D (C)

Fig. 4. – Example of using different filters applied to Sentinel-1A images: A) image without filtering; B) Refined Lee filter; C) 3D DCTF; D) BM3D.

Fig. 5. – Example of crop classification maps based on different filters for Sentinel-1A images: A) without filtering; B) Refined Lee filter; C) 3D DCTF; D)
BM3D; E) CNN + without filtering and F) CNN + BM3D
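For orientation only, a toy patch-based CNN classifier is sketched below. It is not the architecture of [33], and the input layout (ten dates x two polarizations = 20 channels), the patch size and the layer widths are assumptions made purely for illustration of how a CNN acts as an implicit spatial filter during classification.

```python
import torch
import torch.nn as nn

class ToyPatchCNN(nn.Module):
    """Illustrative patch classifier: the label of the central pixel is
    predicted from a small spatial neighbourhood, so the convolutions
    behave as a trainable spatial (noise-suppressing) filter."""
    def __init__(self, in_channels=20, n_classes=9):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, n_classes)

    def forward(self, x):               # x: (batch, channels, patch, patch)
        return self.head(self.features(x).flatten(1))

# Sliding-window use: one small patch per pixel (7x7 is an assumed patch size).
logits = ToyPatchCNN()(torch.randn(4, 20, 7, 7))   # -> shape (4, 9)
```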


TABLE I. COMPARISON OF USER ACCURACY (UA), PRODUCER ACCURACY (PA), OVERALL ACCURACY (OA) AND KAPPA COEFFICIENT FOR DIFFERENT FILTERS FOR SAR DATA IN 2016

Class         | No filter     | Refined Lee   | 3D DCTF       | BM3D          | CNN + no filter | CNN + BM3D
              | UA, %  PA, %  | UA, %  PA, %  | UA, %  PA, %  | UA, %  PA, %  | UA, %   PA, %   | UA, %  PA, %
Artificial    | 4.6    42.1   | 6.8    45.4   | 8.4    65.3   | 4      71.8   | 46.2    38.9    | 91.4   14.8
Winter wheat  | 65.5   73.5   | 80.3   77.6   | 86.1   70.5   | 89.3   70.4   | 73.2    80.6    | 80.6   66.5
Maize         | 62.7   49.8   | 82.1   57.9   | 94.9   82.6   | 97     78     | 89.6    50.2    | 92.4   72.6
Sunflower     | 96.5   90.2   | 96.7   93.8   | 96.6   95.2   | 97.1   95.6   | 94.4    98      | 95     97.4
Soybeans      | 37     61.3   | 45.5   66.8   | 59.5   72.3   | 63.3   72.4   | 50.7    66.8    | 54.6   68.4
Forest        | 52.9   88.3   | 66.5   91.9   | 29.1   48.4   | 85.7   91.1   | 81.5    99.6    | 89.8   99.1
Grassland     | 38.5   83.6   | 46.3   92.8   | 48.6   91.1   | 44.8   92.3   | 67.5    47.6    | 77.3   78.9
Bare land     | 49.8   23.2   | 62.6   47.9   | 48.6   29.8   | 34.9   15.5   | 97.3    24.2    | 46.2   47.3
Water         | 98.1   99.9   | 99.3   100    | 99.7   100    | 99.6   100    | 99.7    100     | 99.8   98.3
OA, % / Kappa | 82.6 / 0.7    | 87.4 / 0.78   | 88.7 / 0.8    | 88.9 / 0.82   | 89 / 0.82       | 90 / 0.83
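The accuracy figures of the kind reported in Table I can be derived from a confusion matrix over the test polygons. A minimal sketch follows; the row/column convention (reference classes in rows, mapped classes in columns) is our assumption.

```python
import numpy as np

def accuracy_metrics(conf):
    """UA, PA, OA and Cohen's kappa from a confusion matrix with
    reference classes in rows and mapped (predicted) classes in columns."""
    conf = np.asarray(conf, dtype=np.float64)
    total = conf.sum()
    ua = np.diag(conf) / conf.sum(axis=0)        # user's accuracy per class
    pa = np.diag(conf) / conf.sum(axis=1)        # producer's accuracy per class
    oa = np.trace(conf) / total                  # overall accuracy
    chance = np.sum(conf.sum(axis=0) * conf.sum(axis=1)) / total ** 2
    kappa = (oa - chance) / (1.0 - chance)
    return ua, pa, oa, kappa
```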

ACKNOWLEDGMENT

Publication is based on the research provided within the project №6386 “Intelligent technologies for satellite monitoring of environment based on deep learning and cloud computing (InTeLLeCT)”, which is supported by the Science and Technology Center in Ukraine (STCU) and the National Academy of Sciences of Ukraine (NASU). This work was also supported by the European Commission “Horizon 2020 Program” that funded ERA-PLANET/GEOEssential and ERA-PLANET/SMURBS (Grant Agreement no. 689443).

REFERENCES

[1] R. Schowengerdt, “Remote Sensing: Models and Methods for Image Processing”, 3rd ed. Cambridge: Academic Press, 2006, p. 560, DOI: 10.978.00804/80589.
[2] M. Awad, “Crop Mapping Using Hyperspectral Data and Technologies: A Comparison Between Different Supervised Segmentation Algorithm”, Proceedings of 18th International Multidisciplinary Scientific GeoConference SGEM, 2018, pp. 89-96.
[3] A. N. Kravchenko et al., “Water resource quality monitoring using heterogeneous data and high-performance computations,” Cybernetics and Systems Analysis, vol. 44, no. 4, pp. 616-624, 2008.
[4] N. Kussul, S. Skakun, A. Shelestov, M. Lavreniuk, B. Yailymov, and O. Kussul, “Regional Scale Crop Mapping Using Multi-Temporal Satellite Imagery,” Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XL-7/W3, pp. 45-52, 2015, DOI: 10.5194/isprsarchives-XL-7-W3-45-2015.
[5] N. Kussul, S. Skakun, A. Shelestov, and O. Kussul, “The use of satellite SAR imagery to crop classification in Ukraine within JECAM project”, IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 1497-1500, 2014.
[6] N. Kussul, G. Lemoine, F. J. Gallego, S. V. Skakun, M. Lavreniuk, and A. Y. Shelestov, “Parcel-Based Crop Classification in Ukraine Using Landsat-8 Data and Sentinel-1A Data”, IEEE J. of Select. Topics in Appl. Earth Observ. and Rem. Sens., vol. 9, no. 6, pp. 2500-2508, 2016.
[7] https://sentinel.esa.int/web/sentinel/user-guides/sentinel-1-sar
[8] J.-S. Lee, E. Pottier, “Polarimetric Radar Imaging: From Basics to Applications,” CRC Press, 2009, p. 422.
[9] M. Lavreniuk, N. Kussul, M. Meretskii, V. Lukin, S. Abramov, O. Rubel, “Impact of SAR data filtering on crop classification accuracy”, Proceedings of UkrCon, May 2017, Kiev, Ukraine, pp. 912-916.
[10] V. Lukin, O. Rubel, R. Kozhemiakin, S. Abramov, A. Shelestov, M. Lavreniuk, M. Meretsky, B. Vozel, K. Chehdi, “Despeckling of Multitemporal Sentinel SAR Images and Its Impact on Agricultural Area Classification”, Book chapter in Recent Advances and Applications in Remote Sensing, edited by Dr. Ming-Chih Hung, InTech, 2018, pp. 21-40.
[11] A. Naumenko, V. Lukin, K. Egiazarian, “SAR-image edge detection using artificial neural network,” Proceedings of MMET 2012, Kharkov, Ukraine, pp. 508-512.
[12] R. A. Touzi, “Review of Speckle Filtering in the Context of Estimation Theory,” IEEE Trans. on GRS, vol. 40, no. 11, pp. 2392-2404, 2002.
[13] D. V. Fevralev, S. S. Krivenko, V. V. Lukin, R. Marques, F. de Medeiros, “Combining Level Sets and Orthogonal Transform for Despeckling SAR Images,” Aerospace Engineering and Technology, vol. 2, no. 99, pp. 103-112, 2013.
[14] S. Abramov, V. Abramova, V. Lukin, N. Ponomarenko, B. Vozel, K. Chehdi, K. Egiazarian, J. Astola, “Methods for Blind Estimation of Speckle Variance in SAR Images: Simulation Results and Verification for Real-Life Data”, Book chapter in Computational and Numerical Simulations, ISBN 978-953-51-1220-4, edited by Jan Awrejcewicz, InTech, Austria, 2014, pp. 303-327.
[15] https://earth.esa.int/documents/653194/656796/Speckle_Filtering.pdf
[16] P. Kupidura, “Comparison of Filters Dedicated to Speckle Suppression in SAR Images,” ISPRS International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, pp. 269-276, 2016.
[17] G. Vasile, E. Trouvé, J. S. Lee, and V. Buzuloiu, “Intensity-driven adaptive-neighborhood technique for polarimetric and interferometric SAR parameters estimation,” IEEE Transactions on Geoscience and Remote Sensing, vol. 44, no. 6, pp. 1609-1621, 2006.
[18] V. Lukin, N. Ponomarenko, S. Abramov, B. Vozel, K. Chehdi, J. Astola, “Filtering of radar images based on blind evaluation of noise characteristics”, Proceedings of Image and Signal Processing for Remote Sensing XIV, Cardiff, UK, SPIE Vol. 7108, p. 12, 2008, DOI: 10.1117/12.799396.
[19] R. Kozhemiakin, V. Lukin, B. Vozel, K. Chehdi, “Filtering of Dual-Polarization Radar Images Based on Discrete Cosine Transform,” Proceedings of IRS, Gdansk, Poland, June 2014, pp. 1-4.
[20] C. A. Deledalle, L. Denis, S. Tabti, F. Tupin, “MuLoG, or how to apply Gaussian denoisers to multi-channel SAR speckle reduction?”, IEEE Transactions on Image Processing, vol. 26, no. 9, pp. 4389-4403, Sept. 2017, DOI: 10.1109/TIP.2017.2713946.
[21] M. Makitalo, A. Foi, D. Fevralev, V. Lukin, “Denoising of single-look SAR images based on variance stabilization and non-local filters”, CD-ROM Proceedings of MMET, Kiev, Ukraine, Sept. 2010, pp. 1-4.
[22] K. Dabov, A. Foi, V. Katkovnik, K. Egiazarian, “Image denoising by sparse 3D transform-domain collaborative filtering”, IEEE Transactions on Image Processing, vol. 16, no. 8, pp. 2080-2095, 2007.
[23] M. Aharon, M. Elad, A. M. Bruckstein, “The K-SVD: An Algorithm for Designing of Overcomplete Dictionaries for Sparse Representation”, IEEE Transactions on Signal Processing, vol. 54, no. 11, pp. 4311-4322, 2006.


[24] O. Rubel, V. Lukin, K. Egiazarian, “Additive Spatially Correlated Noise Suppression by Robust Block Matching and Adaptive 3D Filtering”, Journal of Imaging Science and Technology, vol. 62, no. 6, pp. 60401-1-60401-11, 2018.
[25] A. Rubel, V. Lukin, K. Egiazarian, “Metric performance in similar blocks search and their use in collaborative 3D filtering of grayscale images”, Proceedings of SPIE 9019, Image Processing: Algorithms and Systems XII, USA, February 2014.
[26] O. Tsymbal, V. Lukin, N. Ponomarenko, A. Zelensky, K. Egiazarian, J. Astola, “Three-state locally adaptive texture preserving filter for radar and optical image processing”, EURASIP Journal on Applied Signal Processing, 2005(8), pp. 1185-1204, DOI: 10.1155/ASP.2005.1185.
[27] V. Abramova, S. Abramov, V. Lukin, K. Egiazarian, “Blind Estimation of Speckle Characteristics for Sentinel Polarimetric Radar Images”, Proceedings of MRRS, Kiev, Ukraine, August 2017, pp. 263-266.
[28] S. Skakun, N. Kussul, A. Y. Shelestov, M. Lavreniuk, and O. Kussul, “Efficiency Assessment of Multitemporal C-Band Radarsat-2 Intensity and Landsat-8 Surface Reflectance Satellite Imagery for Crop Classification in Ukraine,” IEEE J. of Select. Topics in Applied Earth Obser. and Rem. Sens., vol. 9, no. 8, pp. 3712-3719, 2016.
[29] F. Waldner et al., “Towards a set of agrosystem-specific cropland mapping methods to address the global cropland diversity,” International Journal of Remote Sensing, vol. 37, no. 14, pp. 3196-3231, 2016, DOI: 10.1080/01431161.2016.1194545.
[30] M. S. Lavreniuk, S. V. Skakun, A. J. Shelestov, B. Y. Yalimov, S. L. Yanchevskii, D. J. Yaschuk, and A. I. Kosteckiy, “Large-Scale Classification of Land Cover Using Retrospective Satellite Data,” Cybernetics and Systems Analysis, vol. 52, no. 1, pp. 127-138, 2016.
[31] A. G. Mullissa, C. Persello, and V. Tolpekin, “Fully Convolutional Networks for Multi-Temporal SAR Image Classification,” IEEE International Geoscience and Remote Sensing Symposium (IGARSS), pp. 6635-6638, 2018.
[32] G. Scarpa, M. Gargiulo, A. Mazza, and R. Gaetano, “A CNN-Based Fusion Method for Feature Extraction from Sentinel Data,” Remote Sensing, vol. 10, no. 2, 236, 2018.
[33] N. Kussul, M. Lavreniuk, S. Skakun, and A. Shelestov, “Deep Learning Classification of Land Cover and Crop Types Using Remote Sensing Data,” IEEE Geoscience and Remote Sensing Letters, vol. 14, no. 5, pp. 778-782, 2017.
