Insect Pest Image Detection and Classification Using Deep Learning
Abstract—Farmers' primary concern is to reduce crop loss caused by pests and diseases, which occur irrespective of the cultivation process used. Worldwide, more than 40% of agricultural output is lost to plant pathogens, insects, and weed pests. Earlier, farmers relied on agricultural experts to detect pests. Recently, deep learning methods have been utilized for insect pest detection to increase agricultural productivity. This paper presents two deep learning models, Faster R-CNN Efficient Net B4 and Faster R-CNN Efficient Net B7, for accurate insect pest detection and classification. We validated our approach on 5, 10, and 15 insect pest classes of the IP102 dataset. The findings illustrate that our proposed Faster R-CNN Efficient Net B7 model achieved an average classification accuracy of 99.00 %, 96.00 %, and 93.00 % for 5, 10, and 15 insect pest classes, outperforming other existing models. The proposed Faster R-CNN method also requires less computation time to detect insect pests. The investigation reveals that the proposed Faster R-CNN model can be used to identify crop pests, resulting in higher agricultural yield and crop protection.

Keywords—Deep learning; faster RCNN; insect pest detection; IP102 dataset; efficient net

I. INTRODUCTION

Agricultural production from field crops has advanced quickly in both quantity and quality, but the prevalence of pests and diseases on crops has limited the quality of agrarian output. If pests on crops are not thoroughly inspected and a sufficient, long-lasting treatment is not offered, the quality and amount of food production will be lowered, causing an increase in poverty and food shortages. Any country's economy might be negatively impacted by this, but it would be most harmful in places where 60-70% of the populace relies completely on income from the agricultural sector to support itself. Getting rid of pests that are growing and reducing crop production is a significant issue for agricultural producers. For the purposes of this work, a pest is any species that disperses disease and induces damage to plants. Aphids, flax budworm, flea beetle, cabbage butterfly, peachtree borer, prodenia litura, thrips, and mole cricket are the most frequent pests that attack plants. In order to prevent a large amount of loss and boost crop yields, it is necessary to identify these pests at all phases of their life cycles, whether they are nascent or advanced. Understanding and classifying insects is the initial step in preventing crop damage caused by insect pests. This will allow us to distinguish between harmless insects and dangerous ones. In recent times, there has been a rise in awareness of automated pest classification because this activity necessitates ongoing, intensive monitoring [1]. It is commonly known that distinct insect species may have phenotypes that are similar to one another and that, due to various habitats and growth cycles, insects can have intricate morphologies [2] [3]. The development of machine learning techniques has made outstanding methods for recognizing insect images possible. Vehicle recognition and motion detection have seen considerable success using computer vision and machine learning techniques [4] [5]. A sizable pest dataset of 40 high-grade pest categories was labeled using a multi-level classification framework with an alignment-pooling method [6]. A dataset with 563 pest images partitioned into 10 categories was used, and a Support Vector Machine was trained on custom features to classify it [7]. Various image processing techniques were used to detect and retrieve insect pests by developing a machine-driven detection and removal system for evaluating pest concentration in paddy crops [8]. To identify pests from a dataset of pest images, K-means segmentation was implemented; the discrete cosine transform was then applied and the pest images were classified using an artificial neural network. Images were validated for five pests and an accuracy of 94.00 % was obtained [9]. Deep learning techniques like convolutional neural networks have lately been used in agricultural production as a viable approach for fully automated pest classification [10]. Convolutional neural networks act directly on image elements and learn their own feature extractor, which makes them superior to conventional image processing and machine learning techniques. Additionally, in several medical image analysis applications, convolutional neural networks have demonstrated their ability to manage picture noise and illumination change [11]. In this study, a Faster R-CNN framework to detect and classify insect pests is investigated.

The main contributions of this work are as follows:

1) To detect and classify crop pests, a Faster R-CNN framework with Efficient Net is used. To improve the performance of the model, drop connect is used to prevent over-fitting and to increase the regularization effect, and a swish activation function is utilized in Efficient Net.

2) The Region Proposal Network module and the bounding box regression can accurately predict the classes and locations of various crop pests. The computational time required for detecting the insect pests is low.

3) Compared to other methods, the evaluation results of insect pest classification using the proposed Faster R-CNN framework demonstrated superior performance.
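Contribution 1 relies on two standard EfficientNet ingredients, the swish activation and drop-connect regularization. The PyTorch sketch below illustrates both in a toy residual block; the block layout and drop rate are illustrative assumptions, not the exact configuration used in this paper.

```python
import torch
import torch.nn as nn

def swish(x: torch.Tensor) -> torch.Tensor:
    # Swish activation: x * sigmoid(x), the nonlinearity used in EfficientNet.
    return x * torch.sigmoid(x)

def drop_connect(x: torch.Tensor, p: float, training: bool) -> torch.Tensor:
    # Drop connect (stochastic depth): during training, randomly zero the whole
    # residual branch for some samples and rescale the survivors.
    if not training or p == 0.0:
        return x
    keep_prob = 1.0 - p
    mask = (torch.rand(x.shape[0], 1, 1, 1, device=x.device) < keep_prob).to(x.dtype)
    return x * mask / keep_prob

class ToyBlock(nn.Module):
    """Hypothetical residual block showing where swish and drop connect
    appear in an EfficientNet-style network (not the exact block used here)."""
    def __init__(self, channels: int, drop_rate: float = 0.2):
        super().__init__()
        self.conv = nn.Conv2d(channels, channels, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(channels)
        self.drop_rate = drop_rate

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = swish(self.bn(self.conv(x)))
        return x + drop_connect(out, self.drop_rate, self.training)
```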
II. RELATED WORK

Several deep learning techniques have been used recently to categorize pests and have produced cutting-edge outcomes in several pest identification applications. Convolutional neural networks and saliency techniques were used for classifying insect pests. Saliency approaches are image processing algorithms that emphasize the most important areas of an image. These techniques are based on the realization that the observer accurately distinguishes between the portions of its field of vision that are important and those that are not useful, rather than focusing on the entire range of vision. An accuracy of 92.43 % was obtained on the smaller dataset [12]. To classify defected wheat granules in a dataset of 300 images, an artificial bee colony, a performance-tuned artificial neural network, and extreme learning machine techniques were used [40]. A deep learning framework for multi-class fruit detection, which includes fruit images along with data augmentation based on Faster R-CNN, was proposed and its performance was evaluated [41]. For identifying pests and plant diseases in video content, a deep learning-based Faster R-CNN was investigated along with video-based performance metrics [42]. A survey of current innovations in image processing methods for automated leaf pest and disease recognition is given in [43]. Adao et al. collected a dataset of cotton field images, implemented a deep residual design, and classified the pests; an F1-score of 0.98 was achieved using a ResNet-34 model [44]. A metric for accuracy degradation was utilized to analyze machine learning algorithms by enhancing benign samples [24]. The natural statistics model was applied to create saliency maps and identify regions of interest in an insect pest image. Further work was done on the bio-inspired Hierarchical Model and X (HMAX) method in the accompanying areas to retrieve invariant features for representing pest appearance [13]. Convolutional neural network-based frameworks, such as attention, feature pyramid, and fine-grained modeling techniques, were implemented for the IP102 dataset and obtained an accuracy of 74.00 % [14]. Chen H. C. et al. implemented an AlexNet-modified convolutional neural network model in a mobile application in order to identify tomato diseases from leaf images; for a 9-class disease problem, the AlexNet model had a precision of 80.3 % [15]. Pest detection for 10 pest classes using an efficient deep learning system achieved an average accuracy of 70.5 %; a YOLOv5-S model was used for pest detection and the dataset used was IP102 [16]. A comparison of KNN, SVM, Multilayer Perceptron, Faster R-CNN, and Single Shot Detector classifiers in distinguishing Bemisia tabaci embryo and Trialeurodes vaporariorum embryo tomato pest classes was implemented [17]. K. Thenmozhi used three datasets, NBAIR, Xie1, and Xie2, for insect classification with 40 and 24 classes. Pre-trained deep learning techniques like AlexNet, ResNet, and VGGNet were used for insect classification and fine-tuned with pre-trained models by utilizing transfer learning, obtaining accuracies of 96.75 %, 97.47 %, and 95.97 % [18].

Wang et al. implemented a multi-scale convolution capsule network (MSCCN) for crop insect pest detection. The advantages of MSCCN are that it can extract multi-scale discriminative features and encode the hierarchical structure of size-variant pests; for pest identification, a softmax function was used to determine the class probabilities. They obtained an accuracy of 89.6 % for 9 classes of insect species [19]. Nour et al. worked on the AlexNet model to recognize pests in the IP102 dataset. The model was fine-tuned with data augmentation to obtain an accuracy of 89.6 % for an eight-class insect pest problem [20]. Balakrishnan et al. implemented a real-time IoT-based environment to detect pests using a Faster R-CNN ResNet50 object detection framework. The model used 150 test images for each of 8 insect classes of the IP102 dataset, and the average accuracy achieved for the eight classes is around 94.00 % [21]. Kasinathan et al. implemented machine learning techniques such as ANN, SVM, KNN, Naïve Bayes, and a CNN model for pest detection and classification. The models achieved accuracies of 91.5 % and 93.9 % for nine-class and five-class pests. The drawback of this model is that only 50 images were used for each class even though more images of the pests were available in the IP102 dataset [22].

Mohamed et al. developed a mobile application that uses deep learning to automatically classify pests and used a Faster R-CNN model for the identification of insect pests. The model achieved an average accuracy of 98 % for five pests. The drawback of this work is that only 500 pest images in total were used for training, which results in poor approximation, and such limited test data will produce an optimistic, high-variance estimate of prediction accuracy [23]. To overcome these limitations, the proposed work was implemented using a Faster R-CNN for detection and classification of pests with around 1449 pest images for testing of five pest classes, 2921 images for 10 classes, and 4321 pest images for 15 classes of the IP102 dataset.

A. Insect Pests

The proposed work includes 15 classes of crop insect pests, namely aphids, cicadellidae, flax budworm, flea beetle, cabbage butterfly, peachtree borer, prodenia litura, thrips, bird cherry-oat aphid, mole cricket, grub, wireworm, ampelophaga, lycorma delicatula and xylotrechus. The IP102 dataset is highly unbalanced, so the classes that contain more images in the dataset are taken into consideration for the study. These insect pests cause considerable damage to the crops, leading to a loss in crop productivity.

B. Faster R-CNN

Faster R-CNN requires an image to be scaled to a certain length and width so that noise can be avoided, and with the introduction of a region proposal network the detection speed of insect pests is vastly improved [37]. The feature map is generated by the convolutional neural network layers that process the images, and the identified object undergoes location regression and classification. We evaluate the three important steps which are involved in Faster R-CNN. Feature
Fig. 3. Proposed Faster R-CNN Framework for Pest Detection and Classification.
C. Image Preprocessing and Augmentation

Images are transformed to (600, 600) in the pre-processing stage to retain the same aspect ratio, and images are normalized to maintain a standardized data distribution [25]. The importance of data augmentation for image classification has previously been proven for insufficient datasets. The categories of each insect pest in the IP102 dataset are highly unbalanced. To increase the data while avoiding the over-fitting problem, various data augmentation techniques such as rescaling, zooming, and horizontal flipping have been used. A Gaussian filter is first used to smooth the image. The images were rescaled, a mask was created for every image, and segmentation was then applied to each sample. Each image in the dataset is passed through this processing pipeline by a single function.
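A minimal preprocessing and augmentation pipeline of this kind can be written with torchvision transforms, as sketched below; the normalization statistics, blur kernel size, and zoom range are illustrative assumptions rather than the exact settings used in this work.

```python
from torchvision import transforms

train_transforms = transforms.Compose([
    transforms.Resize((600, 600)),                        # fixed (600, 600) input size
    transforms.GaussianBlur(kernel_size=3),               # smooth the image first
    transforms.RandomResizedCrop(600, scale=(0.8, 1.0)),  # rescaling / zooming
    transforms.RandomHorizontalFlip(p=0.5),               # horizontal flipping
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],      # standardize the distribution
                         std=[0.229, 0.224, 0.225]),      # (ImageNet statistics, assumed)
])
```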
D. Insect Pest Detection & Classification

The proposed learning architecture is used for image processing and to detect and classify pests using the Efficient Net and Faster R-CNN approach, as shown in Fig. 3. The convolutional neural network layers of Efficient Net B4 and Efficient Net B7 have been used as feature extractors for Faster R-CNN in this research because of their added advantage of being lightweight and their processing speed, which is critical for our end application. The pre-trained weights of Efficient Net were trained on the ImageNet dataset. The size of the input image for this methodology is fixed at 224 x 224. Hence, using the EfficientNet model we generate feature maps for an input image and pass them to the RPN.
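One way to assemble such a detector is with torchvision's generic Faster R-CNN wrapper around a pre-trained EfficientNet feature extractor, as sketched below (torchvision 0.13 or newer assumed); the anchor base sizes and the choice of EfficientNet-B4 channels are illustrative, not the exact configuration of this work.

```python
import torchvision
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.rpn import AnchorGenerator
from torchvision.ops import MultiScaleRoIAlign

# EfficientNet-B4 convolutional layers as the backbone; the last feature map
# has 1792 channels (EfficientNet-B7 would give 2560).
backbone = torchvision.models.efficientnet_b4(weights="IMAGENET1K_V1").features
backbone.out_channels = 1792

# One anchor set per location with the aspect ratios used in this work;
# the base sizes here are assumptions.
anchor_generator = AnchorGenerator(
    sizes=((32, 64, 128, 256),),
    aspect_ratios=((0.25, 0.5, 1.0, 2.0),),
)

# Pool every RPN proposal to a fixed 7x7 feature map for the detection head.
roi_pooler = MultiScaleRoIAlign(featmap_names=["0"], output_size=7, sampling_ratio=2)

# num_classes counts the pest classes plus one background class (15 + 1 here).
model = FasterRCNN(
    backbone,
    num_classes=16,
    rpn_anchor_generator=anchor_generator,
    box_roi_pool=roi_pooler,
)
```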
The RPN takes these feature maps as input and outputs a set of rectangular proposals (bounding boxes) identifying the object, i.e., a pest, in the convolutional neural network feature map, along with an objectness score. Grid anchors with aspect ratios [0.25, 0.5, 1.0, 2.0] are generated starting from a 16x16 pixel size during this stage. These anchors point to possible objects of different sizes and aspect ratios at the corresponding location. Intersection over Union determines how well a bounding box matches the ground truth of the insect pest image, where A and B are the two regions being compared, as given in (1). To improve the performance of the model and to reduce noise, non-maximum suppression is utilized to retain the bounding boxes with the highest confidence so that small overlaps are ignored. The threshold was kept at 0.7.

IoU = (A ∩ B) / (A ∪ B)   (1)
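Equation (1) and the 0.7 suppression threshold can be exercised directly with torchvision's box utilities; the sketch below uses made-up boxes and scores purely for illustration.

```python
import torch
from torchvision.ops import box_iou, nms

# Boxes in (x1, y1, x2, y2) format; the coordinates and scores are made up.
proposals = torch.tensor([[ 10.,  10., 110., 110.],
                          [ 15.,  12., 118., 105.],
                          [200., 200., 260., 280.]])
ground_truth = torch.tensor([[12., 11., 112., 108.]])
scores = torch.tensor([0.95, 0.80, 0.60])

# Equation (1): IoU = (A ∩ B) / (A ∪ B) for every proposal/ground-truth pair.
iou = box_iou(proposals, ground_truth)

# Non-maximum suppression keeps the highest-scoring box and discards any box
# overlapping it with IoU above the 0.7 threshold mentioned in the text.
keep = nms(proposals, scores, iou_threshold=0.7)
print(iou.squeeze(1), keep)
```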
To create the proposals for the object, the Faster R-CNN architecture is utilized. It has a specialized architecture with a classifier and a regressor. Faster R-CNN is robust against translations; translational invariance is one of its important properties.

When multi-scale anchors are present, Faster R-CNN creates a "Pyramid of Anchors" rather than a "Pyramid of Filters," which consumes less time and is more cost-efficient compared to various other architectures. The next step is to pass the proposals to the Region of Interest pooling layer. Region of Interest pooling is used to create a single feature map for each of the proposals provided by the RPN in a single pass. It is implemented to address the issue of fixed image size requirements in object detection: RoI pooling creates fixed-size feature maps from non-uniform inputs by applying max-pooling over the inputs. This layer needs two inputs: (i) a feature map obtained from the EfficientNet B4 or EfficientNet B7 backbone used in our research methodology after multiple convolution and pooling layers, and (ii) 'N' proposals, or Regions of Interest, from the region proposal network (RPN).

The benefit of Region of Interest pooling is that we can reuse the same feature map for all proposals, allowing us to pass the whole image through the convolutional neural network once rather than passing each proposal separately. The Region of Interest pooling layer produces sub-windows of size (N, 7, 7, 512) by applying max pooling, where N represents the number of region proposals obtained from the RPN network. These features are moved into the classifier and regression sections after passing through two fully connected layers.
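The fixed-size pooling step can be reproduced with torchvision's RoI pooling operator; the feature-map shape, channel count, and proposal boxes below are made-up values chosen only to show that every proposal ends up as a 7x7 window.

```python
import torch
from torchvision.ops import roi_pool

# One backbone feature map (batch of 1, 512 channels, 38x38 spatial size) and
# two proposals of very different sizes; all numbers are illustrative.
feature_map = torch.randn(1, 512, 38, 38)
proposals = torch.tensor([[0.,  1.,  2., 20., 25.],    # (batch_index, x1, y1, x2, y2)
                          [0.,  5.,  5., 36., 30.]])

# Max-pool each proposal to the same fixed 7x7 grid, so the detection head
# always receives a tensor of shape (N, 512, 7, 7).
pooled = roi_pool(feature_map, proposals, output_size=(7, 7), spatial_scale=1.0)
print(pooled.shape)  # torch.Size([2, 512, 7, 7])
```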
pest classes based on Faster R-CNN Efficient Net B7. For the five pest classes, the accuracy ranges from 98 % to 100 %. The precision, recall, and F1 score are 99.00 % for the 5-class test data, around 96.00 % for the 10-class test data, and 93.00 % for the 15-class test data.
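Per-class precision, recall, and F1 scores and the confusion matrices reported here can be produced with scikit-learn once the predicted and true labels for the test images are available; the labels and class names in this sketch are placeholders.

```python
from sklearn.metrics import classification_report, confusion_matrix

# Placeholder ground-truth and predicted labels for a 5-class test split.
class_names = ["aphids", "flea beetle", "thrips", "mole cricket", "prodenia litura"]
y_true = [0, 1, 2, 3, 4, 0, 1, 2, 3, 4]
y_pred = [0, 1, 2, 3, 4, 0, 1, 2, 4, 4]

print(confusion_matrix(y_true, y_pred))
print(classification_report(y_true, y_pred, target_names=class_names, digits=2))
```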
Fig. 12. Confusion Matrix for 5 Pest Classes for Faster R-CNN Efficient Net B7 Model.
Fig. 13. Confusion Matrix for 10 Pest Classes for Faster R-CNN Efficient Net B7 Model.
Fig. 14. Confusion Matrix for 15 Pest Classes for Faster R-CNN Efficient Net B7 Model.
Fig. 15. Confusion Matrix for 5 Pest Classes for Faster R-CNN Efficient Net B4 Model.
Fig. 16. Confusion Matrix for 10 Pest Classes for Faster R-CNN Efficient Net B4 Model.
Fig. 18. Classification Report for 5, 10 and 15 Pest Classes for Faster R-CNN Efficient Net B7.
Fig. 19 illustrates the classification report for the test dataset for 5, 10 and 15 pest classes using Faster R-CNN Efficient Net B4. Classification accuracy of 98.00 %, 95.00 %, and 90.00 % is achieved for 5, 10, and 15 pest classes based on Faster R-CNN Efficient Net B4. The precision, recall, and F1 score are 98.00 % for the 5-class test data, around 95.00 % for the 10-class test data, and 90.00 % for the 15-class test data.

F. Comparative Analysis

Y. Liu et al. investigated a Back-propagation Neural Network for a five-class pest dataset drawn from IP102 and achieved classification accuracies of 63 %, 50 %, and 43.5 % for 10 %, 20 %, and 30 % test data [34]. Compared to the BP Neural Network, the Single Shot Multi-box Detector performed better at identifying the crop pests and achieved an accuracy of 90.6 % for five pest classes [35]. Kasinathan et al. proposed a CNN model for five pest classes and obtained an accuracy of 93.9 % [22]. Our Faster R-CNN model outperformed the other two models at recognizing the pests and obtained a classification accuracy of 99.00 % for 10 % of test data, 98.4 % for 20 % of test data, and 95.5 % for 30 % of test data. When the proportion of training pest images is increased from 70 % to 90 %, the classification accuracy improves. For 30 % of test data, our Faster R-CNN model has an accuracy of 95.5 %, whereas the BP Neural Network and SSD MobileNet reach around 43.5 % and 85.70 %, as shown in Fig. 20.

A comparison was also made with other existing methods for 9- and 10-class crop pests, as shown in Fig. 21. Among these models, the bio-inspired method achieved an accuracy score of 92.50 % [12]; from this we can infer that these models used deep learning methodology to detect crop pests. Our proposed Faster R-CNN model outperforms the other existing methods and achieved an average accuracy score of 96.00 % for a 10-class crop pest test dataset.

The performance of the proposed Faster R-CNN Efficient Net B7 and Faster R-CNN Efficient Net B4 methods is compared with existing methods for the IP102 dataset in Table I. From Table I we can infer that the proposed Faster R-CNN Efficient Net B7 method outperforms the latest competitive approaches in terms of accuracy for 5 and 10 class crop pests.
Fig. 19. Classification Report for 5, 10 and 15 Pest Classes for Faster R-CNN Efficient Net B4.
Fig. 20. Classification Accuracy of the BP Neural Network, SSD Mobile Net and Proposed Faster R-CNN Pest Classifiers.

Fig. 21. Classification Accuracy of the CapsNet, DCNNT, MS-CapsNet, ResNet50, Bio-inspired CNN Model and Proposed Faster R-CNN Crop Pest Classifiers.
V. CONCLUSION

In this study, an investigation was done on the Faster R-CNN method to detect and classify different insect pests for 5, 10, and 15 classes, and the results were compared. To improve performance and accuracy, each of the pest images was resized, pre-processed, and augmented to increase the dataset. When the image background is challenging and the insect classes are numerous, as in the IP102 dataset, our proposed Faster R-CNN Efficient Net B7 model achieved an average classification accuracy of 99.00 %, 96.00 %, and 93.00 % for 5, 10, and 15 insect pest classes, outperforming other existing models such as SSD Mobile Net, the bio-inspired method, and Faster R-CNN ResNet 50. In future work, the proposed Faster R-CNN model will be extended to a higher number of insect classes and subclasses of insect pests, which will be useful for farmers in detecting and classifying insect pests.

REFERENCES

[1] Xie C, Zhang J, Li R, Li J, Hong P, Xia J, Chen P, "Automatic classification for field crop insects via multiple-task sparse representation and multiple kernel learning," Comput. Electron. Agric. 2015, 119, 123–132.
[2] Gaston K. J, "The Magnitude of Global Insect Species Richness," Conserv. Biol. 2010, 5, 283–296.
[3] Siemann E, Tilman D, Haarstad J, "Insect species diversity, abundance and body size relationships," Nature 1996, 380, 704–706.
[4] Zhang H, Huo Q, Ding W, "The application of AdaBoost-neural network in stored product insect classification," In Proceedings of the IEEE International Symposium on IT in Medicine and Education, Xiamen, China, 12–14 December 2009; pp. 973–976.
[5] Wu X, Zhan C, Lai Y.K, Cheng M.M, Yang J, "IP102: A Large-scale Benchmark Dataset for Insect Pest Recognition," In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019, pp. 8779–8788.
[6] Xie C, Wang R, Zhang J, Chen P, Li R, Chen T, Chen H, "Multi-level learning features for automatic classification of field crop pests," Comput. Electron. Agric. 2018, 152, 233–241.
[7] Deng L, "Research on insect pest image detection and recognition based on bio-inspired methods," Biosystems Engineering, Elsevier, 169, pp. 139–148.
[8] Johny L Miranda, B Gerardo, Bartolome T Tanguilig, "Pest Detection and Extraction Using Image Processing Techniques," International Journal of Computer and Communication Engineering, DOI: 10.7763/IJCCE.2014.V3.317.
[9] Faithpraise Fina, Philip Birch, Rupert Young, J. Obu, Bassey Faithpraise and Chris Chatwin, "Automatic plant pest detection and recognition using k-means clustering algorithm and correspondence filters," International Journal of Advanced Biotechnology and Research, ISSN 0976-2612, Online ISSN 2278–599X, Vol 4, Issue 2, 2013, pp. 189–199.
[10] Xi Cheng, Youhua Zhang, Yiqiong Chen, Yunzhi Wu, Yi Yue, "Pest identification via deep residual learning in complex background," Computers and Electronics in Agriculture, Volume 141, September 2017, pp. 351–356.
[11] M.E. Karar, "Robust RBF neural network–based backstepping controller for implantable cardiac pacemakers," Int. J. Adapt. Control Signal Process. 32 (2018) 1040–1051.
[12] Loris Nanni, Gianluca Maguolo, Fabio Pancino, "Insect pest image detection and recognition based on bio-inspired methods," Ecological Informatics, https://doi.org/10.1016/j.ecoinf.2020.101089.
[13] Limiao Deng, Yanjiang Wang, Zhongzhi Han, Renshi Yu, "Research on insect pest image detection and recognition based on bio-inspired methods," Biosystems Engineering, https://www.sciencedirect.com/journal/biosystems-engineering.
[14] Hieu T. Ung, Huy Q. Ung, Binh T. Nguyen, "An Efficient Insect Pest Classification Using Multiple Convolutional Neural Network Based Models," arXiv:2107.12189 [cs.CV], 26 Jul 2021.
[15] Chen H.C, Widodo A.M, Wisnujati A, Rahaman M, Lin J.C.W, Chen L, Weng C.E, "AlexNet Convolutional Neural Network for Disease Detection and Classification of Tomato Leaf," Electronics 2022, 11, 951.
[16] Thanh-Nghi Doan, "An Efficient System for Real-time Mobile Smart Device-based Insect Detection," International Journal of Advanced Computer Science and Applications, Vol. 13, No. 6, 2022.
[17] Gutierrez A, Ansuategi A, Susperregi L, Tubío C, Rankić I, and Lenža L, "A benchmarking of learning strategies for pest detection and identification on tomato plants for autonomous scouting robots using internal databases," Journal of Sensors, 2019.
[18] Thenmozhi K, Reddy U.S, "Crop Pest Classification Based on Deep Convolutional Neural Network and Transfer Learning," Comput. Electron. Agric. 2019, 164, 104906.
[19] Wang D, Xu Q, Xiao Y, Tang J, Bin L, "Multi-scale Convolutional Capsule Network for Hyperspectral Image Classification," In Chinese Conference on Pattern Recognition and Computer Vision, Springer International Publishing: Cham, Switzerland, 2019; pp. 749–760.
[20] Nour Khalifa N.E, Loey M, Taha M, "Insect Pests Recognition Based on Deep Transfer Learning Models," J. Theor. Appl. Inf. Technol. 2020, 98, 60–68.
[21] Balakrishnan Ramalingam, Mohan R.E, Pookkuttath S, Gómez B.F, Sairam Borusu C.S.C, Wee Teng T, Tamilselvam Y.K, "Remote Insects Trap Monitoring System Using Deep Learning Framework and IoT," Sensors 2020, 20, 5280.
[22] Kasinathan T, Singaraju D, Uyyala S.R, "Insect Classification and Detection in Field Crops Using Modern Machine Learning Techniques," Inf. Process. Agric. 2021, 8, 446–457.
[23] Mohammed Esmail Karar, Alsunaydi F, Albusaymi S, Alotaibi S, "A New Mobile Application of Agricultural Pests Recognition Using Deep Learning in Cloud Computing System," Alex. Eng. J. 2021, 60, 4423–4432.
[24] Mamoru Mimura, "Impact of benign sample size on binary classification accuracy," Expert Systems with Applications, Volume 211, 118630.
[25] Tang Yu, Wang Chen, Gao Junfeng and Hua Poxi, "Intelligent Detection Method of Forgings Defects Detection Based on Improved EfficientNet and Memetic Algorithm," IEEE Access, DOI: 10.1109/ACCESS.2022.3193676.
[26] M. Sokolova, G. Lapalme, "A systematic analysis of performance measures for classification tasks," Inf. Process. Manage. 45 (2009) 427–437.
[27] Iandola F.N, Han S, Moskewicz M.W, Ashraf K, Dally W.J, Keutzer K, "SqueezeNet: AlexNet-level Accuracy with 50x Fewer Parameters and <0.5 MB Model Size," arXiv 2016, arXiv:1602.07360.
[28] Ning C, Zhou H, Song Y, Tang J, "Inception Single Shot MultiBox Detector for Object Detection," In Proceedings of the 2017 IEEE International Conference on Multimedia & Expo Workshops, Hong Kong, China, 10–14 July 2017; pp. 549–554.
[29] Li Y, Qian M, Liu P, Cai Q, Li X, Guo J, Yan H, Yu F, Yuan K, Yu J, et al., "The Recognition of Rice Images by UAV Based on Capsule Network," Clust. Comput. 2018, 22, 9515–9524.
[30] Cui J, Zhang J, Sun G, Zheng B, "Extraction and Research of Crop Feature Points Based on Computer Vision," Sensors 2019, 19, 2553.
[31] Yan P, Su Y, Tian X, "Classification of Mars Lineament and Non-lineament Structure Based on ResNet50," In Proceedings of the 2020 IEEE International Conference on Advances in Electrical Engineering and Computer Applications, Dalian, China, 25–27 August 2020; pp. 437–441.
[32] Chen H.C, Widodo A.M, Wisnujati A, Rahaman M, Lin J.C.W, Chen L, Weng C.E, "AlexNet Convolutional Neural Network for Disease Detection and Classification of Tomato Leaf," Electronics 2022, 11, 951.
[33] Xu C, Yu C, Zhang S, Wang X, "Multi-Scale Convolution-Capsule Network for Crop Insect Pest Recognition," Electronics 2022, 11, 1630.
[34] Y. Liu, W. Jing, L. Xu, "Parallelizing Backpropagation Neural Network Using MapReduce and Cascading Model," Comput. Intell. Neurosci. 2016 (2016) 2842780.
[35] W. Liu, D. Anguelov, D. Erhan, C. Szegedy, S. Reed, C.-Y. Fu, A.C. Berg, "SSD: Single Shot MultiBox Detector," in: B. Leibe, J. Matas, N. Sebe, M. Welling (Eds.), Computer Vision – ECCV 2016, Springer International Publishing, Cham, 2016, pp. 21–37.
[36] A.G. Howard, M. Zhu, B. Chen, D. Kalenichenko, W. Wang, T. Weyand, M. Andreetto, H. Adam, "MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications," arXiv:1704.04861 (2017).
[37] R. Girshick, "Fast R-CNN," in Proc. IEEE Int. Conf. Comput. Vis. (ICCV), Santiago, Chile, Dec. 2015, pp. 1440–1448, doi: 10.1109/ICCV.2015.169.
[38] G. Huang, Z. Liu, L. Van Der Maaten, K.Q. Weinberger, "Densely Connected Convolutional Networks," Proceedings – 30th IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2017), IEEE, 2017, pp. 2261–2269.
[39] Mingxing Tan, Quoc V. Le, "EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks," Proceedings of the 36th International Conference on Machine Learning, Long Beach, California, PMLR 97, 2019.
[40] Sabanci K, "Detection of sunn pest-damaged wheat grains using artificial bee colony optimization-based artificial intelligence techniques," Journal of the Science of Food and Agriculture, 100(2), pp. 817–824.
[41] Shaohua Wan, Sotirios Goudos, "Faster R-CNN for multi-class fruit detection using a robotic vision system," Computer Networks, Vol. 168, February 2020.
[42] Dengshan Li, Rujing Wang, Chengjun Xie, Liu Liu, Jie Zhang, Rui Li, Fangyuan Wang, Man Zhou and Wancai Liu, "A Recognition Method for Rice Plant Diseases and Pests Video Detection Based on Deep Convolutional Neural Network," Sensors 2020, 20, 578, doi: 10.3390/s20030578.
[43] Ngugi L.C, Abelwahab M, Abo-Zahhad M, "Recent advances in image processing techniques for automated leaf pest and disease recognition – A review," Inf. Process. Agric. 2021, 8, 27–51.
[44] Alves A. N., Souza W. S. R., Borges D. L., "Cotton pests classification in field-based images using deep residual networks," Computers and Electronics in Agriculture 2020, 174.