
(IJACSA) International Journal of Advanced Computer Science and Applications,

Vol. 13, No. 9, 2022

Insect Pest Image Detection and Classification using Deep Learning

Niranjan C Kundur, Assistant Professor, Department of Computer Science and Engineering, JSS Academy of Technical Education, Bengaluru, India
P B Mallikarjuna, Associate Professor, Department of Computer Science and Engineering, JSS Academy of Technical Education, Bengaluru, India

Abstract—Farmers' primary concern is to reduce crop loss caused by pests and diseases, which occur irrespective of the cultivation process used. Worldwide, more than 40% of agricultural output is lost to plant pathogens, insects, and weed pests. Earlier, farmers relied on agricultural experts to detect pests; recently, deep learning methods have been utilized for insect pest detection to increase agricultural productivity. This paper presents two deep learning models, Faster R-CNN Efficient Net B4 and Faster R-CNN Efficient Net B7, for accurate insect pest detection and classification. We validated our approach on 5, 10, and 15 insect pest classes of the IP102 dataset. The findings illustrate that our proposed Faster R-CNN Efficient Net B7 model achieved average classification accuracies of 99.00 %, 96.00 %, and 93.00 % for 5, 10, and 15 pest classes, outperforming other existing models. Our proposed Faster R-CNN method also requires less computation time to detect insect pests. The investigation reveals that the proposed Faster R-CNN model can be used to identify crop pests, resulting in higher agricultural yield and crop protection.

Keywords—Deep learning; faster RCNN; insect pest detection; IP102 dataset; efficient net

I. INTRODUCTION

Agricultural production from field crops has advanced quickly in both quantity and quality, but the prevalence of pests and diseases on crops has limited the quality of agrarian output. If pests on crops are not thoroughly inspected and a sufficient, long-lasting treatment is not offered, the quality and amount of food production will be lowered, causing an increase in poverty and food shortages. Any country's economy might be negatively impacted by this, but it would be most harmful in places where 60-70% of the populace relies completely on income from the agricultural sector to support itself. Getting rid of pests that grow and reduce crop production is a significant issue for agricultural producers. In this work, a pest is any species that disperses disease and induces damage to plants. Aphids, flax budworm, flea beetle, cabbage butterfly, peachtree borer, prodenia litura, thrips, and mole cricket are the most frequent pests that attack plants. In order to prevent large losses and boost crop yields, it is necessary to identify these pests at all phases of their life cycles, whether nascent or advanced. Understanding and classifying insects is the initial step in preventing crop damage caused by insect pests; this allows us to distinguish between harmless insects and dangerous ones. In recent times, there has been a rise in awareness of automated pest classification because this activity necessitates ongoing, intensive monitoring [1]. It is commonly known that distinct insect species may have phenotypes similar to one another and that, due to various habitats and growth cycles, insects can have intricate morphologies [2] [3]. An outstanding method for recognizing insect images has been made possible by the development of machine learning techniques. Vehicle recognition and motion detection have seen considerable success using computer vision as well as machine learning techniques [4] [5]. A sizable pest dataset of 40 high-grade pest categories was labeled using a multi-level classification framework with an alignment-pooling method [6]. A dataset with 563 pest images partitioned into 10 categories was used; to classify it, a Support Vector Machine was trained on custom features [7]. Various image processing techniques were applied to detect and retrieve insect pests by developing a machine-driven detection and removal system for evaluating pest concentration in paddy crops [8]. To identify pests from a dataset of pest images, a K-means segmentation technique was implemented. To classify the pests, the discrete cosine transform was applied and the pest images were classified using an artificial neural network; images were validated for five pests and an accuracy of 94.00 % was obtained [9]. Deep learning techniques like convolutional neural networks have lately been used in agricultural production as a viable approach for fully automated pest classification [10]. Convolutional neural networks exert a significant influence on image elements and have their own feature extractor, which makes them superior to conventional image processing techniques and machine learning. Additionally, in several applications of medical image analysis, convolutional neural networks demonstrated their ability to manage picture noise and illumination change [11]. In this study, a Faster R-CNN framework to detect and classify insect pests is investigated.

The main contributions of this work are as follows:

1) To detect and classify crop pests, a Faster R-CNN framework with Efficient Net is used. To improve the performance of the model, network drop connect is used to prevent over-fitting, and to increase the regularization effect a swish function is utilized in Efficient Net.

2) The Region Proposal Network module and the bounding box regression can accurately predict the classes and

411 | P a g e
www.ijacsa.thesai.org

locations of various crop pests. The computational time required for detecting the insect pests is less.

3) Compared to other methods, the evaluation results of insect pest classification using the proposed Faster R-CNN framework demonstrated superior performance.

II. RELATED WORK

Several deep learning techniques have been used recently to categorize pests and develop cutting-edge outcomes in several applications for pest identification. Convolutional neural network and saliency techniques were used for classifying insect pests. Image processing algorithms known as saliency approaches emphasize the most important areas of an image; these techniques are based on the realization that the observer accurately distinguishes between the portions of the field of vision that are important and those that are not useful, rather than focusing on the entire range of vision. They obtained an accuracy of 92.43 % for the smaller dataset [12]. To classify defected wheat granules in a dataset of 300 images, an artificial bee colony, a performance-tuned artificial neural network, and extreme learning machine techniques were used [40]. A deep learning framework for multi-class fruit detection, which includes fruit images along with data augmentation based on Faster RCNN, was proposed and its performance evaluated [41]. For identifying pests and plant diseases in video content, a deep learning-based Faster RCNN was investigated along with video-based performance metrics [42]. A survey covered current innovations in image processing methods for automated leaf pest and disease recognition [43]. Adao et al. collected a dataset of cotton field images, implemented a deep residual design, and classified the pests; an F1-score of 0.98 was achieved using a ResNet-34 model [44]. A metric for accuracy degradation was utilized to analyze machine learning algorithms by enhancing benign samples [24]. The natural statistics model was applied to create saliency maps and identify regions of interest in an insect pest image; further work was done on the bio-inspired Hierarchical Model and X (HMAX) method in the accompanying areas to retrieve invariant features for representing pest appearance [13]. Convolutional neural network-based frameworks, such as attention, feature pyramid, and fine-grained modeling techniques, were implemented for the IP102 dataset and obtained an accuracy of 74.00 % [14]. Chen H. C. et al. implemented an AlexNet-modified convolutional neural network model in a mobile application to identify tomato diseases from leaf images; for a 9-class disease problem, the AlexNet model had a precision of 80.3 % [15]. Pest detection for 10 pest classes using an efficient deep learning system achieved an average accuracy of 70.5 %; a YOLOv5-S model was used for the detection of pests and the dataset used was IP102 [16]. A comparison of KNN, SVM, Multilayer Perceptron, Faster R-CNN, and Single Shot Detector classifiers in distinguishing the Bemisia tabaci embryo and Trialeurodes vaporariorum embryo tomato pest classes was implemented [17]. K. Thenmozhi used three datasets, NBAIR, Xie1, and Xie2, for insect classification with 40 classes and 24 classes. Pre-trained deep learning techniques like AlexNet, ResNet, and VGGNet were used for insect classification and fine-tuned with pre-trained models by utilizing transfer learning, obtaining accuracies of 96.75 %, 97.47 %, and 95.97 % [18].

Wang et al. implemented a multi-scale convolution capsule network (MSCCN) for crop insect pest detection. The advantages of MSCCN are that it is able to extract multi-scale discriminative features and encode the hierarchical structure of size-variant pests; for pest identification, a softmax function was used to determine the probability. They obtained an accuracy of 89.6 % for 9 classes of insect species [19]. Nour et al. worked on the AlexNet model to recognize pests in the IP102 dataset; the model was fine-tuned with data augmentation to obtain an accuracy of 89.6 % for an eight-class insect pest problem [20]. Balakrishnan et al. implemented a real-time IoT-based environment to detect pests using a Faster RCNN ResNet50 object detection framework. The model used 150 test images for each of 8 insect classes of the IP102 dataset and achieved an average accuracy of around 94.00 % for the eight-class insects [21]. Kasinathan et al. implemented machine learning techniques such as ANN, SVM, KNN, Naïve Bayes, and a CNN model for pest detection and classification; the model achieved accuracies of 91.5 % and 93.9 % for nine-class and five-class pests. The drawback of this model is that only 50 images were used for each class even though more images of the pests were available in the IP102 dataset [22].

Mohamed et al. developed a mobile application that uses deep learning to automatically classify pests, and for the identification of insect pests they used a Faster R-CNN model. The model achieved an average accuracy of 98 % for five pests. The drawback of this work is that only 500 pest images were used in training, which results in poor approximation, and few test data will result in an optimistic, high-variance estimate of prediction accuracy [23]. To overcome this, the proposed work uses a Faster R-CNN for detection and classification of pests with around 1449 pest images for testing of five pest classes, 2921 images for 10 classes, and 4321 pest images for 15 classes of the IP102 dataset.

A. Insect Pests

The proposed work includes 15 classes of crop insect pests, namely aphids, cicadellidae, flax budworm, flea beetle, cabbage butterfly, peachtree borer, prodenia litura, thrips, bird cherry-oat aphid, mole cricket, grub, wireworm, ampelophaga, lycoma delicatula, and xylotrechus. Each class in the IP102 dataset is highly unbalanced; the classes containing more images in the dataset are taken into consideration for the study. These insect pests cause considerable damage to the crops, leading to a loss in crop productivity.

B. Faster R-CNN

Faster R-CNN requires an image to be scaled to a certain length and width so that noise can be avoided, and with the introduction of a region proposal network the detection speed of insect pests is vastly improved [37]. The feature map is generated by the convolutional neural network layers for processing the images, and the identified object undergoes location regression and classification. We evaluate the three important steps involved in Faster R-CNN. Feature


maps were obtained from a pre-trained convolutional neural network framework, in particular Efficient Net [25], ResNet-50 [21], and dense convolutional neural networks [38]. Next, the Region Proposal Network generates the region proposals to detect the pests' locations in the image, and the regression box provides the exact location of the insect pests. The insect pest image processed by the region proposal generator is sent to region of interest pooling to identify and predict the accurate location of the insect pest image. Fig. 3 depicts the proposed Faster R-CNN framework for detection and classification.

Fig. 1. Efficient Net B7 Architecture.

Efficient Net is a unique scaling method that uniformly scales all depth/width/resolution dimensions using a compound coefficient. Neural architecture search is used to generate a brand-new baseline network and scale it up to create the Efficient Net family of models, which outperform prior convolutional networks in both efficiency and accuracy, reducing parameter size and FLOPS [39]. Width scaling is the process of changing the width of an input image: the larger the image, the more feature maps/channels are possible, and thus the more information is available to process [36]. Resolution scaling is the process of changing the resolution of an image; the higher an image's dots per inch, the higher its resolution, and better resolution is simply an increase in the number of pixels in an image. To scale the three dimensions, a baseline model called Efficient Net B0 was introduced; the Efficient Net models range from B0 to B7, where B0 is the baseline model. The size of the incoming image varies between models: as the model level increases, so does the input size. This flexible scaling strategy can be utilized to effectively scale convolutional neural networks and enhance accuracy across a variety of frameworks.

The input image is processed by MBConv bottlenecks, in which direct connections are used between bottlenecks with significantly fewer channels than the expansion layers in inverted residual blocks, as shown in Fig. 1. MBConv has attention blocks and is made up of a layer that expands and then compresses the channels, a mechanism that allows it to emphasize channel features that carry the most information while restricting less significant channel features. The gradient of MBConv does not quickly vanish when the network depth increases, thereby improving model performance. The regularization effect can be increased by using a swish function with no upper limit, wherein gradient saturation will not occur [25]. To improve performance, network drop connect is used to prevent over-fitting. The Efficient Net B4 and Efficient Net B7 models consist of nine phases with respect to blocks. Blocks provide effective layers, and their feature map is connected to the Region Proposal Network and region of interest pooling as shown in Fig. 3.

III. MATERIALS AND METHODS

A. Dataset

Capturing pest images is a difficult task, as all insect pests go through several phases during their life depending on the species and category of pest. The IP102 dataset is commonly used to test insect pest classification and detection based on deep learning methods [22]. As a result, we utilized pest images from the public IP102 dataset, which has around 75000 images pertaining to 102 insect pest species. For detection and classification, we chose 5, 10, and 15 classes of insect pests: a dataset of 14490 pest images for the five pest classes, 29210 images for 10 pest classes, and 43210 images for 15 pest classes. The pest images were split in the ratio of 80 % training, 10 % validation, and 10 % testing. Sample images of insect pests are shown in Fig. 2.

Fig. 2. Samples of Pest Images from the IP102 Dataset.

B. Proposed Framework for Detection and Classification

The pest images of the IP102 dataset are passed to the Efficient Net network, pre-trained on ImageNet, to generate the feature map. To improve performance, network drop connect and the swish function are utilised in Efficient Net. The feature map is passed to the RPN network to generate the bounding boxes and proposal scores for the pest images. The output of the RPN network and the feature map obtained from the Efficient Net backbone are passed to ROI pooling for detection and classification of pest images. The flow of the proposed Faster R-CNN framework for pest detection and classification is explained in detail in Sections III-C and III-D below.
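The 80/10/10 train/validation/test split described in the Dataset subsection can be sketched as follows. This is a minimal illustration, not the authors' code; the placeholder file names are assumptions.

```python
import random

def split_dataset(samples, train=0.8, val=0.1, seed=42):
    """Shuffle and split a list of image paths into train/val/test
    subsets in the 80/10/10 ratio used for the IP102 subsets."""
    items = list(samples)
    random.Random(seed).shuffle(items)  # reproducible shuffle
    n = len(items)
    n_train = int(n * train)
    n_val = int(n * val)
    return (items[:n_train],
            items[n_train:n_train + n_val],
            items[n_train + n_val:])

# e.g. the 14490 images of the five-class experiment
paths = [f"img_{i}.jpg" for i in range(14490)]
train_set, val_set, test_set = split_dataset(paths)
```

Note that 10 % of the 14490 five-class images gives 1449 test images, matching the test-set size reported for the five-class experiment.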


Fig. 3. Proposed Faster R-CNN Framework for Pest Detection and Classification.

C. Image Preprocessing and Augmentation

Images are transformed to (600, 600) in the pre-processing stage to retain the same aspect ratio, and images are normalized to maintain a standardized data distribution [25]. The importance of data augmentation for image classification with insufficient datasets has previously been proven. The categories of insect pests in the IP102 dataset are highly unbalanced; to increase the data while avoiding over-fitting, various data augmentation techniques such as rescaling, zooming, and horizontal flipping have been used. A Gaussian filter is first used to smooth the image. The images were rescaled, a mask was created for every image, and segmentation was applied to each sample. Each image in the dataset is subjected to this processing pipeline by a function.

D. Insect Pest Detection & Classification

The above-proposed learning architecture is used for image processing and to detect and classify pests using the Efficient Net and Faster R-CNN approach, as shown in Fig. 3. The convolutional neural network layers of Efficient Net B4 and Efficient Net B7 have been used as feature extractors for Faster R-CNN in this research because of the added advantage of their light weight and processing speed, which is critical for our end application. The pre-trained weights of Efficient Net were trained on the ImageNet dataset. The size of the input image for this methodology is fixed at 224 x 224. Using the Efficient Net model, we generate feature maps for an input image and pass them to the RPN.

The RPN takes these feature maps as input and outputs a set of rectangular proposals (bounding boxes) identifying the object, i.e. a pest, in the convolutional neural network feature map, along with an objectness score. A grid anchor with aspect ratios [0.25, 0.5, 1.0, 2.0] is started with a 16x16 pixel size during this stage. These anchors point to available objects of different sizes and aspect ratios at the corresponding location. Intersection over Union determines how well the bounding box matches the ground truth of the insect pest image, where A and B are the two sections of region proposals, as given in (1). To improve the performance of the model and reduce noise, non-maximum suppression is utilized to identify the bounding boxes with the highest confidence so that small overlaps are ignored. The threshold was kept at 0.7.

IoU = (A ∩ B) / (A ∪ B) (1)

To create the proposals for the object, the Faster R-CNN architecture is utilized. It has a specialized architecture with a classifier and a regressor. Faster R-CNN is robust against translations; translational invariance is one of its important properties.

When multi-scale anchors are present, Faster R-CNN creates a "pyramid of anchors" rather than a "pyramid of filters," which consumes less time and is more cost efficient compared to various other architectures. The next step is to pass the proposals to the region of interest pooling layer. Region of interest pooling is utilized to create a single feature map for each of the proposals provided by the RPN in a single pass. It is implemented to address the issue of fixed image size in object detection: ROI pooling creates fixed-size feature maps from non-uniform inputs by applying max-pooling over the inputs. This layer needs two inputs: (i) a feature map obtained from the EfficientNet B4 or EfficientNet B7 backbone used in our research methodology after multiple convolution and pooling layers, and (ii) 'N' proposals or regions of interest from the region proposal network (RPN).

The benefit of region of interest pooling is that we can utilize the corresponding feature map across all proposals, allowing us to pass the whole image through the convolutional neural network rather than passing each proposal separately. The sub-windows created by the region of interest pooling layer by applying max pooling have a size of (N, 7, 7, 512), where N represents the number of region proposals obtained from the RPN network. The features are moved into the classifier and regression sections after passing through two fully connected layers. Using the softmax function, the classification branch evaluates the probability of a region proposal comprising an insect pest. Additionally, Intersection over Union values are used to evaluate the accuracy of the bounding box generated around the insect pest. The anchor box coordinates are provided by the bounding box regression.
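The IoU score of Eq. (1) and the greedy non-maximum suppression step with threshold 0.7 can be sketched as follows. This is a generic illustration of the technique, not the authors' implementation; boxes are assumed to be (x1, y1, x2, y2) tuples.

```python
def iou(a, b):
    """Intersection over Union of two boxes (x1, y1, x2, y2), Eq. (1)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def nms(boxes, scores, thresh=0.7):
    """Greedy non-maximum suppression: keep the highest-scoring box,
    then drop every remaining box whose IoU with it exceeds thresh."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) <= thresh]
    return keep
```

With `thresh=0.7`, two proposals covering nearly the same pest collapse to the single highest-confidence box, while well-separated pests are all retained.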
E. Classification Performance Metrics
The performance in identifying the insects is measured using rotation estimation (cross-validation) to validate the tested insect pest images against the predicted classification results of the Faster R-CNN technique [26]. The confusion metrics are evaluated in terms of True Positive (TP), True Negative (TN), False Positive (FP), and False Negative (FN). TP indicates samples of the current insect pest class category that are correctly classified. TN pertains to samples of other groups that are correctly identified as not belonging to the current insect pest class category. FP pertains to samples of other insect pest classes incorrectly classified as the current insect pest class. FN relates to samples of the current insect pest class that were incorrectly classified as not belonging to it. The precision metric indicates, out of all points predicted to be positive, how many are actually positive. The recall metric indicates, out of all actually positive points, how many are predicted positive. The classification metrics are given below.
accuracy = (tp + tn) / (tp + fp + tn + fn) (2)

precision = tp / (tp + fp) (3)

recall = tp / (tp + fn) (4)

f1 score = (2 × precision × recall) / (precision + recall) (5)
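Eqs. (2)-(5) can be computed directly from the four confusion-matrix counts; the counts below are hypothetical values for illustration only.

```python
def classification_metrics(tp, tn, fp, fn):
    """Accuracy, precision, recall, and F1 score per Eqs. (2)-(5)."""
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return accuracy, precision, recall, f1

# hypothetical counts for one pest class
acc, p, r, f1 = classification_metrics(tp=90, tn=95, fp=5, fn=10)
```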

IV. RESULTS AND DISCUSSIONS

For this experiment, we used an i7 processor with a GPU (Nvidia RTX 3080 Ti) along with supporting tools such as Keras and TensorFlow for the detection and classification analysis of insect pest images of the IP102 dataset. The insect detection and classification method was evaluated on 5, 10, and 15 classes of insects. The insect pest images were split in the ratio of 80 % training, 10 % validation, and 10 % testing. The proposed Faster R-CNN model is trained using Stochastic Gradient Descent as the optimizer with a momentum value of 0.9, updating the region proposal network weights and the last fully connected layer weights. The learning rate governs the learning progress and the weight-parameter updates that reduce the loss; it was varied over 0.0005, 0.0001, and 0.001. The maximum number of training epochs was 40. The detection and classification results based on Faster R-CNN are shown in Fig. 4. The proposed Faster R-CNN technique can correctly detect insect pests in the image and identify their categories. For all test datasets of pest species, classification accuracy ranged from 97.00 % to 100.00 %.

Fig. 4. Sample of Pest Detection Results for the IP102 Dataset.

A performance indicator for pest detection, the average inference speed of the two methods, is shown in Fig. 5. As shown in Fig. 5, Faster R-CNN Efficient Net B7 runs at around 19.5 frames per second, compared to 20.7 frames per second for the other model.

Fig. 5. Speed for Insect Pest Detection based on Faster R-CNN.
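The SGD-with-momentum update used in training (momentum 0.9, learning rates between 0.0001 and 0.001) follows the classic velocity form. A minimal sketch, with a toy quadratic loss standing in for the detection loss:

```python
def sgd_momentum_step(w, grad, velocity, lr=0.0005, momentum=0.9):
    """One SGD-with-momentum update: velocity accumulates a
    decaying sum of past gradients, then nudges the weight."""
    velocity = momentum * velocity - lr * grad
    return w + velocity, velocity

# toy example: minimize (w - 3)^2, whose gradient is 2 * (w - 3)
w, v = 0.0, 0.0
for _ in range(2000):
    w, v = sgd_momentum_step(w, 2 * (w - 3), v, lr=0.001)
```

After enough steps the weight settles near the minimizer w = 3; in the real model the same rule is applied per parameter by the framework's SGD optimizer.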


The model performance for five pest classes based on the Faster R-CNN Efficient Net B7 and Faster R-CNN Efficient Net B4 models is shown in Fig. 6 and Fig. 7. The learning rate was reduced by a factor of 0.5 when the improvement during training went negative. The model was trained with a stop patience of seven, i.e. if there was negative improvement for seven continuous epochs, training was halted automatically. A validation accuracy of around 99.00 % was achieved during training, and the validation loss decreased progressively to 0.4 % for the Faster R-CNN Efficient Net B7 model. Similarly, a validation accuracy of 98.00 % and a loss of 0.6 % were obtained for the Faster R-CNN Efficient Net B4 model.

Fig. 6. Model Performance for 5 Pest Classes based on Faster R-CNN Efficient Net B7.

Fig. 7. Model Performance for 5 Pest Classes based on Faster R-CNN Efficient Net B4.

The model performance for 10 pest classes based on the Faster R-CNN Efficient Net B7 and Faster R-CNN Efficient Net B4 models is shown in Fig. 8 and Fig. 9. A validation accuracy of around 96.00 % was achieved during training and the validation loss decreased progressively to 0.6 % for the Faster R-CNN Efficient Net B7 model. Similarly, a validation accuracy of 95.00 % and a loss of 0.7 % were obtained for the Faster R-CNN Efficient Net B4 model.

Fig. 8. Model Performance for 10 Pest Classes based on Faster R-CNN Efficient Net B7.

Fig. 9. Model Performance for 10 Pest Classes based on Faster R-CNN Efficient Net B4.

Similarly, we investigated the model performance for 15 pest classes based on the Faster R-CNN Efficient Net B7 and Faster R-CNN Efficient Net B4 models, as shown in Fig. 10 and Fig. 11. A validation accuracy of around 93.00 % was achieved during training and the validation loss decreased progressively to 0.67 % for the Faster R-CNN Efficient Net B7 model. A validation accuracy of 86.00 % and a loss of 0.72 % were obtained for the Faster R-CNN Efficient Net B4 model.

Fig. 12, Fig. 13, and Fig. 14 show the confusion matrices for 5, 10, and 15 pest classes of the IP102 dataset during testing for the Faster R-CNN Efficient Net B7 model. In the five-class case, fractions of 0.006 of aphid, 0.019 of cabbage butterfly, 0.006 of cicadellidae, and 0.042 of flea beetle images are incorrectly classified. Flax budworm is correctly classified with a ratio of one.

Fig. 10. Model Performance for 15 Pest Classes based on Faster R-CNN Efficient Net B7.

Fig. 11. Model Performance for 15 Pest Classes based on Faster R-CNN Efficient Net B4.
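The training schedule described above, halving the learning rate on a non-improving epoch and stopping after seven consecutive non-improving epochs, can be sketched as a small callback-style helper. The class name, epoch loop, and accuracy history are illustrative assumptions, not the authors' code.

```python
class TrainController:
    """Halve the learning rate on each non-improving epoch; stop
    after `patience` consecutive non-improving epochs (7 here)."""
    def __init__(self, lr, factor=0.5, patience=7):
        self.lr, self.factor, self.patience = lr, factor, patience
        self.best = float("-inf")
        self.bad_epochs = 0

    def update(self, val_accuracy):
        if val_accuracy > self.best:
            self.best = val_accuracy
            self.bad_epochs = 0
        else:                        # negative improvement
            self.bad_epochs += 1
            self.lr *= self.factor   # reduce LR by a factor of 0.5
        return self.bad_epochs >= self.patience  # True -> halt training

# illustrative validation-accuracy history over the epochs
ctrl = TrainController(lr=0.001)
history = [0.90, 0.92, 0.92, 0.91, 0.93, 0.93,
           0.93, 0.93, 0.93, 0.93, 0.93, 0.93]
stopped_at = None
for epoch, acc in enumerate(history):
    if ctrl.update(acc):
        stopped_at = epoch
        break
```

In a Keras setup this behaviour is conventionally obtained with `ReduceLROnPlateau(factor=0.5)` and `EarlyStopping(patience=7)` callbacks.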


Fig. 12. Confusion Matrix for 5 Pest Classes for the Faster R-CNN Efficient Net B7 Model.

Fig. 13. Confusion Matrix for 10 Pest Classes for the Faster R-CNN Efficient Net B7 Model.

Fig. 14. Confusion Matrix for 15 Pest Classes for the Faster R-CNN Efficient Net B7 Model.

Fig. 15 to Fig. 17 show the confusion matrices for 5, 10, and 15 pest classes during testing for the Faster R-CNN Efficient Net B4 model. In the five-class case, fractions of 0.016 of aphid, 0.027 of cabbage butterfly, 0.006 of cicadellidae, and 0.084 of flea beetle images are incorrectly classified. Flax budworm is correctly classified with a ratio of 1.

Fig. 15. Confusion Matrix for 5 Pest Classes for the Faster R-CNN Efficient Net B4 Model.

Fig. 16. Confusion Matrix for 10 Pest Classes for the Faster R-CNN Efficient Net B4 Model.

Fig. 17. Confusion Matrix for 15 Pest Classes for the Faster R-CNN Efficient Net B4 Model.

Fig. 18 illustrates the classification report for the test dataset for 5, 10, and 15 pest classes using Faster R-CNN Efficient Net B7 on the IP102 dataset. Classification accuracies of 99.00 %, 96.00 %, and 93.00 % are achieved for the 5, 10, and 15 pest classes based on Faster R-CNN Efficient Net B7. For the five pest classes, per-class accuracy ranges from 98 % to 100 %. The precision, recall, and F1 score are 99.00 % for the 5-class test data, around 96.00 % for the 10-class test data, and 93.00 % for the 15-class test data.
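The per-class misclassified fractions quoted from the confusion matrices (e.g. 0.016 for aphids, a ratio of 1 correct for flax budworm) come from normalizing each true-class row; the matrix values below are made up for illustration and are not the paper's results.

```python
def per_class_rates(confusion, class_index):
    """For one true-class row of a confusion matrix, return the
    fraction classified correctly and the fraction misclassified."""
    row = confusion[class_index]
    total = sum(row)
    correct = row[class_index] / total
    return correct, 1.0 - correct

# hypothetical 3-class matrix: rows = true class, cols = predicted
cm = [[984,  10,   6],
      [  0, 500,   0],
      [ 12,   4, 984]]
correct, wrong = per_class_rates(cm, 0)  # class 0 row
```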


Fig. 18. Classification Report for 5, 10 and 15 Pest Classes for Faster R-CNN Efficient Net B7.

Fig. 19 illustrates the classification report for the test dataset for 5, 10, and 15 pest classes using Faster R-CNN Efficient Net B4. Classification accuracy of 98.00 %, 95.00 %, and 90.00 % is achieved for 5, 10, and 15 pest classes, respectively. The precision, recall, and F1 score are 98.00 % for 5 classes, around 95.00 % for 10 classes, and 90.00 % for 15 classes.

F. Comparative Analysis

Y. Liu et al. investigated a back-propagation neural network on a five-class subset of the IP102 dataset and achieved classification accuracies of 63 %, 50 %, and 43.5 % for 10 %, 20 %, and 30 % test data [34]. Compared with the BP neural network, the Single Shot Multi-box Detector performed better at identifying crop pests, achieving an accuracy of 90.6 % for five pest classes [35]. Kasinathan et al. proposed a CNN model for five pest classes and obtained an accuracy of 93.9 % [22]. Our Faster R-CNN model outperformed both models, with classification accuracies of 99.00 % for 10 % test data, 98.4 % for 20 % test data, and 95.5 % for 30 % test data. As the training share of pest images increases from 70 % to 90 %, classification accuracy improves. For 30 % test data, our Faster R-CNN model reaches an accuracy of 95.5 %, whereas the BP neural network and SSD Mobile Net reach around 43.5 % and 85.70 %, as shown in Fig. 20.

A comparison with other existing methods for 9- and 10-class crop pests is shown in Fig. 21. Among these models, the bio-inspired method achieved an accuracy of 92.50 % [12]; all of these models apply deep learning methodology to detect crop pests. Our proposed Faster R-CNN model outperforms the other existing methods, achieving an average accuracy of 96.00 % for the 10-class crop pest test dataset.

The performance of the proposed Faster R-CNN Efficient Net B7 and Faster R-CNN Efficient Net B4 methods is compared with existing methods on the IP102 dataset in Table I. From Table I we can infer that the proposed Faster R-CNN Efficient Net B7 method outperforms the latest competitive approaches in terms of accuracy for 5 and 10 class crop pests.
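The per-class precision, recall, and F1 values reported in the classification reports are standard derivations from a confusion matrix. A small self-contained sketch, using an illustrative 3-class matrix rather than the IP102 results:

```python
def prf(matrix, c):
    """Precision, recall, and F1 for class index c of a confusion matrix
    (rows = true labels, columns = predictions)."""
    n = len(matrix)
    tp = matrix[c][c]
    fp = sum(matrix[r][c] for r in range(n)) - tp  # predicted c, true class differs
    fn = sum(matrix[c]) - tp                       # true c, predicted otherwise
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

# Illustrative 3-class matrix (made-up counts, not the paper's data).
cm = [[50,  2,  3],
      [ 4, 45,  1],
      [ 1,  2, 47]]

for c in range(3):
    p, r, f = prf(cm, c)
    print(f"class {c}: precision={p:.2f} recall={r:.2f} f1={f:.2f}")
```

Averaging these per-class values over all classes gives the macro-averaged figures a classification report typically prints alongside overall accuracy.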

Fig. 19. Classification Report for 5, 10 and 15 Pest Classes for Faster R-CNN Efficient Net B4.


[Bar chart "Comparison of 5 Pest classes": classification accuracy (%) for 10 %, 20 %, and 30 % test data — BP Neural Network: 63, 50, 43.5; SSD Mobile Net: 90.6, 86, 85.7; Proposed Faster R-CNN: 99, 98.4, 95.5.]

Fig. 20. Comparison of 5 Pest Classes with Existing Methods.
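The comparison charted in Fig. 20 can also be tabulated programmatically; a short sketch using the accuracies read off the chart:

```python
# Classification accuracy (%) for the 5-pest-class experiment (values from
# Fig. 20) across 10 %, 20 %, and 30 % test-data splits.
accuracy = {
    "BP Neural Network":     {0.10: 63.0, 0.20: 50.0, 0.30: 43.5},
    "SSD Mobile Net":        {0.10: 90.6, 0.20: 86.0, 0.30: 85.7},
    "Proposed Faster R-CNN": {0.10: 99.0, 0.20: 98.4, 0.30: 95.5},
}

# Report the best-performing classifier for each test split.
for split in (0.10, 0.20, 0.30):
    best = max(accuracy, key=lambda model: accuracy[model][split])
    print(f"{split:.0%} test data: best = {best} ({accuracy[best][split]} %)")
```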

[Bar chart "Comparison for 9 and 10 Pest Classes": classification accuracy (%) — CapsNet: 82.4; DCNNT: 84.7; MS-CapsNet: 89.6; ResNet50: 85.5; Bio-inspired Model: 92.5; CNN: 91.5; Proposed Faster R-CNN: 96.]

Fig. 21. Comparison of 10 Pest Classes with Existing Methods.

TABLE I. MODEL COMPARISON ON IP102 DATASET FOR CROP PESTS

Research                          | Technique                    | Accuracy | Classes
Y. Liu et al. (2016) [34]         | BP Neural Network            | 63.00 %  | 5
W. Liu et al. (2016) [35]         | SSD Mobile Net               | 90.60 %  | 5
Iandola et al. (2016) [27]        | SqueezeNet                   | 67.51 %  | 8
Ning et al. (2017) [28]           | SSD MobileNet                | 92.12 %  | 8
Ning et al. (2017) [28]           | SSD Inception                | 93.47 %  | 8
Li et al. (2018) [29]             | CapsNet                      | 82.40 %  | 9
Thenmozhi and Reddy (2019) [18]   | DCNNT                        | 84.70 %  | 9
Cui et al. (2019) [30]            | Yolov2                       | 87.66 %  | 8
Wang et al. (2019) [19]           | MS-CapsNet                   | 89.60 %  | 9
Yan et al. (2020) [31]            | ResNet50                     | 85.50 %  | 9
Noor et al. (2020) [20]           | GoogleNet                    | 88.80 %  | 8
Nanni et al. (2020) [12]          | Bio-inspired Model           | 92.40 %  | 10
Balakrishnan et al. (2020) [21]   | Faster-RCNN ResNet50         | 96.06 %  | 8
Kasinathan et al. (2021) [22]     | CNN                          | 91.50 %  | 9
Kasinathan et al. (2021) [22]     | CNN                          | 93.90 %  | 5
Chen et al. (2022) [32]           | AlexNet                      | 80.30 %  | 9
Xu et al. (2022) [33]             | MSCC                         | 92.40 %  | 9
Proposed                          | Faster RCNN Efficient Net B4 | 98.00 %  | 5
Proposed                          | Faster RCNN Efficient Net B4 | 95.00 %  | 10
Proposed                          | Faster RCNN Efficient Net B4 | 90.00 %  | 15
Proposed                          | Faster RCNN Efficient Net B7 | 99.00 %  | 5
Proposed                          | Faster RCNN Efficient Net B7 | 96.00 %  | 10
Proposed                          | Faster RCNN Efficient Net B7 | 93.00 %  | 15


V. CONCLUSION

In this study, the Faster R-CNN method was investigated for detecting and classifying insect pests across 5, 10, and 15 classes, and the results were compared. To improve performance and accuracy, each pest image was resized, pre-processed, and augmented to enlarge the dataset. When the image background is challenging and the insect classes are numerous, as in the IP102 dataset, our proposed Faster R-CNN Efficient Net B7 model achieved average classification accuracies of 99.00 %, 96.00 %, and 93.00 % for 5, 10, and 15 class insect pests, outperforming existing models such as SSD Mobile Net, the bio-inspired method, and Faster R-CNN ResNet 50. In future work, the proposed Faster R-CNN model will be applied to a larger number of insect classes and subclasses of insect pests, helping farmers detect and classify insect pests.

REFERENCES
[1] Xie, C., Zhang, J., Li, R., Li, J., Hong, P., Xia, J., Chen, P., "Automatic classification for field crop insects via multiple-task sparse representation and multiple kernel learning," Comput. Electron. Agric. 2015, 119, 123–132.
[2] Gaston, K.J., "The Magnitude of Global Insect Species Richness," Conserv. Biol. 2010, 5, 283–296.
[3] Siemann, E., Tilman, D., Haarstad, J., "Insect species diversity, abundance and body size relationships," Nature 1996, 380, 704–706.
[4] Zhang, H., Huo, Q., Ding, W., "The application of AdaBoost-neural network in stored product insect classification," In Proceedings of the IEEE International Symposium on IT in Medicine and Education, Xiamen, China, 12–14 December 2009; pp. 973–976.
[5] Wu, X., Zhan, C., Lai, Y.K., Cheng, M.M., Yang, J., "IP102: A Large-scale Benchmark Dataset for Insect Pest Recognition," In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Long Beach, CA, USA, 15–20 June 2019; pp. 8779–8788.
[6] Xie, C., Wang, R., Zhang, J., Chen, P., Li, R., Chen, T., Chen, H., "Multi-level learning features for automatic classification of field crop pests," Comput. Electron. Agric. 2018, 152, 233–241.
[7] Deng, L., "Research on insect pest image detection and recognition based on bio-inspired methods," Biosystems Engineering, Elsevier, 169, pp. 139–148.
[8] Johny L. Miranda, B. Gerado, Bartolome T. Tanguilg, "Pest Detection and Extraction Using Image Processing Techniques," International Journal of Computer and Communication Engineering, DOI: 10.7763/IJCCE.2014.V3.317.
[9] Faithpraise Fina, Philip Birch, Rupert Young, J. Obu, Bassey Faithpraise, Chris Chatwin, "Automatic plant pest detection and recognition using k-means clustering algorithm and correspondence filters," International Journal of Advanced Biotechnology and Research, Vol. 4, Issue 2, 2013, pp. 189–199.
[10] Xi Cheng, Youhua Zhang, Yiqiong Chen, Yunzhi Wu, Yi Yue, "Pest identification via deep residual learning in complex background," Computers and Electronics in Agriculture, Volume 141, September 2017, pp. 351–356.
[11] M.E. Karar, "Robust RBF neural network-based backstepping controller for implantable cardiac pacemakers," Int. J. Adapt. Control Signal Process. 32 (2018) 1040–1051.
[12] Loris Nanni, Gianluca Maguolo, Fabio Pancino, "Insect pest image detection and recognition based on bio-inspired methods," Ecological Informatics, https://doi.org/10.1016/j.ecoinf.2020.101089.
[13] Limiao Deng, Yanjiang Wang, Zhongzhi Han, Renshi Yu, "Research on insect pest image detection and recognition based on bio-inspired methods," Biosystems Engineering, https://www.sciencedirect.com/journal/biosystems-engineering.
[14] Hieu T. Ung, Huy Q. Ung, Binh T. Nguyen, "An Efficient Insect Pest Classification Using Multiple Convolutional Neural Network Based Models," arXiv:2107.12189v1 [cs.CV], 26 Jul 2021.
[15] Chen, H.C., Widodo, A.M., Wisnujati, A., Rahaman, M., Lin, J.C.W., Chen, L., Weng, C.E., "AlexNet Convolutional Neural Network for Disease Detection and Classification of Tomato Leaf," Electronics 2022, 11, 951.
[16] Thanh-Nghi Doan, "An Efficient System for Real-time Mobile Smart Device-based Insect Detection," International Journal of Advanced Computer Science and Applications, Vol. 13, No. 6, 2022.
[17] Gutierrez, A., Ansuategi, A., Susperregi, L., Tubío, C., Rankić, I., Lenža, L., "A benchmarking of learning strategies for pest detection and identification on tomato plants for autonomous scouting robots using internal databases," Journal of Sensors 2019.
[18] Thenmozhi, K., Reddy, U.S., "Crop Pest Classification Based on Deep Convolutional Neural Network and Transfer Learning," Comput. Electron. Agric. 2019, 164, 104906.
[19] Wang, D., Xu, Q., Xiao, Y., Tang, J., Bin, L., "Multi-scale Convolutional Capsule Network for Hyperspectral Image Classification," In Chinese Conference on Pattern Recognition and Computer Vision; Springer International Publishing: Cham, Switzerland, 2019; pp. 749–760.
[20] Khalifa, N.E., Loey, M., Taha, M., "Insect Pests Recognition Based on Deep Transfer Learning Models," J. Theor. Appl. Inf. Technol. 2020, 98, 60–68.
[21] Balakrishnan Ramalingam, Mohan, R.E., Pookkuttath, S., Gómez, B.F., Sairam Borusu, C.S.C., Wee Teng, T., Tamilselvam, Y.K., "Remote Insects Trap Monitoring System Using Deep Learning Framework and IoT," Sensors 2020, 20, 5280.
[22] Kasinathan, T., Singaraju, D., Uyyala, S.R., "Insect Classification and Detection in Field Crops Using Modern Machine Learning Techniques," Inf. Process. Agric. 2021, 8, 446–457.
[23] Mohammed Esmail Karar, Alsunaydi, F., Albusaymi, S., Alotaibi, S., "A New Mobile Application of Agricultural Pests Recognition Using Deep Learning in Cloud Computing System," Alex. Eng. J. 2021, 60, 4423–4432.
[24] Mamoru Mimura, "Impact of benign sample size on binary classification accuracy," Expert Systems With Applications, Volume 211, January 2022, 118630.
[25] Tang Yu, Wang Chen, Gao Junfeng, Hua Poxi, "Intelligent Detection Method of Forgings Defects Detection Based on Improved EfficientNet and Memetic Algorithm," IEEE Access, DOI: 10.1109/ACCESS.2022.3193676.
[26] M. Sokolova, G. Lapalme, "A systematic analysis of performance measures for classification tasks," Inf. Process. Manage. 45 (2009) 427–437.
[27] Iandola, F.N., Han, S., Moskewicz, M.W., Ashraf, K., Dally, W.J., Keutzer, K., "SqueezeNet: AlexNet-level Accuracy with 50x Fewer Parameters and 0.5 MB Model Size," arXiv 2016, arXiv:1602.07360.
[28] Ning, C., Zhou, H., Song, Y., Tang, J., "Inception Single Shot MultiBox Detector for Object Detection," In Proceedings of the 2017 IEEE International Conference on Multimedia & Expo Workshops, Hong Kong, China, 10–14 July 2017; pp. 549–554.
[29] Li, Y., Qian, M., Liu, P., Cai, Q., Li, X., Guo, J., Yan, H., Yu, F., Yuan, K., Yu, J., et al., "The Recognition of Rice Images by UAV Based on Capsule Network," Clust. Comput. 2018, 22, 9515–9524.
[30] Cui, J., Zhang, J., Sun, G., Zheng, B., "Extraction and Research of Crop Feature Points Based on Computer Vision," Sensors 2019, 19, 2553.
[31] Yan, P., Su, Y., Tian, X., "Classification of Mars Lineament and Non-lineament Structure Based on ResNet50," In Proceedings of the 2020 IEEE International Conference on Advances in Electrical Engineering and Computer Applications, Dalian, China, 25–27 August 2020; pp. 437–441.
[32] Chen, H.C., Widodo, A.M., Wisnujati, A., Rahaman, M., Lin, J.C.W., Chen, L., Weng, C.E., "AlexNet Convolutional Neural Network for Disease Detection and Classification of Tomato Leaf," Electronics 2022, 11, 951.


[33] Xu, C., Yu, C., Zhang, S., Wang, X., "Multi-Scale Convolution-Capsule Network for Crop Insect Pest Recognition," Electronics 2022, 11, 1630.
[34] Y. Liu, W. Jing, L. Xu, "Parallelizing Backpropagation Neural Network Using MapReduce and Cascading Model," Comput. Intell. Neurosci. 2016 (2016) 2842780.
[35] W. Liu, D. Anguelov, D. Erhan, C. Szegedy, S. Reed, C.-Y. Fu, A.C. Berg, "SSD: Single Shot MultiBox Detector," in: B. Leibe, J. Matas, N. Sebe, M. Welling (Eds.), Computer Vision – ECCV 2016, Springer International Publishing, Cham, 2016, pp. 21–37.
[36] A.G. Howard, M. Zhu, B. Chen, D. Kalenichenko, W. Wang, T. Weyand, M. Andreetto, H. Adam, "MobileNets: Efficient Convolutional Neural Networks for Mobile Vision Applications," arXiv:1704.04861 (2017).
[37] R. Girshick, "Fast R-CNN," in Proc. IEEE Int. Conf. Comput. Vis. (ICCV), Santiago, Chile, Dec. 2015, pp. 1440–1448. doi: 10.1109/ICCV.2015.169.
[38] G. Huang, Z. Liu, L. Van Der Maaten, K.Q. Weinberger, "Densely Connected Convolutional Networks," Proceedings – 30th IEEE Conference on Computer Vision and Pattern Recognition (CVPR 2017), IEEE, 2017, pp. 2261–2269.
[39] Mingxing Tan, Quoc V. Le, "EfficientNet: Rethinking Model Scaling for Convolutional Neural Networks," Proceedings of the 36th International Conference on Machine Learning, Long Beach, California, PMLR 97, 2019.
[40] Sabanci, K., "Detection of sunn pest-damaged wheat grains using artificial bee colony optimization-based artificial intelligence techniques," Journal of the Science of Food and Agriculture, 100(2), pp. 817–824.
[41] Shaohua Wan, Sotirios Goudos, "Faster R-CNN for multi-class fruit detection using a robotic vision system," Computer Networks, Vol. 168, February 2020.
[42] Dengshan Li, Rujing Wang, Chengjun Xie, Liu Liu, Jie Zhang, Rui Li, Fangyuan Wang, Man Zhou, Wancai Liu, "A Recognition Method for Rice Plant Diseases and Pests Video Detection Based on Deep Convolutional Neural Network," Sensors 2020, 20, 578; doi:10.3390/s20030578.
[43] Ngugi, L.C., Abelwahab, M., Abo-Zahhad, M., "Recent advances in image processing techniques for automated leaf pest and disease recognition – A review," Inf. Process. Agric. 2021, 8, 27–51.
[44] Alves, A.N., Souza, W.S.R., Borges, D.L., "Cotton pests classification in field-based images using deep residual networks," Computers and Electronics in Agriculture 2020, 174.

