An Efficient System for Real-time Mobile Smart Device-based Insect Detection


(IJACSA) International Journal of Advanced Computer Science and Applications, Vol. 13, No. 6, 2022

An Efficient System for Real-time Mobile Smart Device-based Insect Detection

Thanh-Nghi Doan
Faculty of Information Technology, An Giang University
Vietnam National University Ho Chi Minh City, An Giang, Vietnam

Abstract—In recent years, the rapid development of many pests and diseases has caused heavy damage to the agricultural production of many countries. However, it is difficult for farmers to accurately identify each type of insect pest, and yet they have used a large number of pesticides indiscriminately, causing serious environmental pollution. Meanwhile, spraying pesticides is very expensive, and thus developing a system to identify crop-damaging pests early will help farmers save a lot of money while also contributing to the development of sustainable agriculture. This paper presents a new efficient deep learning system for real-time insect image recognition on mobile devices. Our system achieved an mAP@0.5 accuracy with the YOLOv5-S model of 70.5% on the 10-class insect dataset and 42.9% on the IP102 large-scale insect dataset. In addition, our system can provide more information to farmers about insects, such as biological characteristics, distribution, morphology, and pest control measures. From there, farmers can take appropriate measures to prevent pests and diseases, helping reduce production costs and protecting the environment.

Keywords—Deep learning; real-time insect pest detection; YOLOv5; mobile devices

I. INTRODUCTION

Climate change has caused pests to multiply, grow quickly, and cause significant damage to the world's agricultural economy [1]. Pests are estimated to cost up to 40% of worldwide agricultural output each year, according to the Food and Agriculture Organization. At present, plant diseases cost the global economy almost 220 billion each year, while invading insects cost at least 70 billion [2]. Therefore, farmers in many countries have used a large number of different pesticides to protect crops and ensure the quality of agricultural products. However, due to a lack of specialized knowledge, many farmers have difficulty detecting and correctly identifying pests and diseases that cause crop damage. As a result, most farmers did not have reasonable pest control measures, including the indiscriminate and improper use of a large number of pesticides on a large scale. This not only increases production costs but also seriously pollutes the environment, destroys beneficial insects, disrupts ecosystem balance, and damages the health and living environment of humans and many other species.

As a result, it is critical to research information technology systems in order to accurately, efficiently, quickly, and conveniently identify pests and diseases that harm crops. This system will aid in the resolution of the aforementioned issues, thereby contributing significantly to long-term agricultural development. Such a system must be designed for real-time identification, be simple to install and use, and be appropriate for farmers' level of knowledge and actual working conditions, where each farmer typically has a smartphone with a basic configuration. Therefore, an automatic system to identify pests on plants using inexpensive smartphones must be developed and deployed. The primary goal is to efficiently detect insects in a real-time manner, providing farmers with greater convenience and mobility in early pest treatment.

Although smartphones have penetrated a variety of industries, including manufacturing, medicine, and health care, the use of mobile devices in agriculture has been slower. Farmers understand the need for mobile agriculture as technology advances, which not only allows farmers to execute agricultural activities more effectively using their phones, but also transforms arable farming into smart agriculture. In this research, a real-time insect object detection system is built in the context of large-scale insect pest datasets. Our system is based on the YOLOv5-S model and has been integrated onto mobile devices with limited hardware configurations, making it ideal for farmers in the field.

II. BACKGROUND STUDY

Much of the prior research has presented real-time image-based recognition systems for mobile devices based on various CNN architectures. To recognize leaves from images, the authors of [3] have developed a novel extraction and classification technique. The insect population and illness regions in the segmented images are then calculated using a region-labeling technique. A mathematical morphological algorithm is utilized to separate the items in the zones of adhesion. The proposed solution is tested in the field and deployed on mobile smart devices. The experimental findings reveal that the suggested technique has high efficiency and strong recognition performance. The authors of [4] have created a pest infestation early warning system for paddy farming that includes an Android application and a web-based application. The Agriculture Department will use the technology to identify insect infestations, locate them, and alert the early warning system. The technology will be able to enter the farmers' infestation data into databases. The data will be utilized by the agronomist to assess the paddy plot's risk in four stages. The number of pests, kind of pest, location, and present circumstances will be used to classify each stage. After the agronomist has completed their review, the system will send an email to the farmers informing them of the quality of their current paddy plot. The researchers from [5] suggested an image processing technique and a smartphone application to recognize and count insects.
The nonuniform brightness of insect images obtained with mobile phones is addressed using a sliding window-based binarization, and then connected domain-based histogram statistics are utilized to identify and count the insects in stored grain. Finally, testing using an Android application shows that the proposed technique can count random bug photographs from mobile phones with 95% accuracy, which is superior to the previous method. In [6], MAESTRO, a novel grasshopper identification framework that employs deep learning to recognize insects in RGB pictures, is demonstrated. MAESTRO uses a state-of-the-art two-stage deep learning training approach. The framework may be used on cellphones as well as desktop PCs. The authors of [7] offer an AI-based pest detection system that addresses the challenge of identifying scale pests using photos. Scale pests are detected and localized in the image using deep-learning-based object identification models such as faster region-based convolutional networks, single-shot multibox detectors, and YOLOv4. Among the algorithms, YOLOv4 had the highest classification accuracy, with 100% in mealybugs, 89% in Coccidae, and 97% in Diaspididae. A smartphone application based on the trained scale insect detection model has been developed to assist farmers in identifying pests and administering appropriate pesticides to reduce crop losses. The researchers in [8] have studied the best machine learning approach for developing a pest detection model for mobile information systems. The article [9] proposed a novel smartphone application that uses a deep-learning method to automatically categorize pests for the benefit of professionals and farmers. Faster R-CNN is used in the created application to do insect pest recognition using cloud computing. To assist farmers, a database of suggested pesticides is linked to the reported crop pests. This research has been validated for five distinct pest species.
The suggested Faster R-CNN had the greatest accuracy in identification rate of 99% for all pest images analyzed. The study [10] provided a novel method for establishing the use of hand-held image capture of insect traps for pest detection in vineyards by embedding artificial intelligence into mobile devices. Their solution integrates many computer vision technologies to enhance numerous areas of picture quality and appropriateness. The extensive review [11] examines deep learning framework methodologies and applications in smart pest monitoring, with a focus on insect pest categorization and detection using field photos. The methodology and technical information created in insect pest classification and detection using deep learning are consolidated and distilled during multiple processing stages: picture collection, data preprocessing, and modeling strategies. Finally, a generic framework for smart insect monitoring is proposed, and future challenges and trends are discussed. In AlertTrap [12], SSD architecture implementation with different cutting-edge backbone feature extractors, such as MobileNetV1 and MobileNetV2, appears to be a viable solution to the real-time detection problem. SSD-MobileNetV1 and SSD-MobileNetV2 work well, with AP@0.5 rates of 0.957 and 1.0, respectively. YOLOv4-tiny surpasses the SSD family in AP@0.5 with 1.0; nevertheless, its throughput velocity is significantly slower, indicating that SSD models are better candidates for real-time implementation. They also ran the models via synthetic test sets that simulated predicted environmental disruptions. The YOLOv4-tiny tolerated these disruptions better than the SSD variants. By combining EfficientNet [13] and Power mean SVM [14], the authors of the research [15] published the state of the art on insect image classification on the large-scale IP102 dataset with an accuracy of up to 71.84%.
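The sliding-window binarization and connected-domain counting pipeline described in [5] can be sketched roughly as follows. The window size and threshold offset below are illustrative assumptions, not values from that paper, and `scipy.ndimage.label` stands in for the connected-domain statistics step:

```python
import numpy as np
from scipy import ndimage

def count_insects(gray, window=32, offset=10):
    """Adaptive (sliding-window) binarization followed by
    connected-component counting. `window` and `offset` are
    illustrative parameters, not values from [5]."""
    h, w = gray.shape
    binary = np.zeros_like(gray, dtype=bool)
    for y in range(0, h, window):
        for x in range(0, w, window):
            patch = gray[y:y + window, x:x + window]
            # Pixels darker than the local mean minus an offset
            # are treated as insect foreground.
            binary[y:y + window, x:x + window] = patch < patch.mean() - offset
    # Each connected foreground region is counted as one insect.
    labels, n = ndimage.label(binary)
    return n

# Two dark blobs on a bright background.
img = np.full((64, 64), 200, dtype=np.int32)
img[10:15, 10:15] = 50
img[40:45, 40:45] = 50
print(count_insects(img))  # → 2
```

The local (per-window) threshold is what makes the method robust to the nonuniform phone-camera lighting the authors mention; a single global threshold would fail when one corner of the image is much darker than another.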
However, the abovementioned systems still have some limitations, such as the small number of pests identified; the accuracy is not high; the equipment configuration requirements are high; and they are difficult to deploy in practice. They lack aspects such as geolocation recording of recognized harmful pests, information about identified dangerous pests, and robust distributed mobile information frameworks. Currently, there is no existing real-time identification system for mobile devices. Therefore, this paper proposes a new real-time insect identification system with reasonable cost, efficiency, easy installation, and practical deployment on mobile devices with limited hardware configuration. Furthermore, this study also looks at lightweight network models and embedded terminal realizations, both of which are increasingly relevant and promising. The paper's main contributions are as follows:

- A novel real-time insect identification system that is ideal for mobile devices with restricted hardware configuration, easy to install, inexpensive, and user-friendly.
- The most current identification results using YOLOv5-S on the large-scale dataset IP102 are presented.
- A new system captures images and uses GPS to determine the distribution of insects in the field. This contributes to the development of a large insect database and insect distribution maps.

The rest of the article is arranged as follows. Section III describes the materials and methods used to evaluate our approach, including an overview of our system, the YOLOv5 model, and the pest insect image datasets. The experimental results and discussion are reported in Section IV. Section V presents the conclusions, limitations, and recommendations for future research.

III. MATERIALS AND METHODS

A. Overview of our System

An overview of our real-time insect identification system is shown in Fig. 1.
Users can first use their mobile phones to photograph insects in a real-time manner, or they can use insect photographs found on the internet or images captured by bug traps. The YOLOv5-S model, which is already embedded into the mobile application, then identifies the insect image in real time, resulting in a very quick insect identification time. When an insect image is properly identified, the system will provide the user with detailed information on the insect, such as its name, biological characteristics, distribution, morphology, and control strategies. Our new insect recognition system can work in both online and offline mode. In the online mode, the insect identification information is sent to the Web server, which then processes and returns detailed insect information in JSON format [16]. Insect information can be viewed alongside similar images in the data warehouse. The user can also see a list of all insects, complete with detailed information and images. Users can upload insect images and shooting locations to update the data warehouse at the same time in this mode.
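The exact JSON schema returned by the paper's web server is not published; a client-side sketch of the online-mode exchange might look like the following, where all field names and values are hypothetical assumptions:

```python
import json

# Hypothetical server response for one recognized insect.
# Field names are illustrative, not the paper's actual schema.
response = """{
  "name": "Leptinotarsa decemlineata",
  "confidence": 0.93,
  "morphology": "Orange-yellow beetle with black stripes",
  "distribution": "North America, Europe, Asia",
  "control": ["crop rotation", "approved insecticides"]
}"""

info = json.loads(response)
print(f"{info['name']} ({info['confidence']:.0%})")
for measure in info["control"]:
    print("-", measure)
```

A structured payload like this lets the mobile client render the name, biology, distribution, and control measures in separate UI sections without any extra server round-trips.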

The entire database will be stored on the server in the online mode, making it suitable for mobile devices with limited hardware configuration and ensuring that information is always up-to-date. The application's speed, however, is determined by the available network bandwidth. In the offline mode, SQLite [17], a C-language package that creates a compact, fast, self-contained, high-reliability, full-featured SQL database engine, is used for storing insect information data on mobile devices. This mode will be very useful in cases where farmers' working environments do not have internet, such as in fields far from urban areas, where internet, 4G, and 5G coverage are not yet available. However, in this mode, some application functions will be restricted.

B. YOLOv5

YOLOv5 [18] is a single-stage object detection system. In one-stage object identification approaches, object detection is considered as a regression issue. It estimates the class probability and the coordinates of the bounding box that will contain the object in a single step on the input picture. The backbone, neck, and head are the three main components. YOLO is another name for the head layer. The model backbone's duty is to draw attention to the image's unique features. In YOLOv5, the model backbone is a CSPNet [19] structure. The CSPNet approach divides the feature map in the base layer into two parts; one part reaches the transition layer through the dense block, while the other half is directly integrated with the transition layer. This not only reduces model size but also increases inference speed [20].

In this study, the YOLOv5-S model is used to develop applications on mobile devices due to its small size and model parameters, GFLOPs calculation speed and high accuracy, and lack of requirement for high hardware configuration when compared to other YOLO models such as YOLOv4 [21] and YOLOX [22]. As shown in Table I, the YOLOv5-S model is relatively small in size, with a network parameter of 7.3M and a disk size of 14.2 MB, making it suitable for mobile devices with limited hardware configuration. With a GFLOPs index of 17.1, the calculating speed of the YOLOv5-S is adequate. Furthermore, when compared to other YOLO models, the indicators of mAPval@0.5 and the speed of the YOLOv5-S model in Table IV and Table V are quite excellent.

TABLE I. NETWORK PARAMETERS OF YOLO MODELS
(Columns: Model, Params (M), Size on disk (MB). Most of the table values are illegible in the source; from the text, YOLOv5-S has 7.3 M parameters, a 14.2 MB disk size, and a GFLOPs index of 17.1.)

C. Datasets

To create the insect pest database for machine learning models, 2,335 photos of 10 distinct pest kinds were collected from internet data sources, as shown in Fig. 2. The dataset was then split into the following proportions: 70% of the samples were utilized for training, 20% for model evaluation, and the remainder for testing. As a consequence, the resulting dataset has 1,634 images for training, 467 images for validation, and 234 images for testing, as shown in Table II. The LabelImg program [23] is utilized to manually label the insect objects and generate the .xml file containing object position information, which is then transformed into the .txt file that YOLOv5 can read. Because the IP102 dataset has some constraints, such as the same class containing numerous different insect stages such as larvae, caterpillars, and moths, achieving high identification efficiency is challenging. Therefore, the YOLOv5-S model was tested with 10 insect classes that were gathered by the agriculture expert volunteers.

Fig. 1. Overview of our Real-time Insect Image Recognition System by Mobile Devices.
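The LabelImg-to-YOLOv5 annotation conversion mentioned above follows a standard recipe: pixel-coordinate boxes from the Pascal VOC .xml files are normalized to (class, x_center, y_center, width, height) fractions of the image size. A minimal sketch, with the class-name-to-index mapping supplied by the caller:

```python
import xml.etree.ElementTree as ET

def voc_box_to_yolo(xmin, ymin, xmax, ymax, img_w, img_h):
    """Convert a LabelImg/Pascal-VOC pixel box to the normalized
    (x_center, y_center, width, height) tuple YOLOv5 reads."""
    x_c = (xmin + xmax) / 2.0 / img_w
    y_c = (ymin + ymax) / 2.0 / img_h
    w = (xmax - xmin) / img_w
    h = (ymax - ymin) / img_h
    return x_c, y_c, w, h

def convert_annotation(xml_path, class_ids):
    """Parse one LabelImg .xml file and return the lines of the
    corresponding YOLO .txt file. `class_ids` maps class names
    (e.g. "Mantodea") to integer indices."""
    root = ET.parse(xml_path).getroot()
    img_w = int(root.find("size/width").text)
    img_h = int(root.find("size/height").text)
    lines = []
    for obj in root.iter("object"):
        cls = class_ids[obj.find("name").text]
        box = obj.find("bndbox")
        coords = [int(box.find(t).text) for t in ("xmin", "ymin", "xmax", "ymax")]
        x_c, y_c, w, h = voc_box_to_yolo(*coords, img_w, img_h)
        lines.append(f"{cls} {x_c:.6f} {y_c:.6f} {w:.6f} {h:.6f}")
    return lines

print(voc_box_to_yolo(100, 100, 300, 200, 640, 480))
```

One .txt file per image, one line per object, is the layout the YOLOv5 data loader expects.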

Fig. 2. Some Images of Insect Samples in the Insect10 Dataset.

TABLE II. THE NUMBER OF IMAGES IN THE INSECT10 DATASET WITH 10 INSECT SPECIES

No | Insect name               | Train | Validation | Test
1  | Acalymma vittatum         | 116   | 33         | 17
2  | Achatina fulica           | 258   | 74         | 37
3  | Alticini                  | 193   | 55         | 28
4  | Asparagus beetles         | 89    | 25         | 13
5  | Aulacophora similis       | 113   | 32         | 16
6  | Cerotoma trifurcata       | 86    | 25         | 12
7  | Dermaptera                | 111   | 32         | 16
8  | Leptinotarsa decemlineata | 234   | 67         | 33
9  | Mantodea                  | 185   | 53         | 26
10 | Squash bug                | 249   | 71         | 36
   | Total                     | 1634  | 467        | 234

In this paper, the new system was also evaluated on large-scale insect image datasets. However, collecting a large-scale insect pest image dataset is difficult due to the fact that, depending on the species and kind of insect pest, all insect pests go through several phases during their lifecycle. As a result, the insect pest pictures from the publicly available IP102 dataset [24] are used for evaluating the system. It comprises almost 75,000 photos from 102 agricultural insect pest categories. The IP102 collection includes 75,222 photos and 102 insect pest classifications, while the smallest category comprises just 71 samples. There are 18,983 annotated photos for the job of object detection. As in [24], the images with bounding box annotations were divided into training and testing sets of 15,178 and 3,798 images, respectively. Some sample images of the IP102 dataset are shown in Fig. 3.

IV. RESULTS AND DISCUSSION

A. Experimental Setup and Training

All YOLO model training experiments were carried out on Google Colab using a Tesla K80 24 GB GPU. Algorithms are written in the Python and Keras programming languages. To train the models, the experimental setup is as follows: a learning rate of 0.01, an image size of 640 pixels, a batch size of 16, and 150 epochs for YOLOv5 and YOLOX, and 2,000 epochs for YOLOv4. Stochastic Gradient Descent [25] is used as the optimization algorithm.
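The training setup above could be expressed for the YOLOv5 repository roughly as follows; the file name, paths, and directory layout are assumptions, while the image size, batch size, epoch count, and class names come from the paper:

```yaml
# insect10.yaml — hypothetical dataset config for the YOLOv5 repo.
# Train with something like:
#   python train.py --img 640 --batch 16 --epochs 150 \
#       --data insect10.yaml --weights yolov5s.pt
path: ../datasets/insect10   # assumed location
train: images/train          # 1,634 images
val: images/val              # 467 images
test: images/test            # 234 images
nc: 10
names: [Acalymma vittatum, Achatina fulica, Alticini, Asparagus beetles,
        Aulacophora similis, Cerotoma trifurcata, Dermaptera,
        Leptinotarsa decemlineata, Mantodea, Squash bug]
```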
Devices with low configuration are utilized to conduct tests on mobile devices, as indicated in Table III.

TABLE III. SMARTPHONE DEVICE CONFIGURATION AND APPLICATION DEVELOPMENT ENVIRONMENT

Smartphone hardware configuration: The Samsung Galaxy A30 is powered by a Samsung Exynos 7 Octa 7904 processor with 8 cores. The powerful processor and 3000.0 MB of RAM give incredible performance, ensuring trouble-free operation of even the most complex program or game. The Samsung Galaxy A30 uses a microSDXC memory card. The phone carries a 15.93-megapixel rear camera sensor at the back of the device. The front camera has 15.93 megapixels. It gives us very high quality photos and videos with a great camera interface. The device has a 6.4-inch SUPER AMOLED display. It gives a decent display quality and a great gradation between warm and cold colors. The OS is Android 10.
Programming language to build applications: Java. Development environment: Android Studio.
Lighting: Normal luster intensity.

B. Evaluation Metrics

Mean Average Precision (mAP) is a popular metric for assessing the performance of object detection systems. The mAP computes a score by comparing the ground-truth bounding box to the detected box. The higher the score, the more precise the model's detections are. The mAP formula is based on the following sub-metrics: Confusion Matrix, Intersection over Union (IoU), Recall, and Precision. To create a confusion matrix, the experiments present four attributes: True Positives (TP): the model predicted a label and matched it correctly as per ground truth. True Negatives (TN): the model does not predict the label and it is not a part of the ground truth. False Positives (FP): the model predicted a label, but it is not a part of the ground truth. False Negatives (FN): the model does not predict a label, but it is part of the ground truth. In Equation (1), IoU denotes the overlap of anticipated bounding box coordinates with ground truth box coordinates.
It explains how an object identification algorithm creates prediction scores. The definition of IoU is described in Fig. 4. Higher IoU implies that the anticipated bounding box coordinates are similar to the ground truth box coordinates.

Fig. 3. Some Images of Insect Samples in the IP102 Dataset.

IoU = area of overlap / area of union  (1)
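Equation (1) is straightforward to implement; a minimal sketch for axis-aligned boxes given as (xmin, ymin, xmax, ymax):

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes
    (xmin, ymin, xmax, ymax), per Equation (1)."""
    # Corners of the intersection rectangle.
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    # Clamp to zero when the boxes do not overlap.
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Overlap area 25, union 100 + 100 - 25 = 175.
print(iou((0, 0, 10, 10), (5, 5, 15, 15)))  # ≈ 0.143
```

A detection is typically counted as a TP when its IoU with a ground-truth box exceeds the chosen threshold (0.5 for mAP@0.5).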

Fig. 4. IoU Definition.

In Equation (2), Precision refers to how successfully you can identify true positives (TP) from all positive predictions. In Equation (3), Recall measures how well you can find true positives (TP) out of all predictions (TP + FN).

Precision = TP / (TP + FP)  (2)

Recall = TP / (TP + FN)  (3)

In Equation (4), Average Precision is calculated as the weighted mean of precision at each threshold; the weight is the increase in recall from the prior threshold. In Equation (5), Mean Average Precision is the average of the AP of each class. However, the interpretation of AP and mAP varies in different contexts. On the validation datasets, mAPval@0.5 means the average mAP with an IoU threshold of 0.5. mAPval@0.5:0.95 means the average mAP over different IoU thresholds, from 0.5 to 0.95 with a step of 0.05.

AP = Σ_{k=0}^{n−1} [Recall(k) − Recall(k+1)] × Precision(k), where n = the number of thresholds  (4)

mAP = (1/n) Σ_{k=1}^{n} AP_k, where AP_k = the AP of class k and n = the number of classes  (5)

C. Experimental Results and Discussion

The experiment was conducted to analyze the backbone networks and the mAPval@0.5 and mAPval@0.5:0.95 metrics obtained from training. Table IV and Fig. 5 show the results of the different model variations on the Insect10 dataset. On the Insect10 dataset, the numerical results in Fig. 5 demonstrate that the new mobile application has a relatively high success rate in precision, recall, and mAP for pest object recognition. For instance, the detection performance of the Acalymma insect has the lowest mAP@IoU:0.5 identification accuracy of 0.45, while the detection performance of Leptinotarsa has the highest at 0.979. Our application is based on the YOLOv5-S model, which was trained on the Insect10 dataset with 10 different insect species. The actual results show that, when compared to other object detection methods, YOLO has a faster recognition speed and can almost identify objects in a real-time manner. Fig. 7 shows some examples of successful insect recognition on mobile devices using the Insect10 dataset.

Our approach has also been evaluated on the large-scale dataset IP102 [24] to see how well it scales on such datasets. As shown in Table V and Fig. 6, our system has achieved a promising performance of mAPval@0.5 accuracy of 42.9% with the YOLOv5-S model. This result shows that the new approach outperforms several previous approaches that were reported in [24]. However, insect object detection was still more challenging using the IP102 dataset. The reason is that the insect pests in the image are difficult to detect because their color appearance and the image backgrounds are very similar. In addition, the morphology of an insect pest, such as a moth, can vary substantially as it develops. Fig. 8 depicts some images of successful insect recognition using the IP102 dataset on a mobile device. This indicates that our approach offers several benefits over existing methods, including the ability to handle massive datasets with excellent accuracy. Moreover, this new system may also be implemented on low-cost mobile devices with minimal hardware configuration. In addition, as illustrated in Fig. 9, the usage of matching pesticides is integrated with the pest categorization findings to advise professionals and farmers. In the near future, this system will be implemented on new devices like the NVIDIA Jetson Nano Developer Kit [26], which have a higher hardware configuration, a lower cost, a smaller footprint, and a better level of durability.

TABLE IV. SIMULATION RESULTS OF YOLOV4, YOLOV5, AND YOLOX MODELS ON THE INSECT10 DATASET

Model    | Backbone        | mAPval@0.5 (%) | mAPval@0.5:0.95 (%)
YOLOv4   | (illegible)     | (illegible)    | 8.3
YOLOv5-S | Darknet-53      | 70.5           | 35.9
YOLOv5-M | Modified CSP v5 | 76.6           | 42.7
YOLOv5-L | Modified CSP v5 | 78.9           | 46.8
YOLOv5-X | Modified CSP v5 | 73.0           | 40.9
YOLOX-S  | Darknet-53      | 84.8           | 58.5
YOLOX-M  | Modified CSP v5 | 82.3           | 61.9
YOLOX-L  | Modified CSP v5 | 84.0           | 65.0
YOLOX-X  | Modified CSP v5 | 83.0           | 64.0

Fig. 5. Precision and Recall of Insect Recognition Results on the Insect10 Dataset using the YOLOv5-S Model.
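The AP and mAP figures reported in Tables IV and V follow Equations (4) and (5). A toy sketch of both computations, assuming the threshold sweep that produces the precision/recall pairs has already been done:

```python
def average_precision(recalls, precisions):
    """AP as the weighted sum in Equation (4): precision at each
    threshold index k, weighted by the recall change to the next
    index. `recalls` is assumed sorted in decreasing order, with
    an implicit final recall of 0."""
    ap = 0.0
    for k in range(len(recalls)):
        next_recall = recalls[k + 1] if k + 1 < len(recalls) else 0.0
        ap += (recalls[k] - next_recall) * precisions[k]
    return ap

def mean_average_precision(ap_per_class):
    """mAP as in Equation (5): the mean of per-class APs."""
    return sum(ap_per_class) / len(ap_per_class)

# Toy example: three score thresholds for one class.
ap = average_precision([0.9, 0.6, 0.3], [0.5, 0.7, 1.0])
print(round(ap, 3))
print(mean_average_precision([ap, 0.8]))
```

For mAP@0.5:0.95 the same computation is simply repeated at each IoU threshold from 0.5 to 0.95 in steps of 0.05 and the results averaged.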

TABLE V. SIMULATION RESULTS OF YOLOV4, YOLOV5, AND YOLOX MODELS ON THE IP102 DATASET

Model    | Backbone        | mAPval@0.5 (%) | mAPval@0.5:0.95 (%)
YOLOv4   | (illegible)     | (illegible)    | 19.0
YOLOv5-S | Darknet-53      | 42.9           | 24.0
YOLOv5-M | Modified CSP v5 | 47.4           | 27.9
YOLOv5-L | Modified CSP v5 | 50.1           | 29.9
YOLOv5-X | Modified CSP v5 | 54.0           | 32.5
YOLOX-S  | Darknet-53      | 52.3           | 34.1
YOLOX-M  | Modified CSP v5 | 54.2           | 35.1
YOLOX-L  | Modified CSP v5 | 53.9           | 34.7
YOLOX-X  | Modified CSP v5 | 54.1           | 34.9

Fig. 6. Precision and Recall of Insect Recognition Results on the IP102 Dataset using the YOLOv5-S Model.

Fig. 7. Some Images were Successfully Detected on Mobile Devices using the Insect10 Dataset.

Fig. 8. Some Images were Successfully Detected on Mobile Devices using the IP102 Dataset.

Fig. 9. The User Interface Screen shows the Successful Insect Recognition and Detailed Insect Information on a Mobile Device.

Fig. 10. The Insect Distribution Map was constructed based on GPS Location Information from the User's Insect Photos.

The information on insect GPS location and density will be extremely useful for several Integrated Pest Management systems. Therefore, our systems are designed to allow users to automatically record this information. Then, a real-time insect distribution density map is created using this data, as illustrated in Fig. 10. This map will assist expert users in tracking and forecasting the density and evolution of insect infections over large areas. At the same time, it is possible to evaluate the potential effects of insect pests on agriculture and ecosystem production.

V. CONCLUSION AND FUTURE RESEARCH WORK

This paper presents an efficient system for real-time mobile smart device-based insect detection. Our system was developed based on the YOLOv5-S model because of its lightweight convolutional neural network and is thus suitable for mobile devices with limited hardware configuration. Moreover, insect pest detection and classification may be incorporated into hardware that farmers can utilize across a wide range of situations to safeguard their farms from pests.
Therefore, our method has numerous advantages in terms of real-time insect identification, low cost, simple implementation, and practical deployment. The numerical results showed that the new system achieved 70.5% classification accuracy with mAP@0.5 on the Insect10 dataset and 42.9% accuracy on the large dataset IP102. This is the best insect pest detection result with YOLOv5-S ever reported on the largest insect dataset, IP102. However, these mAP accuracy results are still low when compared to the accuracy required for actual insect detection in agricultural production. Consequently, the next task will be to investigate more efficient recognition models in order to improve the accuracy and the number of insect classes recognized. Simultaneously, this work will be continued on better mobile devices, such as the NVIDIA Jetson Nano Developer Kit, which has a central processing unit, a graphical processing unit, and a web camera at a low cost, allowing larger convolutional neural network models to be installed.

ACKNOWLEDGMENT

This study was funded by the National Geographic Society Exploration Grants (NGS-KOR-59552T-19), Microsoft AI for Earth, and the support of agriculture experts from An Giang University and Vietnam National University in Ho Chi Minh City, Vietnam.

REFERENCES

[1] S. Skendžić, M. Zovko, I. P. Živković, V. Lešić, and D. Lemić, "The impact of climate change on agricultural insect pests," vol. 12, no. 5, 2021.
[2] "New standards to curb the global spread of plant pests and diseases," Food and Agriculture Organization of the United Nations (FAO), ory/en/item/1187738/icode/. [Last Access: 1-07-2020].
[3] K. Wang, Z. Shuifa, Z. Wang, Z. Liu, and F. Yang, "Mobile smart device-based vegetable disease and insect pest recognition," doi: 10.1080/10798587.2013.823783.
[4] H. Nasir, A. N. Aris, A. Lajis, K. Kadir, and S. I. Safie, "Development of Android Application for Pest Infestation Early Warning System," in 2018 IEEE 5th International Conference on Smart Instrumentation, Measurement and Application (ICSIMA), 2018, pp. 1–5, doi: 10.1109/ICSIMA.2018.8688774.
[5] C. Zhu, J. Wang, H. Liu, and H. Mi, "Insect Identification and Counting in Stored Grain: Image Processing Approach and Application Embedded in Smartphones," Mob. Inf. Syst., vol. 2018, 2018, doi: 10.1155/2018/5491706.
[6] P. Chudzik et al., "Mobile Real-Time Grasshopper Detection and Data Aggregation Framework," Sci. Rep., vol. 10, no. 1, p. 1150, 2020, doi: 10.1038/s41598-020-57674-8.
[7] J. W. Chen, W. J. Lin, H. J. Cheng, C. L. Hung, C. Y. Lin, and S. P. Chen, "A smartphone-based application for scale pest detection using multiple-object detection methods," Electron., vol. 10, no. 4, pp. 1–14, 2021, doi: 10.3390/electronics10040372.
[8] S. A. Lakmal Perera, "Pest Detecting Mobile Information System," 2021.
[9] M. E. Karar, F. Alsunaydi, S. Albusaymi, and S. Alotaibi, "A new mobile application of agricultural pests recognition using deep learning in cloud computing system," Alexandria Eng. J., vol. 60, no. 5, pp. 4423–4432, 2021, doi: 10.1016/j.aej.2021.03.
