Segmentation and Detection of Crop Pests using Novel U‐Net with Hybrid Deep Learning Mechanism
Nagaveni Biradar, Girisha Hosalli - Insect Science
Abstract
In India, agriculture is the backbone of the economy, and demand for agricultural products continues to grow. However, agricultural production is affected by the presence of pests in crops. Several methods have been developed for crop pest detection, but they have failed to achieve satisfactory results. The proposed study therefore uses a new hybrid deep learning mechanism for segmenting and detecting pests in crops. The proposed pipeline comprises image collection, pre-processing, segmentation, and detection. Pre-processing involves three steps: image rescaling, equalized joint histogram based contrast enhancement (Eq-JH-CE), and bendlet transform based de-noising (BT-D). Next, the pre-processed images are segmented using the DenseNet-77 UNet model, in which the complexity of the conventional UNet model is mitigated by hybridizing it with DenseNet-77. Once segmentation is complete, crop pests are detected and classified by a novel Convolutional Slice-Attention based Gated Recurrent Unit (CS-AGRU) model, a combination of a Convolutional Neural Network (CNN) and a Gated Recurrent Unit (GRU). These two models are hybridized for their efficiency, in order to achieve better accuracy. In addition, a slice attention mechanism is applied to the model to fetch the most relevant feature information and thereby improve computational efficiency. The approach is implemented in Python. Compared with existing techniques, it achieves an accuracy of 99.52%, IoU of 99.1%, precision of 98.88%, recall of 99.53%, F1-score of 99.35%, and FNR of 0.011.
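The abstract does not detail the Eq-JH-CE contrast-enhancement step. As a rough illustration of the general idea (remapping pixel intensities so the histogram occupies the full dynamic range), the sketch below implements plain global histogram equalization in NumPy; it is a simplified stand-in, not the authors' Eq-JH-CE method, and the function name and parameters are our own.

```python
import numpy as np

def histogram_equalize(img, levels=256):
    """Global histogram equalization: a simplified stand-in for the
    paper's Eq-JH-CE contrast-enhancement step (not the actual method).
    img: 2-D uint8 array; returns a contrast-stretched uint8 array."""
    # Intensity histogram and its cumulative distribution.
    hist, _ = np.histogram(img.ravel(), bins=levels, range=(0, levels))
    cdf = hist.cumsum()
    # Ignore empty bins so unused intensities do not skew the mapping.
    cdf_m = np.ma.masked_equal(cdf, 0)
    cdf_m = (cdf_m - cdf_m.min()) * (levels - 1) / (cdf_m.max() - cdf_m.min())
    # Look-up table mapping old intensities to equalized ones.
    lut = np.ma.filled(cdf_m, 0).astype(np.uint8)
    return lut[img]

# Low-contrast toy "image": only two nearby grey levels.
img = np.array([[100, 100], [101, 101]], dtype=np.uint8)
out = histogram_equalize(img)  # spreads the two levels to 0 and 255
```

Equalization alone is global; the joint-histogram variant named in the abstract presumably adapts this mapping using additional spatial context.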
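The internal structure of CS-AGRU is not specified in the abstract. To make the two named building blocks concrete, the sketch below shows a generic GRU update step and a generic attention-weighted pooling over feature "slices" in NumPy. All names, dimensions, and weight initializations here are illustrative assumptions, not the authors' architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def slice_attention(feats, w):
    """Score each feature slice (row), softmax the scores, and return
    the attention-weighted sum of slices. Stand-in for slice attention."""
    scores = softmax(feats @ w)   # one scalar weight per slice
    return scores @ feats          # pooled feature vector

def gru_step(x, h, p):
    """Standard GRU update: update gate z, reset gate r, candidate h~."""
    z = sigmoid(p["Wz"] @ x + p["Uz"] @ h)
    r = sigmoid(p["Wr"] @ x + p["Ur"] @ h)
    h_tilde = np.tanh(p["Wh"] @ x + p["Uh"] @ (r * h))
    return (1 - z) * h + z * h_tilde

# Toy dimensions (illustrative only): 5 slices of width 8, hidden size 4.
n_slices, feat_dim, hid = 5, 8, 4
p = {k: rng.normal(scale=0.1,
                   size=(hid, feat_dim if k[0] == "W" else hid))
     for k in ["Wz", "Uz", "Wr", "Ur", "Wh", "Uh"]}

feats = rng.normal(size=(n_slices, feat_dim))  # stand-in for CNN features
w = rng.normal(size=feat_dim)                  # attention scoring vector
x = slice_attention(feats, w)                  # attended feature vector
h = gru_step(x, np.zeros(hid), p)              # one recurrent update
```

In the paper's setting, the CNN would produce the feature slices from segmented pest regions, and the GRU would aggregate them before a final classification layer.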
This article is protected by copyright. All rights reserved.