Classification and identification of crop pests using improved Mask-RCNN algorithm

    • Abstract: Intelligent pest-monitoring lamps play an important role in the timely detection of insect pests and outbreaks in agricultural production, and accurate pest classification and identification is the key to providing reliable data support for pest monitoring and forecasting. This study improves the core recognition algorithm required by such lamps. To address the multiple scales of the classification targets, interference from many similar non-target insects, and frequent target adhesion, an intelligent pest-image recognition model based on an improved Mask-RCNN (mask region-based convolutional neural network) is proposed. The model uses the DeAnchor algorithm to improve the anchor-guidance mechanism of Mask-RCNN, and trains the classifier with NDCC (novelty detection consistent classifiers) for joint classification and detection, reducing the misidentification of non-target insects. The improved model achieves a recognition accuracy of up to 96.1% on images without miscellaneous insects across different insect densities, and 90.6% at the highest density; on images containing only non-targets, the false-detection rate falls to 9%, and on images where non-targets and targets coexist at 40 insects per image it falls to 15%. Experiments show that, building on existing classification models, the proposed model strengthens detection in dense regions, alleviates non-target misidentification, and achieves higher pest classification and identification accuracy in the actual detection environment, providing a data reference for pest control.
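The base detector can be assembled in a few lines. The sketch below is a minimal, assumed setup using torchvision's Mask R-CNN with a ResNet50-FPN backbone (the paper names no framework); num_classes=6, i.e. five target pest classes plus background, is likewise an assumption based on the five target-pest datasets described in the abstract below.

```python
# Minimal sketch of the base detector: Mask R-CNN with a ResNet50-FPN
# backbone. torchvision is an assumption (the paper names no framework),
# as is num_classes=6 (five target pest classes + background).
import torch
from torchvision.models.detection import maskrcnn_resnet50_fpn

model = maskrcnn_resnet50_fpn(weights=None, num_classes=6)
model.eval()

# A dummy 3-channel image; the model returns boxes, labels, scores, masks.
with torch.no_grad():
    pred = model([torch.rand(3, 512, 512)])[0]
print(pred.keys())  # dict_keys(['boxes', 'labels', 'scores', 'masks'])
```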

       

      Abstract: Intelligent pest-monitoring lamps can monitor pests in smart agricultural production in a timely manner. However, some of these lamps cannot provide reliable data support for pest monitoring and forecasting, mainly because their detection and identification accuracy is low. This study aimed to improve the classification and recognition algorithm at the core of such lamps using deep learning. The Mask RCNN instance segmentation model was selected as the basic detection framework, and a network model for recognizing multiple similar targets was constructed on a ResNet50-FPN convolutional neural network (CNN) backbone. Following Faster RCNN, a mask branch was added for instance segmentation: anchor boxes were used for classification and box regression, and pixel-level segmentation and classification were added for more accurate classification, yielding an object-detection framework well suited to fine-grained classification. Nevertheless, the model faced three main difficulties: the classification targets were of nonuniform size, many similar non-target insects caused interference, and high insect density easily produced target adhesion. Mask RCNN normally relies on a dense anchoring scheme that samples uniformly at predefined scales and aspect ratios, and refining the predefined anchors alone cannot resolve the adhesion of targets at various scales. Moreover, although different subclasses of the same category could be distinguished, non-targets could not be distinguished from targets. Therefore, datasets of five target pests were established, and a small proportion of miscellaneous-insect data was added to reduce the misidentification of miscellaneous insects. For adhesion, soft NMS replaced NMS, and DeAnchor was adopted to learn each target area, so that during prediction and recognition the detection box fits the target more closely and is less likely to exceed the target area or include the surroundings. After these three steps, adhesion handling and the recognition rate were greatly improved. For non-target misidentification, NDCC (novelty detection consistent classifiers) was added to exclude unknown miscellaneous insects by means of a novelty score, since the many non-target insects that resemble targets otherwise lead to a high misidentification rate. The optimal model achieved its highest recognition accuracy of 96.1% on multi-insect images with 10 insects per image, 93.5% with 20 insects per image, and above 90.6% with 50 insects per image. Adhesion was essentially resolved for the larger adhered targets in an image, and non-target recognition and misrecognition improved on images containing non-target insects: the false-detection rate fell to a low of 9% on images containing only non-target insects at 20 insects per image, and to below 15% on mixed images of targets and non-targets at 40 insects per image. Dense-region detection and novelty detection were thus added to an existing classification model, enhancing detection in dense areas and reducing non-target misidentification, which gives higher pest classification and recognition accuracy in the actual detection environment.
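For the adhesion problem, the abstract states that soft NMS replaced hard NMS, so heavily overlapping detections of touching insects are down-weighted rather than discarded. A minimal NumPy sketch of Gaussian soft-NMS follows; the Gaussian decay variant and the sigma/threshold values are assumptions, since the paper does not state which variant or parameters were used.

```python
import numpy as np

def box_area(b):
    return (b[:, 2] - b[:, 0]) * (b[:, 3] - b[:, 1])

def soft_nms(boxes, scores, sigma=0.5, score_thresh=0.001):
    """boxes: (N, 4) as [x1, y1, x2, y2]; scores: (N,). Returns kept indices."""
    scores = scores.copy()
    idxs = np.arange(len(scores))
    keep = []
    while len(idxs) > 0:
        top = np.argmax(scores[idxs])
        best = idxs[top]
        keep.append(best)
        idxs = np.delete(idxs, top)
        if len(idxs) == 0:
            break
        # IoU between the selected box and all remaining boxes
        x1 = np.maximum(boxes[best, 0], boxes[idxs, 0])
        y1 = np.maximum(boxes[best, 1], boxes[idxs, 1])
        x2 = np.minimum(boxes[best, 2], boxes[idxs, 2])
        y2 = np.minimum(boxes[best, 3], boxes[idxs, 3])
        inter = np.clip(x2 - x1, 0, None) * np.clip(y2 - y1, 0, None)
        iou = inter / (box_area(boxes[best:best + 1]) + box_area(boxes[idxs]) - inter)
        # Gaussian decay instead of hard suppression: heavily overlapping
        # neighbours keep a reduced score, so touching insects survive.
        scores[idxs] = scores[idxs] * np.exp(-(iou ** 2) / sigma)
        idxs = idxs[scores[idxs] > score_thresh]
    return keep

boxes = np.array([[0, 0, 10, 10], [1, 1, 11, 11], [20, 20, 30, 30]], dtype=float)
scores = np.array([0.9, 0.8, 0.7])
print(soft_nms(boxes, scores))  # all three kept; the overlapping box is only down-weighted
```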
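The DeAnchor change replaces the dense, predefined anchor grid with learned anchor guidance. Its exact formulation is not given in the abstract, so the sketch below shows a generic guided-anchoring head of the same spirit: one branch predicts where a target is likely to be anchored, another predicts the anchor shape at that location, letting boxes fit insects of various scales instead of relying on fixed scales and aspect ratios.

```python
# Generic guided-anchoring head, assumed here as an illustration of the
# DeAnchor idea; the paper's exact DeAnchor formulation is not reproduced.
import torch
import torch.nn as nn

class AnchorGuidanceHead(nn.Module):
    def __init__(self, in_channels=256):
        super().__init__()
        # 1-channel objectness map: probability that a location anchors a target
        self.loc = nn.Conv2d(in_channels, 1, kernel_size=1)
        # 2-channel shape map: predicted (dw, dh) for the anchor at that location
        self.shape = nn.Conv2d(in_channels, 2, kernel_size=1)

    def forward(self, feat):
        loc_prob = torch.sigmoid(self.loc(feat))  # (B, 1, H, W)
        anchor_shape = self.shape(feat)           # (B, 2, H, W)
        return loc_prob, anchor_shape

feat = torch.rand(1, 256, 64, 64)  # one FPN feature level
loc_prob, anchor_shape = AnchorGuidanceHead()(feat)
# Only locations with high loc_prob would generate proposals, so each anchor
# can fit its insect instead of following fixed scales and aspect ratios.
```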
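For non-target rejection, NDCC assigns each detection a novelty score and excludes unknown miscellaneous insects that score too high. A simplified, illustrative sketch is shown below: the score is taken as the distance to the nearest target-class mean in feature space, and the class means, feature dimension, and threshold are all assumptions, not the paper's trained values.

```python
# Simplified NDCC-style rejection: detections far from every known target
# class in feature space are treated as non-target insects and discarded.
import numpy as np

def novelty_score(feature, class_means):
    """Distance to the closest known-class centroid in feature space."""
    dists = np.linalg.norm(class_means - feature, axis=1)
    return dists.min()

rng = np.random.default_rng(0)
class_means = rng.normal(size=(5, 128))  # 5 target pest classes (illustrative)
det_feature = rng.normal(size=128)       # feature vector of one detection

THRESHOLD = 12.0  # illustrative; would be tuned on validation data in practice
if novelty_score(det_feature, class_means) > THRESHOLD:
    print("rejected as non-target insect")
else:
    print("kept and classified among the 5 target classes")
```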

       
