Abstract:
As ecological conditions have improved, the growing pheasant population has posed a threat to crops. However, conventional bird-repellent methods suffer from inherent deficiencies in both efficiency and safety. An efficient pheasant-monitoring method combined with artificial intelligence is therefore needed to provide early warning and expulsion of pheasants. Pheasants are mostly active in the early morning and at dusk in complex environments, and their protective coloration and hiding habits make monitoring considerably more challenging. In this paper, a novel pheasant recognition method is proposed for deployment on an embedded computing platform, based on an enhanced Tiny-YOLOV3 target detection network and tailored to pheasant behavior and specific living conditions. Because the method is deployed on a mobile platform in the field, a lightweight network is required to ensure both accuracy and real-time monitoring. A real-time monitoring network, ET-YOLO, is built on the basic structure of the Tiny-YOLOV3 lightweight target detection network to detect the emergence of pheasants in complex field environments. The feature-extraction network of Tiny-YOLOV3 is deepened and the number of detection scales is increased to improve the detection accuracy of the original network, and a CenterNet structure is adopted in the detection layer to further enhance detection accuracy and speed. The pheasant-monitoring dataset was built from images collected in the field and expanded by augmentation, yielding 6000 high-resolution images of pheasants at different distances and angles and in various environments. The models were evaluated mainly in terms of accuracy, real-time performance, and model size; specifically, the average detection accuracy, average detection speed, and detection model size were used for evaluation.
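The abstract does not specify which augmentation transforms were applied to the field images; the following is a minimal sketch of how such a dataset expansion might look, assuming common transforms (random horizontal flip and brightness jitter) and using synthetic arrays in place of the real pheasant images:

```python
import numpy as np

def augment(image, rng):
    """Apply simple augmentations often used when building a detection
    dataset: a random horizontal flip and brightness jitter.
    `image` is an HxWx3 uint8 array."""
    out = image.copy()
    if rng.random() < 0.5:
        out = out[:, ::-1, :]  # horizontal flip
    gain = rng.uniform(0.8, 1.2)  # brightness jitter
    out = np.clip(out.astype(np.float32) * gain, 0, 255).astype(np.uint8)
    return out

# Expand a small set of collected images into a larger training set;
# the counts here are illustrative, not the paper's actual pipeline.
rng = np.random.default_rng(0)
base = [rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
        for _ in range(3)]
augmented = [augment(img, rng) for img in base for _ in range(4)]
print(len(augmented))  # 3 originals x 4 variants each -> 12
```

In practice, detection augmentation must also transform the bounding-box labels consistently with the image (e.g., mirroring box coordinates on a flip), which this sketch omits.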
The experimental results showed that the average detection accuracy of ET-YOLO in the complex field environment was 86.5% and the average detection speed was 62 frames/s, 15% higher than that of the original Tiny-YOLOV3. The average detection accuracy was higher than that of YOLOV3, Faster-RCNN and SSD_MobileNetV2 by 1.5%, 1.1% and 18%, respectively. The average detection speed was 38 frames/s, 47 frames/s and 1 frame/s higher than that of YOLOV3, Faster-RCNN and SSD_MobileNetV2, respectively, with a detection model size of 56 MB. In terms of recognition accuracy, real-time performance, and model size, the proposed method is suitable for deployment on embedded computing platforms carried by agricultural robots and intelligent machines, particularly for recognizing pheasants in complex environments.