Abstract:
In large-scale egg-laying hen farms, a significant number of hens perish daily. Under harsh house conditions, manual inspection for dead chickens has numerous limitations, so replacing manual inspection with inspection robots has become a new trend. To enable laying-hen house health inspection robots to swiftly and precisely detect dead chickens in complex environments, this study introduced a dead chicken detection method for caged laying hens based on an enhanced YOLOv8n model. The research subjects were four-tier caged laying hens on a large-scale laying hen farm in Zhejiang Province. To improve recognition and avoid misjudging lying hens as dead, a "lie" label was added during the annotation process, so the dataset contained a total of three classes: normal, lying, and dead, with the label names "normal", "lie", and "dead", respectively. Of the 4,000 images in the constructed dataset, 2,000 contained dead chickens and 1,000 contained lying hens. The dataset was randomly divided into a training set, a validation set, and a test set at a ratio of 6:2:2. Based on YOLOv8n, the method first employed Cross Stage Partial HetConv (CSPHet) to replace the C2f structure in the network, thereby concentrating the gradient changes of feature maps across levels; in addition, smaller 1×1 convolution kernels were substituted for some 3×3 convolution kernels, effectively reducing the model's parameters and computational burden while maintaining detection precision. Second, because of the severe occlusion in high-density caged chicken farming, occlusion among laying hens causes information loss; a Spatially Enhanced Attention Module (SEAM) was therefore integrated into the Neck layer to compensate for the response loss in occluded regions and improve the detection precision of occluded targets.
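The 6:2:2 random split described above can be sketched in a few lines of Python (a minimal illustration only; the study's actual file handling and naming are not specified and are assumptions here):

```python
import random

def split_dataset(items, ratios=(0.6, 0.2, 0.2), seed=42):
    # Shuffle a copy so the original list of image IDs is untouched.
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    n = len(shuffled)
    n_train = int(n * ratios[0])
    n_val = int(n * ratios[1])
    train = shuffled[:n_train]
    val = shuffled[n_train:n_train + n_val]
    test = shuffled[n_train + n_val:]          # remainder goes to test
    return train, val, test

# 4,000 image IDs split 6:2:2 -> 2400 / 800 / 800 images
train, val, test = split_dataset(list(range(4000)))
print(len(train), len(val), len(test))  # 2400 800 800
```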
Lastly, Dynamic Upsampling (DySample) was introduced to raise the resolution of upsampled feature maps and emphasize the effective features of the corresponding layer regions. Experiments comparing the effects of different input sizes on training showed that with an input image size of 800 × 800 pixels, precision, recall, and mAP were highest, at 96.5%, 98.0%, and 94.4%, respectively. With an input size of 640 × 640, recall remained highest at 98.0%, while precision and mAP were 1.3 and 0.1 percentage points lower than at 800 × 800, but GPU usage was reduced by 24.3%. Considering that the robot's edge computing device must process data from four cameras simultaneously, an input image size of 640 × 640 is more appropriate. The original YOLOv8n's bounding box precision (Box(p)) was 92.6%; after incorporating the SEAM attention mechanism it improved to 93.2%, a 0.6 percentage point increase, confirming the ameliorative effect of SEAM on occlusion. The enhanced YOLOv8n achieved an mAP of 95.8% for dead chicken detection, with a precision of 96.6%, 2.46 MB of parameters, and an inference time of 25.8 milliseconds per image. Compared to the original YOLOv8n, mAP and precision increased by 1.5 and 1.4 percentage points, respectively, while the number of parameters and the inference time per image decreased by 18.3% and 15.7%, respectively. In addition, given the good real-time performance of the YOLO series, this study compared the improved YOLOv8n model with other YOLO series models, including YOLOv5n, YOLOv6n, and the original YOLOv8n; the improved model's mAP was higher by 2.5, 2.8, and 1.5 percentage points, respectively, and its parameter count and detection time were the best among them. To validate the stability, applicability, and real-time performance of the enhanced YOLOv8n model, it was successfully deployed on the inspection robot's edge device.
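The parameter saving behind substituting 1×1 kernels for some 3×3 kernels can be illustrated with a simple count (a sketch only; the channel split below follows the HetConv idea of mixing kernel sizes within a filter, and the exact ratio used in CSPHet is an assumption):

```python
def conv_params(c_in, c_out, k):
    # Standard convolution parameter count (bias terms ignored).
    return c_out * c_in * k * k

def hetconv_params(c_in, c_out, p):
    # HetConv-style filter: 1/p of the input channels keep 3x3 kernels,
    # the remaining channels use cheap 1x1 kernels.
    k3 = c_out * (c_in // p) * 9
    k1 = c_out * (c_in - c_in // p) * 1
    return k3 + k1

full = conv_params(64, 64, 3)    # 36864 parameters
het = hetconv_params(64, 64, 4)  # 9216 + 3072 = 12288 parameters
print(full, het, het / full)     # 36864 12288 0.333... -> 3x fewer
```

With p = 4, the mixed-kernel layer needs only a third of the parameters of a full 3×3 layer, which is the kind of reduction that lets the enhanced model stay small enough for an edge device.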
By configuring the Real-Time Streaming Protocol (RTSP), real-time video streams were acquired, enabling real-time detection and recognition of dead chickens among caged laying hens. This research provides technical support for robot-based detection of dead chickens in facility environments using edge devices, and offers an important reference for advancing the intelligent management of caged layer farming.
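A minimal sketch of the RTSP-based inference loop on the edge device might look like the following (the URL scheme, camera credentials, and the `model(frame)` call are illustrative assumptions, not the authors' exact deployment):

```python
def rtsp_url(host, channel, user="admin", password="secret"):
    # Build an RTSP stream address; the "/ch<N>" path is a hypothetical
    # placeholder -- real camera vendors use their own path schemes.
    return f"rtsp://{user}:{password}@{host}:554/ch{channel}"

def detect_stream(model, url):
    # OpenCV is imported lazily so the URL helper stays usable without it.
    import cv2
    cap = cv2.VideoCapture(url)
    try:
        while cap.isOpened():
            ok, frame = cap.read()
            if not ok:
                break  # stream dropped or ended
            # e.g. an Ultralytics YOLO forward pass on one 640x640 frame
            yield model(frame)
    finally:
        cap.release()

# Four cage-side cameras feeding one edge device, as in the study:
urls = [rtsp_url("192.168.1.10", ch) for ch in range(1, 5)]
print(urls[0])  # rtsp://admin:secret@192.168.1.10:554/ch1
```

In practice each of the four streams would run in its own thread or process so that a stalled camera does not block the others.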