Recognizing sow parturition using a lightweight model with edge computing
Graphical Abstract
Abstract
The reproductive performance of sows plays a critical role in animal breeding, particularly in the efficiency and effectiveness of selection. However, manual recording of piglet births and survival rates can no longer keep pace with large-scale production, and high precision is required to capture more nuanced data such as the interval between births. Advanced technologies are therefore expected to improve both the accuracy and efficiency of breeding programs. In this study, a lightweight network was developed to monitor sow parturition rapidly and accurately in real time, extracting essential birthing metrics such as the number of piglets born and the precise interval between consecutive births. First, the effect of the monitoring view on the accuracy of the improved model was evaluated by comparing single-column and double-column views. The single-column view yielded markedly higher accuracy in recognizing birthing events, supporting real-time decision-making with direct implications for breeding outcomes. To handle dynamic changes in sow posture and varying camera perspectives during monitoring, video augmentation techniques such as horizontal and vertical flipping were applied; to adapt to different lighting conditions and the inherent motion blur of active piglets during birth, color jittering and Gaussian blur were added. Together, these augmentations substantially improved the robustness of the model and yielded reliable performance under diverse operational conditions. A comparative analysis of classification networks showed that ResNet50 achieved the highest recognition accuracy, whereas MobileNetV3-S combined a compact model size with the fastest processing speed of 505.14 frames per second, indicating the best operational efficiency. MobileNetV3-S was therefore refined with masked generative distillation, a technique that effectively enhances the network's ability to capture and interpret essential birthing features. In the practical application, ResNet50 served as the teacher model and MobileNetV3-S as the student model; training with masked generative distillation was followed by dependency-graph pruning. Tests on a DELL OptiPlex microcomputer achieved a detection speed of 83.10 frames per second with a test accuracy of 91.48% in the single-column field of view: although accuracy decreased slightly by 0.98 percentage points, the detection speed increased by 67.13 frames per second. The improved model was then deployed at the edge for testing, where it measured farrowing intervals with a detection error of only 0.31 s and the duration of piglet birth events with an error of only 0.02 s. Such highly efficient and precise real-time monitoring supports the management of breeding activities in complex farm environments. In conclusion, the integration of these computational techniques demonstrates the transformative potential of automated monitoring of sow parturition. By combining image processing and machine learning to acquire real-time data, the approach offers a reference standard for accuracy and efficiency in livestock management, and the resulting insight into reproductive dynamics can contribute to sustainable, scientifically informed animal husbandry.
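The augmentation strategy described above (horizontal and vertical flipping, color jittering, and Gaussian blur) can be illustrated with a short sketch. The probabilities, jitter strengths, and blur kernel below are illustrative assumptions, not the values used in the study.

```python
# Illustrative augmentation pipeline for birthing-video frames.
# Parameter values are assumptions for demonstration only.
import torchvision.transforms as T

train_transform = T.Compose([
    T.RandomHorizontalFlip(p=0.5),           # mirrored camera placement
    T.RandomVerticalFlip(p=0.5),             # varying camera orientation
    T.ColorJitter(brightness=0.3,            # different lighting conditions
                  contrast=0.3,
                  saturation=0.2),
    T.GaussianBlur(kernel_size=5,            # motion blur of active piglets
                   sigma=(0.1, 2.0)),
    T.ToTensor(),
])
```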
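Masked generative distillation trains the student to reconstruct the teacher's feature map from a randomly masked version of its own features. The module below is a simplified sketch of that idea for a ResNet50 teacher and a MobileNetV3-Small student; the channel widths, masking ratio, generation block, and feature tap points are assumptions rather than the exact configuration used in the paper.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MGDLoss(nn.Module):
    """Simplified masked generative distillation loss (sketch).

    The student feature is randomly masked, passed through a small
    generation block, and trained to reconstruct the teacher feature.
    """
    def __init__(self, student_ch, teacher_ch, mask_ratio=0.5):
        super().__init__()
        self.mask_ratio = mask_ratio
        # 1x1 convolution aligns student channels to the teacher's
        self.align = nn.Conv2d(student_ch, teacher_ch, kernel_size=1)
        # generation block that reconstructs the masked feature
        self.generator = nn.Sequential(
            nn.Conv2d(teacher_ch, teacher_ch, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(teacher_ch, teacher_ch, kernel_size=3, padding=1),
        )

    def forward(self, feat_student, feat_teacher):
        x = self.align(feat_student)
        # random spatial mask: 1 keeps a location, 0 drops it
        n, _, h, w = x.shape
        mask = (torch.rand(n, 1, h, w, device=x.device) > self.mask_ratio).float()
        x = self.generator(x * mask)
        return F.mse_loss(x, feat_teacher)

# Usage with hypothetical last-stage feature maps (576 channels is a typical
# MobileNetV3-Small width, 2048 a typical ResNet50 width).
loss_fn = MGDLoss(student_ch=576, teacher_ch=2048)
f_s = torch.randn(4, 576, 7, 7)     # student feature
f_t = torch.randn(4, 2048, 7, 7)    # teacher feature
distill_loss = loss_fn(f_s, f_t.detach())
```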
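Dependency-graph pruning removes channels while keeping all coupled layers (convolutions, batch-norm layers, and downstream inputs) consistent. The snippet below is one possible realization using the open-source Torch-Pruning library; the API names follow that library and may differ between releases, the pruned layer and channel indices are arbitrary illustrations, and it uses ResNet18 for brevity whereas the paper prunes the distilled MobileNetV3-S student.

```python
import torch
import torch_pruning as tp
from torchvision.models import resnet18

model = resnet18()
example_inputs = torch.randn(1, 3, 224, 224)

# Build the dependency graph so coupled layers are pruned together.
DG = tp.DependencyGraph().build_dependency(model, example_inputs=example_inputs)

# Remove three output channels of the first convolution; the group
# collects every layer that must change with it.
group = DG.get_pruning_group(model.conv1, tp.prune_conv_out_channels, idxs=[2, 6, 9])
if DG.check_pruning_group(group):
    group.prune()
```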
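The reported throughput figures (505.14 frames per second for MobileNetV3-S and 83.10 frames per second after distillation and pruning on the DELL OptiPlex) correspond to a standard frames-per-second benchmark. A minimal timing sketch is shown below, assuming a torchvision MobileNetV3-Small classifier with two output classes (birthing versus non-birthing) and a 224 x 224 input; the paper's exact head and input resolution are not specified here.

```python
import time
import torch
from torchvision.models import mobilenet_v3_small

# Hypothetical two-class classifier; input size is an assumption.
model = mobilenet_v3_small(num_classes=2).eval()
dummy = torch.randn(1, 3, 224, 224)

with torch.no_grad():
    for _ in range(10):                 # warm-up iterations
        model(dummy)
    start = time.perf_counter()
    n_frames = 200
    for _ in range(n_frames):
        model(dummy)
    elapsed = time.perf_counter() - start

print(f"throughput: {n_frames / elapsed:.2f} frames per second")
```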