Lightweight individual identification of dairy cows by fusing YOLOv5s with a channel pruning algorithm

    • Abstract: Real-time, accurate identification of individual dairy cows is a prerequisite for building a complete technical architecture for precision dairy farming, and it is essential that the identification model remain lightweight while recognizing individual cows quickly and accurately. This study proposes a method for fast, accurate individual identification of dairy cows under low-computation and low-parameter-count constraints. YOLOv5s was adopted as the baseline model; the scale factors of the batch normalization (BN) layers were used to judge the importance of each channel, and unimportant channels were pruned to reduce network complexity. To compress the model more effectively, a sparsity loss term was added to the loss function to sparsify the model's channels. Test results show that the pruned model achieved a mean average precision (mAP) of 99.50% with 8.1 G floating-point operations (FLOPs), 1.630 M parameters, and a detection speed of 135.14 frames/s. Among the representative object detection models compared, the proposed method has the lowest model complexity. Moreover, it depends less on coat patterns and performs better under low-illumination conditions than the other models. Being fast, accurate, and robust, with low computational cost and a small parameter count, the method holds great potential for advancing precision livestock farming on dairy farms.
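The pruning idea the abstract describes (judging channel importance by the BN-layer scale factor and adding a sparsity loss term) can be sketched in plain Python. This is a minimal illustration under stated assumptions, not the paper's implementation: `sparsity_term` is the extra L1 penalty on the BN scale factors γ, and `select_channels` keeps only the channels whose |γ| survives a global percentile threshold; the function names, the λ value, and the toy γ values are all hypothetical.

```python
# Sketch (assumption: pure Python, no DL framework) of BN-scale-factor
# channel pruning as described in the abstract: the loss gains an L1
# sparsity term on the batch-norm scale factors gamma, and channels
# with small |gamma| are considered unimportant and pruned.

def sparsity_term(gammas, lam=1e-4):
    """Extra loss term lam * sum(|gamma|) that pushes BN scale
    factors toward zero, sparsifying the model's channels."""
    return lam * sum(abs(g) for g in gammas)

def select_channels(gammas, prune_ratio=0.5):
    """Return indices of channels kept after pruning.

    The threshold is the |gamma| value at the prune_ratio quantile
    across all channels, so roughly prune_ratio of the channels
    (those with the smallest scale factors) are removed.
    """
    ranked = sorted(abs(g) for g in gammas)
    threshold = ranked[int(prune_ratio * len(ranked))]
    return [i for i, g in enumerate(gammas) if abs(g) >= threshold]

# Toy example: 8 channels, half with near-zero scale factors
# after sparsity training.
gammas = [0.9, 0.01, 0.7, 0.002, 0.05, 1.2, 0.003, 0.4]
print(select_channels(gammas, prune_ratio=0.5))   # surviving channel indices
print(sparsity_term(gammas, lam=0.01))            # L1 penalty on gamma
```

In practice the surviving indices determine which convolution filters (and the corresponding BN parameters) are physically removed from the network, which is what reduces FLOPs and the parameter count.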

       

