Abstract:
Monitoring animal growth and development is essential in livestock breeding, and the systematic evaluation of body measurements is a key indicator of growth and developmental stage. Such measurements are of great importance for overall breeding decisions. Traditionally, they have been obtained by manual contact, but manual measurement is cumbersome, time-consuming, and labor-intensive, and is therefore prone to subjective error, whereas accurate data are necessary for correct decision-making. Fortunately, machine vision has transformed the agricultural industry in recent years. Contactless body measurement based on machine vision is expected to replace manual contact measurement, preventing potential stress reactions in animals and reducing labor intensity in livestock breeding. This study reviews the research progress of non-contact livestock body measurement using machine vision. Four common large-bodied livestock species were selected: cattle, sheep, horses, and pigs. First, common approaches to livestock image acquisition were outlined, covering the types of imaging devices and their deployments, with all tasks aligned to body size measurement. Subsequently, applications of machine vision to contactless livestock body measurement over the past five years were reviewed, and the current research status of image segmentation in livestock body measurement was summarized. Current work concentrates mainly on measurement speed, accuracy, and the portability of equipment. Several challenges were identified, including the limited supply of public datasets and the deployment of deep learning algorithms in real-world environments. Generative models are therefore expected to augment livestock image datasets.
Deep learning can be further developed toward generalized measurement methods suitable for a wide range of livestock. The findings provide valuable insights and references for future research.