Abstract:
Robot navigation in the narrow paths inside chicken coops requires accurate identification of the working channel line under low-light conditions. This study aimed to detect the centerline of the chicken coop path using 3D LiDAR. A 3D LiDAR was mounted on the robot body to collect path information within the operation channel. Several preprocessing techniques were applied, including pass-through filtering, ground point filtering, voxel filtering, statistical filtering, and point cloud projection. The point cloud data within the region of interest (ROI) of the 3D LiDAR were roughly classified on the XOY plane according to their vertical-axis coordinates. Initial center points for the left and right clusters were then selected from the two roughly classified point cloud sets. The distance between each point and the transverse axis was used as the clustering function of K-means clustering to separate the left and right point clouds. Longitude-latitude scanning and secondary edge extraction were then used to extract the edges of the two clustered point clouds, and RANSAC was applied to calculate the fitted line equations of the channel edges. The centerline of the operation channel was extracted from these two equations. An inspection robot developed for caged-chicken houses served as the experimental platform, with a VLP-16 LiDAR as the perception sensor; field verification was conducted in the D10 and D13 chicken houses of Deqingyuan Co., Ltd. (Beijing, China). The experimental results showed that the improved K-means clustering took an average of 6.98 ms, with a silhouette coefficient of 0.59, a Rand index of 1.00, a clustering success rate of 84.10%, and a clustering accuracy of 100%. Compared with traditional K-means clustering, the average time was reduced by 29.40 ms, while the silhouette coefficient, Rand index, and accuracy increased by 0.04, 0.63, and 82.41%, respectively; the success rate was slightly reduced, by 0.41%.
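The left/right separation described above can be sketched as a 1-D K-means over the point-to-transverse-axis distance. This is an illustrative sketch only: the function name is hypothetical, the transverse axis is assumed to be the X axis of the projected XOY plane (so the clustering feature is the signed y-coordinate), and the initialization shown (extreme points on either side of the axis) stands in for the paper's initial center selection from the roughly classified sets.

```python
import numpy as np

def kmeans_left_right(points, n_iters=100, tol=1e-6):
    """Split projected 2-D points into left/right wall clusters.

    Uses the signed distance of each point to the transverse (X)
    axis, i.e. its y-coordinate, as the 1-D clustering feature,
    illustrating the distance-based clustering function.
    """
    y = points[:, 1]
    # Initial centers: one extreme point on each side of the axis
    # (stand-in for the paper's rough-classification-based initialization)
    centers = np.array([y.min(), y.max()], dtype=float)
    for _ in range(n_iters):
        # Assign each point to the nearest center along the axis distance
        labels = np.argmin(np.abs(y[:, None] - centers[None, :]), axis=1)
        new_centers = np.array([
            y[labels == k].mean() if np.any(labels == k) else centers[k]
            for k in range(2)
        ])
        if np.max(np.abs(new_centers - centers)) < tol:
            centers = new_centers
            break
        centers = new_centers
    return labels, centers
```

Because the feature is one-dimensional, each iteration is a single vectorized pass, which is consistent with the millisecond-scale timings reported.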
The best performance was achieved when both the initial point selection and the clustering function were improved, rather than either condition alone. The improved RANSAC achieved an accuracy of 93.66% for centerline extraction and an average error angle of 0.89°, which was 0.14° higher than that of the least-squares method (LSM). Its average time (3.94 ms) was 6.15 ms shorter than that of the LSM. With the number of iterations set to 100, the improved RANSAC showed much higher accuracy than the original, and both the maximum and average absolute error angles were smaller. The improved model can be expected to detect the centerline of chicken coop paths, effectively meeting the real-time requirements of autonomous navigation in cage-style chicken coop environments. These findings provide technical support for the autonomous navigation of inspection robots in the operation channels of chicken coops.
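The edge-fitting and centerline steps can be sketched as follows: a basic 2-D RANSAC fits a normalized line a·x + b·y + c = 0 to each edge point cloud, and the centerline is taken as the average of the two normalized edge-line equations. The function names, inlier threshold, and the averaging formulation are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def ransac_line(points, n_iters=100, dist_thresh=0.05, rng=None):
    """Fit a 2-D line a*x + b*y + c = 0 (a^2 + b^2 = 1) by RANSAC."""
    rng = np.random.default_rng(rng)
    best_line, best_inliers = None, None
    for _ in range(n_iters):
        i, j = rng.choice(len(points), size=2, replace=False)
        p, q = points[i], points[j]
        d = q - p
        norm = np.hypot(*d)
        if norm < 1e-9:          # degenerate sample, skip
            continue
        # Unit normal of the line through the two sampled points
        a, b = -d[1] / norm, d[0] / norm
        c = -(a * p[0] + b * p[1])
        # Perpendicular distances of all points to the candidate line
        dists = np.abs(points @ np.array([a, b]) + c)
        inliers = dists < dist_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_line, best_inliers = (a, b, c), inliers
    return best_line, best_inliers

def centerline(line_l, line_r):
    """Average two normalized edge lines into a path centerline.

    Flips one normal if needed so both point the same way, then
    averages and renormalizes the coefficients.
    """
    l = np.array(line_l, dtype=float)
    r = np.array(line_r, dtype=float)
    if l[0] * r[0] + l[1] * r[1] < 0:   # normals point opposite ways
        r = -r
    m = (l + r) / 2.0
    return tuple(m / np.hypot(m[0], m[1]))  # scale-invariant equation
```

For two parallel edge lines, the averaged equation is the line midway between them, i.e. the channel centerline; the 100-iteration budget matches the iteration count reported for the improved RANSAC.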