A 3D LiDAR-based method for detecting the centerline of chicken coop paths

    Detecting the centerline of a chicken coop path using 3D LiDAR

    • Abstract: To address the difficulty of detecting the path centerline during robot inspection in cage-style chicken coops, where illumination is weak and the working channels are narrow, this study uses 3D LiDAR to obtain the centerline of the chicken coop path. First, information on the coop's working channel was collected by a 3D LiDAR mounted on the robot, and the acquired 3D point cloud data were preprocessed with pass-through filtering, ground-point filtering, voxel filtering, statistical filtering, and plane projection to obtain point cloud data on the XOY plane. The preprocessed point cloud was then classified by K-means clustering with a modified initial-point selection and a modified clustering function. An improved RANSAC algorithm was applied to the classified data to extract the channel centerline. Experimental results show that the proposed improved K-means clustering takes 6.98 ms on average, 29.40 ms less than traditional K-means clustering, with accuracy improved by 82.41%. The proposed improved RANSAC algorithm achieves a centerline extraction accuracy of 93.66%, an average error angle of 0.89°, and an average time of 3.97 ms; its mean absolute error angle is 0.14° higher than that of the LSM algorithm, while its average time is 6.15 ms lower. These results indicate that the proposed chicken coop path centerline detection method basically meets the requirements of real-time autonomous navigation in cage-style chicken coop environments, and provides technical support for LiDAR-based navigation of inspection robots in chicken coop working channels.
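As a rough illustration of the preprocessing chain described in the abstract (pass-through, voxel, and statistical filtering, followed by projection onto the XOY plane), the sketch below uses plain NumPy. The function name `preprocess` and all thresholds are illustrative assumptions rather than the paper's values, and ground-point filtering is folded into the z-axis pass-through limits for brevity.

```python
import numpy as np

def preprocess(points, x_lim=(0.0, 10.0), y_lim=(-2.0, 2.0), z_lim=(-0.5, 2.0),
               voxel=0.05, k=10, std_ratio=1.0):
    """Illustrative pipeline: pass-through filter, voxel downsampling,
    statistical outlier removal, then projection onto the XOY plane.
    All limits/thresholds are assumed values, not the paper's."""
    # Pass-through filter: keep only points inside the region of interest.
    m = ((points[:, 0] >= x_lim[0]) & (points[:, 0] <= x_lim[1]) &
         (points[:, 1] >= y_lim[0]) & (points[:, 1] <= y_lim[1]) &
         (points[:, 2] >= z_lim[0]) & (points[:, 2] <= z_lim[1]))
    pts = points[m]
    # Voxel filter: replace all points in each voxel by their centroid.
    keys = np.floor(pts / voxel).astype(np.int64)
    _, inv = np.unique(keys, axis=0, return_inverse=True)
    inv = inv.ravel()
    counts = np.bincount(inv).astype(float)
    centroids = np.stack(
        [np.bincount(inv, weights=pts[:, d]) / counts for d in range(3)], axis=1)
    pts = centroids
    # Statistical filter: drop points whose mean distance to their k nearest
    # neighbours exceeds the global mean + std_ratio * std.
    d = np.sqrt(((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1))
    knn = np.sort(d, axis=1)[:, 1:k + 1].mean(axis=1)
    pts = pts[knn <= knn.mean() + std_ratio * knn.std()]
    # Plane projection: discard the height coordinate to get XOY points.
    return pts[:, :2]
```

In a real deployment these steps would typically be done with a point cloud library (e.g. PCL or Open3D) rather than dense NumPy distance matrices, which scale poorly with cloud size.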

       

      Abstract: Robot navigation in the narrow paths inside chicken coops requires accurate identification of the working-channel centerline under low lighting conditions. This study detects the centerline of the chicken coop path using 3D LiDAR. A 3D LiDAR mounted on the robot body was used to collect path information within the operation channel. Several preprocessing techniques were applied, including pass-through filtering, ground point filtering, voxel filtering, statistical filtering, and point cloud projection. The point cloud data in the region of interest (ROI) of the 3D LiDAR on the XOY plane was first roughly classified according to the magnitude of the lateral coordinate. The center points of the left and right groups obtained from this rough classification were selected as the initial cluster centers, and the distance between each point and the transverse axis was used as the clustering function of K-means clustering to separate the left and right point clouds. Longitude-and-latitude scanning and a secondary edge-extraction method were then used to extract the edges of the two clustered point clouds, and RANSAC was applied to calculate the fitted line equation of each channel edge. The path centerline of the operation channel was extracted from these two equations. An inspection robot developed for cage-style chicken houses served as the experimental platform, with a VLP-16 LiDAR as the perception sensor; field verification was conducted in the D10 and D13 chicken houses of Deqingyuan Co., Ltd. (Beijing, China). The experimental results showed that the improved K-means clustering took an average time of 6.98 ms, with a silhouette coefficient of 0.59, a Rand index of 1.00, a clustering success rate of 84.10%, and a clustering accuracy of 100%. Compared with traditional K-means clustering, the average time was reduced by 29.40 ms, while the silhouette coefficient, Rand index, and accuracy increased by 0.04, 0.63, and 82.41%, respectively; the success rate was slightly reduced, by 0.41%. The best performance was achieved when both the initial-point selection and the clustering function were modified, rather than either one alone. The improved RANSAC achieved an accuracy of 93.66% for centerline extraction and an average error angle of 0.89°, which was 0.14° higher than that of the LSM. Its average time (3.94 ms) was 6.15 ms lower than that of the LSM. With the number of iterations set to 100, the improved RANSAC showed much higher accuracy than before, and both the maximum and the average absolute error angle were smaller. The improved model can thus be expected to detect the centerline of chicken coop paths, meeting the practical requirements of real-time autonomous navigation in cage-style chicken coop environments. These findings provide technical support for the autonomous navigation of inspection robots in the operation channels of chicken coops.
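The modified K-means step can be sketched as follows, under two assumptions drawn from the abstract: the initial centers come from a rough left/right split of the lateral (y) coordinate, and the clustering function is the 1-D distance along the transverse axis rather than the full Euclidean distance. The helper name `kmeans_lr` and the split-by-sign initialization are illustrative guesses at the paper's exact procedure.

```python
import numpy as np

def kmeans_lr(xy, iters=20):
    """Two-cluster K-means on XOY points, clustering only on the lateral
    coordinate. Initial centers: centroids of the rough left/right split
    (assumed here to be by the sign of y)."""
    y = xy[:, 1]
    c = np.array([y[y < 0].mean(), y[y >= 0].mean()])  # rough-split centers
    for _ in range(iters):
        # Clustering function: 1-D distance along the transverse axis.
        labels = np.abs(y[:, None] - c[None, :]).argmin(axis=1)
        new_c = np.array([y[labels == j].mean() for j in range(2)])
        if np.allclose(new_c, c):
            break
        c = new_c
    return labels, c
```

Because both the initialization and the distance metric are one-dimensional, each iteration is O(n), which is consistent with the large speedup over standard K-means reported in the abstract.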
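The edge-fitting and centerline step might look like the plain-RANSAC sketch below. The paper's specific "improved RANSAC" modifications are not detailed here, so this is a baseline stand-in; taking the centerline as the coefficient-wise average of the two fitted edge lines is likewise an assumption.

```python
import numpy as np

def ransac_line(pts, iters=100, thresh=0.05, rng=None):
    """Baseline RANSAC line fit on 2-D points. Returns (slope, intercept)
    of y = k*x + b, refined by least squares on the best inlier set."""
    rng = np.random.default_rng(rng)
    best = None
    for _ in range(iters):
        i, j = rng.choice(len(pts), 2, replace=False)
        (x1, y1), (x2, y2) = pts[i], pts[j]
        if abs(x2 - x1) < 1e-9:          # skip vertical sample pairs
            continue
        k = (y2 - y1) / (x2 - x1)
        b = y1 - k * x1
        # Point-to-line distance for y = k*x + b.
        d = np.abs(k * pts[:, 0] - pts[:, 1] + b) / np.hypot(k, 1.0)
        inliers = d < thresh
        if best is None or inliers.sum() > best.sum():
            best = inliers
    k, b = np.polyfit(pts[best, 0], pts[best, 1], 1)
    return k, b

def centerline(left_pts, right_pts):
    """Centerline as the average of the two fitted edge lines (assumed)."""
    kl, bl = ransac_line(left_pts, rng=0)
    kr, br = ransac_line(right_pts, rng=0)
    return (kl + kr) / 2.0, (br + bl) / 2.0
```

The error-angle metric reported in the abstract would then be the angle between the fitted centerline direction and a ground-truth direction, e.g. `np.degrees(np.arctan(k))` for a line expressed as y = k*x + b.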

       
