Abstract: Autonomous navigation has been widely adopted in agricultural robots in China. However, when a field fertilizer or pesticide applicator is suspended from a high-clearance chassis, the wheels often crush the seedlings along the ridges during operation. Taking sugarcane as the research object, this study proposes a real-time method for extracting navigation lines between crop ridges using Light Detection and Ranging (LiDAR). Firstly, the point cloud of the sugarcane in front of the high-clearance chassis was acquired. Because of the complex environment between ridges, the point cloud was then preprocessed by coordinate transformation, pass-through filtering, and radius outlier removal, yielding accurate point cloud data for the reliable identification of crop rows. Secondly, the Region Of Interest (ROI) was determined from the first-frame point cloud using vertical projection. Specifically, the point cloud was divided into bands to determine the wave number set, the candidate centroids of the sugarcane rows in each band, the actual row centroids, and the endpoint coordinates of the ROI. Thirdly, on each side of the ROI centerline, the specified square containing the highest number of points was located, and the feature points within it were extracted and clustered using K-Means. Lastly, the centerlines of the sugarcane rows were fitted with the Least-Squares Method (LSM) to determine the navigation line and its angle. A field experiment showed that the model was highly robust in both low- and high-occlusion environments between sugarcane ridges, particularly when extracting the navigation line across broken ridges. A driving test based on the identification of sugarcane rows was carried out at Zhuguang Farm, Hepu County, Beihai City, Guangxi Province, China, on June 22, 2021. Under low occlusion, the sugarcane was planted in rows 100 cm apart with an average plant height of 100 cm, whereas under high occlusion it was planted in rows 100 cm apart with an average plant height of 150 cm. The high-clearance chassis was driven at 0.5 and 1.0 m/s under the low- and high-occlusion conditions, respectively. The results show that the proposed model extracted the navigation lines between sugarcane ridges well under both low and high occlusion, as well as in the case of broken ridges. Manually extracted navigation lines of the sugarcane rows served as the reference for verifying the accuracy of the model. The angle between the manually and automatically extracted lines was defined as the error angle, and an extraction was counted as correct only when the error angle was less than 5°. Under the different working conditions, the average error angle between the manual and automatic extractions was no more than 1.213°, the overall accuracy was no less than 93.2%, and the average processing time was no more than 22.5 ms, fully meeting the accuracy and real-time requirements for extracting navigation lines between sugarcane ridges. The proposed method therefore outperforms the previous machine vision approach in terms of the average error angle of the navigation lines, the extraction accuracy, and the processing time.
Compared with the machine vision algorithm, the proposed algorithm reduced the average error angle of the extracted navigation lines by 2.832°, improved the accuracy by 44.7 percentage points, and reduced the average elapsed time by 175.9 ms. The findings can provide reliable, real-time navigation paths for the inter-ridge travel of field management machinery.
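
The preprocessing stage described above (coordinate transformation, pass-through filtering, and radius outlier removal) can be illustrated with a minimal sketch. The abstract does not name the point cloud toolkit, so Open3D is assumed here, and the transformation matrix, crop limits, and filter parameters are illustrative placeholders rather than the authors' values.

import numpy as np
import open3d as o3d

def preprocess_frame(pcd: o3d.geometry.PointCloud,
                     sensor_to_vehicle: np.ndarray) -> o3d.geometry.PointCloud:
    # Point cloud transform: move the LiDAR frame into the vehicle frame
    # (sensor_to_vehicle is a hypothetical 4x4 homogeneous matrix).
    pcd = pcd.transform(sensor_to_vehicle)

    # Pass-through: keep only the region in front of the chassis
    # (x lateral, y forward, z height; the limits in metres are assumptions).
    bbox = o3d.geometry.AxisAlignedBoundingBox(
        min_bound=np.array([-1.5, 0.0, -0.5]),
        max_bound=np.array([1.5, 6.0, 2.0]))
    pcd = pcd.crop(bbox)

    # Radius outlier removal: discard points with fewer than nb_points
    # neighbours within the given radius, suppressing isolated returns
    # from the cluttered inter-ridge environment.
    pcd, _ = pcd.remove_radius_outlier(nb_points=8, radius=0.10)
    return pcd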
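The row-fitting and evaluation steps can be sketched in a similar hedged way, assuming the feature points have already been extracted as (lateral, forward) coordinates. The two-cluster K-Means split, the x = a*y + b line parameterization, and the error-angle helper are one plausible realization of the description in the abstract, not the authors' code.

import numpy as np
from sklearn.cluster import KMeans

def fit_navigation_line(feature_pts: np.ndarray):
    """feature_pts: (N, 2) array of [x_lateral, y_forward] feature points."""
    # K-Means separates the feature points of the left and right sugarcane
    # rows by their lateral coordinate.
    labels = KMeans(n_clusters=2, n_init=10).fit_predict(feature_pts[:, :1])

    params = []
    for k in (0, 1):
        row = feature_pts[labels == k]
        # Least-squares fit of the row centerline as x = a*y + b, which stays
        # well conditioned for near-vertical rows.
        a, b = np.polyfit(row[:, 1], row[:, 0], 1)
        params.append((a, b))

    # Navigation line taken as the midline of the two fitted row centerlines.
    a_nav = 0.5 * (params[0][0] + params[1][0])
    b_nav = 0.5 * (params[0][1] + params[1][1])
    heading_deg = float(np.degrees(np.arctan(a_nav)))  # angle to the forward axis
    return a_nav, b_nav, heading_deg

def error_angle_deg(a_auto: float, a_manual: float) -> float:
    # Error angle between the automatically and manually extracted lines;
    # the abstract counts an extraction as correct when this is below 5 degrees.
    return abs(float(np.degrees(np.arctan(a_auto) - np.arctan(a_manual))))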