Research and field experiments on a visual navigation method for a cotton spraying robot
Graphical Abstract
Abstract
Visual navigation has become the mainstream navigation method for agricultural field robots, but real field scenes are complicated by changing illumination and differences in plant growth. To address the poor crop-row extraction and navigation accuracy caused by sparse seedlings, missing seedlings and weeds at the cotton seedling stage, a visual navigation method based on an improved RANSAC (random sample consensus) algorithm was established and a cotton-field navigation path tracking experiment was carried out. First, images of cotton plants at multiple seedling growth stages were acquired with a camera mounted on an agricultural robot. The crop rows were then distinguished from the background by an improved excess-green grey-scale transformation, completely separated from the background by adaptive threshold segmentation, and the resulting binary images were denoised with morphological filtering. According to the distribution characteristics of the crop-row target region in the image, outlier points that do not belong to the crop row were eliminated by the improved RANSAC algorithm, and feature point detection and clustering were performed to ensure the accuracy of the extracted crop-row center line. Finally, the navigation path was obtained by least-squares fitting. The path fitted after removing outliers with the improved RANSAC algorithm agrees closely with the actual position of the crop-row center line, whereas the center line fitted by direct least squares without RANSAC shows obvious deviation. Experimental results show that with the traditional RANSAC algorithm the row recognition rate is 96.5%, the average error angle is 1.41° and the average image processing time is 0.087 s; after removing outliers with the improved RANSAC method, the row recognition rate increases to 98.9% and the average error angle is only 0.53°. Modifying the original RANSAC algorithm therefore significantly improves center line extraction, and a comparison with the traditional Hough transform further confirms the effectiveness of the proposed method for navigation path extraction. To better validate the practical performance of the method in a complex field environment, a self-developed spraying robot and its visual navigation system were used to carry out path tracking experiments in a cotton seedling field. The navigation experiments were conducted autonomously from three different initial states and at three moving speeds of 0.4, 0.5 and 0.6 m/s. OpenCV, an open-source machine vision and image processing library, was configured within the robot's ROS framework to provide image processing capability, and a path tracking algorithm based on adaptive sliding mode control was adopted to improve tracking accuracy. The maximum lateral deviation of the robot is 1.53 cm at 0.4 m/s, 2.29 cm at 0.5 m/s and 2.59 cm at 0.6 m/s, which meets the precision requirements for in-row operation. The maximum lateral deviation in the field path tracking experiments is thus less than 2.59 cm and no rolling phenomenon occurred, satisfying the requirements for autonomous navigation in cotton fields under the "1 film, 3 ridges, 6 rows" planting mode.
The visual navigation method established in this paper provides theoretical support and a technical basis for autonomous navigation and mobile operation of agricultural robots in the field.
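To make the segmentation front end concrete, the following is a minimal OpenCV sketch of the pipeline named in the abstract: an excess-green (ExG) grey transform, adaptive (Otsu) thresholding, and morphological denoising. The function name, the standard ExG formula and the 5×5 kernel are illustrative assumptions; the paper uses an improved excess-green transformation whose exact coefficients are not given here.

```python
import cv2
import numpy as np

def segment_crop_rows(bgr_image: np.ndarray) -> np.ndarray:
    """Return a denoised binary mask separating crop rows from the soil background."""
    b, g, r = cv2.split(bgr_image.astype(np.float32))
    # Standard excess-green index ExG = 2G - R - B (the paper applies an improved variant).
    exg = 2.0 * g - r - b
    exg = cv2.normalize(exg, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    # Adaptive (Otsu) thresholding separates vegetation from background.
    _, binary = cv2.threshold(exg, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    # Morphological opening then closing removes speckle noise and fills small holes.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)
    binary = cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)
    return binary
```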
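The centre-line extraction step can likewise be summarised as "remove outliers with a RANSAC loop, then fit the surviving inliers by least squares". The sketch below shows that idea only; the distance tolerance, iteration count and the fit of column position x against image row y are assumptions, and the paper's improved RANSAC additionally uses feature point detection and clustering that are not reproduced here.

```python
import numpy as np

def ransac_inliers(points: np.ndarray, n_iter: int = 200, tol: float = 5.0) -> np.ndarray:
    """points: (N, 2) array of (x, y) crop-row feature points; return the largest inlier set."""
    rng = np.random.default_rng(0)
    best_inliers, best_count = points, 0
    for _ in range(n_iter):
        p1, p2 = points[rng.choice(len(points), 2, replace=False)]
        d = p2 - p1
        norm = np.hypot(d[0], d[1])
        if norm < 1e-6:
            continue
        # Perpendicular distance of every point to the candidate line through p1 and p2.
        dist = np.abs(d[0] * (points[:, 1] - p1[1]) - d[1] * (points[:, 0] - p1[0])) / norm
        inliers = points[dist < tol]
        if len(inliers) > best_count:
            best_count, best_inliers = len(inliers), inliers
    return best_inliers

def fit_centerline(points: np.ndarray) -> tuple[float, float]:
    """Least-squares fit x = k*y + b over the RANSAC inliers (x as a function of image row y)."""
    inliers = ransac_inliers(points)
    k, b = np.polyfit(inliers[:, 1], inliers[:, 0], 1)
    return k, b
```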
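For the tracking stage, the abstract reports an adaptive sliding mode controller. The following is only a minimal sliding-mode-style sketch for lateral path tracking, assuming a simple kinematic error model with lateral error e_y and heading error e_theta; the gains, boundary-layer width and control law are illustrative assumptions, and the paper's controller additionally adapts its gains online.

```python
import numpy as np

def smc_steering(e_y: float, e_theta: float, v: float,
                 lam: float = 1.0, k: float = 0.8, phi: float = 0.05) -> float:
    """Return an angular-velocity command (rad/s) driving the sliding surface
    s = e_theta + lam * e_y to zero for a robot moving at forward speed v (m/s)."""
    s = e_theta + lam * e_y
    # Boundary-layer saturation replaces sign(s) to reduce chattering.
    sat = np.clip(s / phi, -1.0, 1.0)
    # Equivalent control cancels lateral error growth; the switching term enforces convergence.
    return -lam * v * np.sin(e_theta) - k * sat
```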