Extracting the navigation line by detecting maize seedling rows using a median-point Hough transform
Abstract
Automatic navigation is crucial to realizing autonomous operation of agricultural robots. Most current navigation path extraction approaches are cumbersome, time-consuming, and susceptible to interference, while plant protection robots place high demands on the real-time performance, accuracy, and robustness of navigation line extraction. Taking early maize seedlings as the research object, navigation line extraction for a plant protection robot was realized by detecting the crop rows with a median-point Hough transform. The angle traversal range of the intersection points was reduced compared with the traditional Hough transform, and only the characteristic points, such as the intersection points of the median-point curves, were calculated for the walking control of the robot. The navigation line extraction comprised five steps: image acquisition, image segmentation, crop-row feature point extraction, crop-row line fitting, and navigation line extraction. First, the RGB images were preprocessed to highlight the green crops against the soil background; the color images were converted to grayscale using an improved gray factor. The Otsu method was then selected for adaptive thresholding, binarizing the image into soil background and crop. Local noise was filtered with multiple morphological operations so that the crop-row images were segmented clearly, and a vertical projection was used to divide the regions of interest of the crop rows along the abscissa of the pixel coordinate system. Second, the feature points were extracted for crop-row line fitting, and the median-point Hough transform was used to fit the crop lines on the two sides of the ridge. Finally, the angle tangent formula was used to extract the navigation line of the plant protection robot, with the detected crop lines on the two sides of the ridge taken as the reference. Field experiments showed that the improved gray factor clearly separated the crops from the soil. A 640 × 480 pixel color image was processed in less than 160 ms on average, indicating good real-time performance. The maximum error of the navigation baseline was 0.53° when the improved Hough transform was used to fit the crop line; the method was 62.9 ms faster than the traditional Hough transform and 7.12° more accurate than the least squares method. The accuracy of navigation line extraction exceeded 92%, indicating strong robustness and accuracy in various environments: standard plant spacing under low light on cloudy days, standard plant spacing under strong sunlight on sunny days, standard plant spacing on cloudy days, non-standard plant spacing on cloudy days, standard plant spacing on cloudy days with a small number of weeds, and standard plant spacing on cloudy days with a large number of weeds. The extraction also showed good applicability and accuracy under these multiple environmental variables. The findings can provide visual navigation line extraction for plant protection operations along the rows of drilled green crops.
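The pipeline summarized above can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it assumes OpenCV and NumPy, approximates the "improved gray factor" with the common excess-green index (2G − R − B), and uses illustrative values for the band height, projection threshold, and narrowed angle traversal range. All function names and parameters are hypothetical, and the bisector in navigation_heading is only one plausible reading of the "angle tangent formula".

```python
# Minimal sketch of the described pipeline (assumptions noted in comments).
import cv2
import numpy as np

def segment_crop(bgr):
    """Green-emphasizing gray transform, Otsu threshold, morphological filtering."""
    b, g, r = cv2.split(bgr.astype(np.float32))
    # Excess-green index as a stand-in for the paper's improved gray factor.
    gray = cv2.normalize(2 * g - r - b, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, kernel)   # remove small noise
    return cv2.morphologyEx(binary, cv2.MORPH_CLOSE, kernel)    # fill small gaps

def row_rois(binary, min_width=30):
    """Vertical projection along the columns; contiguous high-projection column
    runs give the region of interest of each crop row (threshold is illustrative)."""
    proj = (binary > 0).sum(axis=0)
    active = proj > 0.1 * proj.max()
    rois, start = [], None
    for x, flag in enumerate(active):
        if flag and start is None:
            start = x
        elif not flag and start is not None:
            if x - start >= min_width:
                rois.append((start, x))
            start = None
    if start is not None and len(active) - start >= min_width:
        rois.append((start, len(active)))
    return rois

def median_feature_points(binary, x_start, x_end, band_height=20):
    """Feature points: median x of crop pixels in each horizontal band of a row ROI."""
    points = []
    for y0 in range(0, binary.shape[0], band_height):
        xs = np.where(binary[y0:y0 + band_height, x_start:x_end] > 0)[1]
        if xs.size:
            points.append((x_start + int(np.median(xs)), y0 + band_height // 2))
    return points

def hough_fit(points, theta_deg_range=(60.0, 120.0), theta_step=0.5):
    """Point-wise Hough fit over the median points only, with a narrowed angle
    traversal range; returns (rho, theta) of the best-voted line."""
    if not points:
        return None
    best, best_votes = None, -1
    for theta_deg in np.arange(theta_deg_range[0], theta_deg_range[1], theta_step):
        theta = np.deg2rad(theta_deg)
        rhos = np.round([x * np.cos(theta) + y * np.sin(theta) for x, y in points])
        values, votes = np.unique(rhos, return_counts=True)
        if votes.max() > best_votes:
            best_votes, best = votes.max(), (float(values[votes.argmax()]), theta)
    return best

def navigation_heading(theta_left, theta_right):
    """Heading of the navigation line as the bisector of the two fitted crop-row lines."""
    return 0.5 * (theta_left + theta_right)
```

In use, hough_fit would be run once per row ROI on either side of the ridge, and the two fitted lines combined into the navigation line that serves as the reference for the robot's walking control.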