Zhai Zhiqiang, Xiong Kun, Wang Liang, Du Yuefeng, Zhu Zhongxiang, Mao Enrong. Crop row detection and tracking based on binocular vision and adaptive Kalman filter[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2022, 38(8): 143-151. DOI: 10.11975/j.issn.1002-6819.2022.08.017

    Crop row detection and tracking based on binocular vision and adaptive Kalman filter

    • Abstract: Crop rows in the field serve as a road reference for vehicle navigation in intelligent agriculture. However, existing crop row detection methods do not cope well with complex field conditions, and their precision and accuracy are sensitive to environmental interference. In this study, an improved method based on binocular vision and an adaptive Kalman filter was proposed to detect and track crop rows and to extract navigation parameters. The system consisted of two stages: crop row detection and crop row tracking. Firstly, an image preprocessing algorithm was established to segment the gray-level features of vegetation using an improved Excess Green minus Excess Red (ExG-ExR) model and the maximum between-class variance method (OTSU). Corner points describing the shape of the crop row canopy were then extracted from the grayscale image with the Smallest Univalue Segment Assimilating Nucleus (SUSAN) detector, and their three-dimensional coordinates were computed by stereo matching and parallax ranging. Secondly, the points whose three-dimensional coordinates fell within a threshold range were taken as crop row feature points. A straight-line model of the crop row centerline was established using Principal Component Analysis (PCA), with the fitting points selected according to the frequency distribution of the feature points. After the centerlines were detected, a target planning algorithm extracted the crop row located in the central area of the image as the tracking target. Thirdly, taking this pathway as the tracking target and considering the agronomy of field crops, the crop row detection process was simplified into a linear system. Finally, the crop row tracking model was established using the Sage-Husa Kalman filter. Videos of a cotton field, containing shadows, weeds, turnrows, uneven roads, and other field scenes, were captured with a binocular camera (BB2-08S2C-25) to verify the system. The results showed that the improved system detected the crop row centerline accurately and rapidly, outperforming previous methods with a detection accuracy of 92.36% and an average processing time of 80.25 ms per frame. Specifically, the mean, standard deviation, and maximum absolute deviation of the centerline heading angle were 0.31°, 2.55°, and 12.01°, respectively, while those of the centerline lateral deviation were -1.90, 8.19, and 38.18 pixels, respectively. More importantly, crop row tracking with the Sage-Husa Kalman filter performed much better than classical Kalman-based tracking: the centerline was tracked rapidly without lag, and false or missed centerline detections were corrected. The heading angle was defined as the deviation angle between the target direction and the current heading of the vehicle, and the viewpoint intercept was defined as the transverse intercept at a depth of field of 1 m, describing the target position. Compared with detection alone, tracking improved the directional and positional accuracy of the centerline estimate.
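As a rough illustration of the preprocessing step described above, the sketch below (Python with OpenCV) applies a standard ExG-ExR greenness index followed by OTSU thresholding. The paper uses an improved ExG-ExR model whose exact formulation is not given in the abstract, so the classic coefficients (ExG = 2G - R - B, ExR = 1.4R - G) are assumed here, and the function name is illustrative only.

    import cv2
    import numpy as np

    def segment_vegetation(bgr):
        # Classic ExG-ExR index (assumed coefficients, not the paper's
        # improved model), followed by automatic OTSU thresholding.
        b, g, r = cv2.split(bgr.astype(np.float32) / 255.0)
        exg = 2.0 * g - r - b            # Excess Green
        exr = 1.4 * r - g                # Excess Red
        index = exg - exr                # ExG - ExR highlights vegetation
        # Rescale to 8-bit so OTSU can choose the threshold automatically.
        gray = cv2.normalize(index, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        _, mask = cv2.threshold(gray, 0, 255,
                                cv2.THRESH_BINARY + cv2.THRESH_OTSU)
        return gray, mask

The grayscale image would then feed the SUSAN corner detector, and the binary mask can be used to discard non-vegetation corners before stereo matching.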
With tracking, the standard deviation of the heading angle was 2.62° and its maximum amplitude was 9.19°, reductions of 22.94% and 43.69%, respectively, compared with the non-tracking case; the standard deviation of the viewpoint intercept was 0.043 m and its maximum amplitude was 0.145 m, reductions of 10.42% and 5.23%, respectively; and the standard deviations of the differences in heading angle and viewpoint intercept between two adjacent frames were reduced by 67.02% and 40.00%, respectively. Consequently, the improved system can quickly and accurately perceive the heading and position of the crop row and can provide continuous and stable guidance parameters for a navigation system.
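For the tracking stage, a generic Sage-Husa adaptive Kalman filter over a two-parameter state (heading angle and viewpoint intercept) might look like the sketch below. The state-transition and measurement models, the forgetting factor, and the simplified noise-adaptation formulas are assumptions for illustration; they are not the linear system identified in the paper.

    import numpy as np

    class SageHusaTracker:
        # Sage-Husa adaptive Kalman filter for a 2-state crop row tracker:
        # x = [heading angle, viewpoint intercept].
        def __init__(self, x0, P0, Q0, R0, b=0.96):
            self.x, self.P = x0, P0      # state estimate and covariance
            self.Q, self.R = Q0, R0      # noise covariances, updated online
            self.A = np.eye(2)           # assumed near-constant row parameters
            self.H = np.eye(2)           # both parameters measured directly
            self.b = b                   # forgetting factor, 0 < b < 1
            self.k = 0                   # frame counter

        def step(self, z):
            self.k += 1
            d = (1.0 - self.b) / (1.0 - self.b ** self.k)  # fading weight

            # Prediction
            x_pred = self.A @ self.x
            P_pred = self.A @ self.P @ self.A.T + self.Q

            # Innovation, gain, and state update
            e = z - self.H @ x_pred
            S = self.H @ P_pred @ self.H.T + self.R
            K = P_pred @ self.H.T @ np.linalg.inv(S)
            self.x = x_pred + K @ e
            self.P = (np.eye(2) - K @ self.H) @ P_pred

            # Sage-Husa noise adaptation (commonly used simplified form;
            # practical implementations often guard against R or Q losing
            # positive definiteness).
            self.R = (1.0 - d) * self.R + d * (np.outer(e, e) - self.H @ P_pred @ self.H.T)
            self.Q = (1.0 - d) * self.Q + d * (K @ np.outer(e, e) @ K.T)
            return self.x

Each detected centerline supplies the measurement z; when detection fails in a frame, the prediction alone can carry the track forward, which is what allows the filter to bridge false or missed detections.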