Abstract:
Active vision in agricultural field applications, especially boundary tracking between cut and uncut crop surfaces of similar color, is quite challenging. Two novel methods are proposed for fast segmentation to support agricultural robot navigation. Their efficiency rests on narrow-band extraction of multi-scale features from regions of interest (ROIs) and on multi-cue enhancement of pixel rows. The former is based on the weighted mean of k-level extreme pixel values; the latter on feature enhancement of neighboring rows and multi-evidence fuzzy recognition. Both approaches are nearly unsupervised, and their guidance line adapts to changing environments. A real-time projective transformation method (parameter processing in under 1 ms) and an auto-calibration method for the camera's main pose (time cost under 0.5 s) are also presented. Software for analyzing cut-uncut lawns was developed. Experimental results were promising: correct segmentation was achieved within 55 ms at 160×120 resolution, with an average error below 5% for normal sequences, and online boundary tracking of cut-uncut lawns ran autonomously at 8~9 frames per second (FPS) at 160×120 resolution, using a trade-off combination of the multi-row Best Fit Step (MR-BFS) and multi-evidence fuzzy enhancement of pixel rows. When the color distance between the two sides of the tracked boundary is relatively large, the proposed method of color-component operations plus bit masking is a good choice for multi-boundary tracking in the field, completing full segmentation within 20~30 ms for color sequences at 320×240 resolution. All of these techniques can be further applied to real-time, fully autonomous control of agricultural robot navigation.
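The abstract does not define the "weighted mean of k-level extreme values" precisely, so the following is only a minimal hypothetical sketch of one plausible reading: for each pixel row inside a narrow horizontal band (the ROI), take the k smallest and k largest intensities and form their weighted mean as a per-row feature. All function names, the uniform default weights, and the band layout are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def k_level_extreme_feature(row, k=3, weights=None):
    """Hypothetical sketch: weighted mean of the k smallest and k largest
    intensity values in one pixel row. Uniform weights are an assumption."""
    if weights is None:
        weights = np.ones(2 * k) / (2 * k)  # assumed uniform weighting
    s = np.sort(np.asarray(row).ravel())
    extremes = np.concatenate([s[:k], s[-k:]])  # k minima followed by k maxima
    return float(np.dot(weights, extremes))

def narrow_band_features(gray, band_top, band_bottom, k=3):
    """Per-row extreme-value features inside a horizontal narrow band
    (rows band_top..band_bottom-1) of a grayscale image."""
    band = np.asarray(gray)[band_top:band_bottom]
    return np.array([k_level_extreme_feature(r, k) for r in band])
```

Restricting the computation to a narrow band is what keeps such a feature cheap enough for the per-frame budgets quoted above, since only a few rows are sorted per frame.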