Abstract: In the wine-making process, grape picking is the most time-consuming and labor-intensive stage. To improve the target-location accuracy and working efficiency of a grape-picking robot, and to reduce the mechanical damage caused by imprecise positioning of the picking point, this paper proposes a new picking-point location method that combines image segmentation based on improved fuzzy clustering with a minimum point-to-line distance constraint; the method addresses factors that make accurate picking-point location difficult, such as the varied colors of grape stems and the irregular contours of grape clusters. Because picking is usually carried out on sunny or cloudy days, 300 images of Summer Black grapes were collected under such conditions with a Nikon D5200 digital camera as test material; the shooting distance between the camera and the grape cluster was about 80 cm, and the images were resized to 800×600 pixels. First, the color space of the collected images was analyzed, and the H component of the HSI color space was found to best distinguish Summer Black grapes. The H component was extracted and median filtering was applied. The grape fruit was then segmented with fuzzy C-means (FCM) clustering improved by the artificial bee colony algorithm: by redesigning the colony's fitness function, minimizing the fuzzy objective function of FCM was transformed into maximizing the fitness of the bee colony. The segmented image was denoised morphologically, the largest connected region was extracted, and the region's barycenter, extreme points, and bounding rectangle were computed. Second, the region of interest containing the picking point was determined from the barycentric coordinates and the edge information of the grape image.
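The min-to-max transformation described above can be sketched as follows. This is a minimal numpy illustration, not the paper's code: it assumes the common choice of fitness f = 1/(1 + J), where J is the standard FCM objective, so that a smaller clustering objective yields a larger fitness for the bee colony to maximize; the function names are illustrative, and the ABC search loop itself (employed, onlooker, and scout bee phases) is omitted.

```python
import numpy as np

def fcm_objective(data, centers, m=2.0):
    """Standard FCM objective J = sum_i sum_k u_ik^m * ||x_k - v_i||^2,
    with memberships u_ik computed from the given cluster centers."""
    # pairwise squared distances, shape (n_clusters, n_points)
    d2 = ((centers[:, None, :] - data[None, :, :]) ** 2).sum(axis=2)
    d2 = np.fmax(d2, 1e-12)  # guard against division by zero
    p = 1.0 / (m - 1.0)
    # standard FCM membership update: each column of u sums to 1
    u = 1.0 / (d2 ** p * (1.0 / d2 ** p).sum(axis=0))
    return float((u ** m * d2).sum())

def bee_fitness(data, centers, m=2.0):
    """Assumed fitness mapping: minimizing J becomes maximizing f,
    since f = 1/(1 + J) is monotonically decreasing in J."""
    return 1.0 / (1.0 + fcm_objective(data, centers, m))
```

With this mapping, a candidate solution (a set of cluster centers encoded as a food source) that lowers the FCM objective automatically receives a higher fitness, so the unchanged ABC selection machinery drives the colony toward better segmentations.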
The width of the region of interest (ROI) was set to 0.6 times the width of the bounding rectangle, and its height to 0.5 times the vertical distance between the highest contour point and the barycenter. The ROI image was preprocessed with a bilateral filter, and a binary edge map was obtained by gradient edge detection. The progressive probabilistic Hough transform was then applied within the ROI with the vote threshold set to 18; the endpoints of every detected segment were recorded, and a line equation was established from each pair of endpoints. The distance from the barycenter to each detected line was computed with the shortest point-to-line distance formula, the line closest to the barycenter was selected as the line containing the picking point, and the midpoint of that segment was taken as the picking point. Finally, to verify the proposed method, a classification experiment was carried out on the 300 Summer Black grape images captured under direct sunlight, backlight on sunny days, and overcast light, with OpenCV 2.3.1 and Visual C++ as the programming platform. Statistical analysis of the 300 test images showed that for 88.33% of them the location error between the picking point obtained by this method and the optimal picking point selected by hand was within 12 pixels, and the obtained points lay on the grape stems, which meets the positioning requirements of a picking robot. When grape clusters lay at either side of the image frame with less than one third of the cluster occluded, and the stem within the ROI was largely unoccluded, the method located the picking point reliably. The average time to locate a picking point under natural conditions was 0.3467 s.
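The segment-selection step above can be sketched as a short numpy routine. This is an illustrative reconstruction under stated assumptions, not the paper's implementation: segments are assumed to arrive as (x1, y1, x2, y2) endpoint tuples (the format returned by probabilistic Hough detectors), and the function names are hypothetical.

```python
import numpy as np

def pick_point(segments, barycenter):
    """From Hough line segments given as (x1, y1, x2, y2) endpoint
    tuples, choose the segment whose supporting line passes closest
    to the fruit barycenter; return the segment midpoint (the
    candidate picking point) and that minimum distance."""
    bx, by = barycenter
    best, best_d = None, float("inf")
    for x1, y1, x2, y2 in segments:
        # line through the two endpoints in the form a*x + b*y + c = 0
        a, b = y2 - y1, x1 - x2
        c = x2 * y1 - x1 * y2
        # shortest point-to-line distance |a*bx + b*by + c| / sqrt(a^2 + b^2)
        d = abs(a * bx + b * by + c) / max(np.hypot(a, b), 1e-12)
        if d < best_d:
            best, best_d = ((x1 + x2) / 2.0, (y1 + y2) / 2.0), d
    return best, best_d
```

For example, with a barycenter at (5, 4) and two horizontal segments along y = 0 and y = 5, the routine selects the y = 5 segment (distance 1 versus 4) and returns its midpoint (5.0, 5.0) as the picking point.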
The location-error and running-time results show that the proposed method meets the positioning requirements of a picking robot and offers a new solution for picking-point location by grape-picking robots.