Edge fusion-based image segmentation model for overlapping grape clusters in orchards
-
Graphical Abstract
-
Abstract
Grapes are among the most important economic fruits. In recent years, dense planting and robotic harvesting have been widely applied in greenhouse plantations, significantly reducing labor costs and promoting the grape industry. Accurate visual segmentation can greatly improve the operational precision and efficiency of harvesting robots. However, the irregular contours and highly similar colors of overlapping grape clusters still pose challenges for machine-vision segmentation; in particular, each bunch within an overlapping cluster must itself be segmented accurately. In this study, an overlapping grape cluster segmentation model (OGCSM) based on edge fusion was developed for orchards. Edge detection and semantic segmentation networks were integrated to overcome the limitations of orchard environments and to provide an accurate and reliable solution for the automatic segmentation of overlapping grape clusters. Feature maps of overlapping grape clusters were extracted from multi-level edge features. The OGCSM used an encoder-decoder architecture as its backbone network, in which a high-level structure and a deep contextual aggregation network effectively captured grape features. Edge extraction was further enhanced by introducing the Inception module into the backbone, and the downsampling convolution layers were constructed with large kernels, enlarging the receptive field and contextual awareness of the backbone so that overlapping grape clusters of varying scales and shapes could be detected in the images. Furthermore, the simple, parameter-free attention module (SimAM) and the convolutional block attention module (CBAM) were integrated into the convolutional layers of the encoder and the fusion layers of the encoder-decoder, enabling highly accurate extraction of the edges of each bunch within overlapping grape clusters; the improved model thereby performed better in challenging orchard environments. In addition, edge and mask feature information from the overlapping grape clusters was fused: the edge and mask heads were selected as the fusion networks, and deep supervision was applied for prediction. The generated mask of the overlapping grape clusters was thus complemented by the edges of each bunch, and element-wise weighting was subsequently applied to fuse the mask and edge features, enabling precise segmentation of the overlapping grape clusters. A series of segmentation and ablation experiments was conducted to validate the efficacy of the OGCSM. The experimental results showed that the OGCSM performed well on the key metrics: it achieved an average segmentation intersection over union (IoU) of 92.77% for overlapping grape clusters and 95.07% for top-bottom overlapping clusters, while a precision of 98.13% and an average segmentation IoU of 92.52% were obtained under both sunny and cloudy lighting, indicating strong robustness to lighting variations. Additionally, the ablation experiments demonstrated that the Inception, SimAM, and CBAM modules significantly enhanced accuracy, improving the average segmentation IoU by 7.4 percentage points, and that each module effectively improved the performance of the original model.
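To make the attention and fusion steps described above more concrete, the following is a minimal PyTorch sketch of a parameter-free SimAM block and an element-wise weighted fusion of mask and edge predictions. The abstract does not specify the OGCSM layers, so the module name (EdgeMaskFusion), channel sizes, head designs, and the particular weighting formula are illustrative assumptions rather than the authors' implementation.

```python
import torch
import torch.nn as nn


class SimAM(nn.Module):
    """Parameter-free SimAM attention (Yang et al., 2021)."""
    def __init__(self, e_lambda=1e-4):
        super().__init__()
        self.e_lambda = e_lambda

    def forward(self, x):
        b, c, h, w = x.shape
        n = h * w - 1
        d = (x - x.mean(dim=(2, 3), keepdim=True)).pow(2)      # (t - mu)^2 per pixel
        v = d.sum(dim=(2, 3), keepdim=True) / n                 # per-channel variance
        e_inv = d / (4 * (v + self.e_lambda)) + 0.5             # inverse energy 1 / e_t*
        return x * torch.sigmoid(e_inv)                         # reweight features


class EdgeMaskFusion(nn.Module):
    """Hypothetical element-wise weighted fusion of mask and edge features."""
    def __init__(self, channels):
        super().__init__()
        self.attn = SimAM()
        self.mask_head = nn.Conv2d(channels, 1, kernel_size=1)  # cluster mask logits
        self.edge_head = nn.Conv2d(channels, 1, kernel_size=1)  # per-bunch edge logits

    def forward(self, feats):
        feats = self.attn(feats)
        mask_logits = self.mask_head(feats)
        edge_logits = self.edge_head(feats)
        # Edge probabilities act as element-wise weights on the mask prediction,
        # sharpening the boundaries between overlapping bunches (assumed scheme).
        fused = mask_logits * (1.0 + torch.sigmoid(edge_logits))
        return fused, mask_logits, edge_logits                   # deep supervision targets


if __name__ == "__main__":
    x = torch.randn(2, 64, 128, 128)                             # dummy decoder features
    fused, mask, edge = EdgeMaskFusion(64)(x)
    print(fused.shape, mask.shape, edge.shape)
```

Returning the intermediate mask and edge logits alongside the fused map mirrors the deep-supervision idea: each head can receive its own loss before the weighted fusion produces the final segmentation.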
Compared with existing segmentation methods, the OGCSM automatically segmented grape clusters of various colors and overlap types, demonstrating superior practicality. Overall, the OGCSM performed excellently in various growth scenarios of grape clusters, including left-right, front-back, and top-bottom overlaps, and carried out segmentation tasks rapidly and accurately, confirming its effectiveness and robustness in practical orchard environments. In conclusion, the high accuracy of the OGCSM validated the effectiveness of its edge extraction and fusion, providing a promising solution for the accurate and rapid segmentation of overlapping grape clusters in orchards. The findings can also offer reliable and precise segmentation of overlapping fruit for grape-harvesting robots in orchards.
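As a reference for how the reported metrics can be computed, below is a small NumPy sketch of per-mask IoU and pixel precision averaged over a set of predictions. The evaluation protocol (per-image averaging, foreground definition) is an assumption, since the abstract does not specify it.

```python
import numpy as np


def mask_iou(pred, gt):
    """IoU between two boolean segmentation masks."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    inter = np.logical_and(pred, gt).sum()
    union = np.logical_or(pred, gt).sum()
    return inter / union if union > 0 else 1.0


def pixel_precision(pred, gt):
    """Fraction of predicted foreground pixels that are correct."""
    pred, gt = pred.astype(bool), gt.astype(bool)
    tp = np.logical_and(pred, gt).sum()
    return tp / pred.sum() if pred.sum() > 0 else 1.0


# Example: average the per-image scores over a (random, illustrative) batch of masks.
preds = [np.random.rand(64, 64) > 0.5 for _ in range(4)]
gts = [np.random.rand(64, 64) > 0.5 for _ in range(4)]
avg_iou = np.mean([mask_iou(p, g) for p, g in zip(preds, gts)])
avg_prec = np.mean([pixel_precision(p, g) for p, g in zip(preds, gts)])
print(f"average IoU: {avg_iou:.4f}, precision: {avg_prec:.4f}")
```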