Abstract:
In recent years, disease has become one of the major factors restricting the yield and quality of wine grapes. At present, most plant disease recognition is performed on static images. Blobs arising from soil spots, bird droppings, pesticide stains, and the like are often similar in color or shape to the scabs caused by disease, and may be misclassified as disease. To accurately judge the health of a leaf in online surveillance, it is important to take the time factor into account: continuously monitoring how the blobs on a leaf change over time helps to improve the accuracy of disease recognition under natural conditions. In this paper, we presented a dynamic disease monitoring method for wine grape that inferred whether disease existed not only from a disease classifier but also from the status changes observed over time in sequential images. Each day, we first detected the grape leaves in the first frame of the video with Faster R-CNN (region-based convolutional neural network), and then tracked them in the following frames to find frontal snapshots of the leaves. These snapshots were cropped from the bounding boxes in the frames and stored in a database as leaf images. For tracking, we proposed an algorithm that combined a cosine distance metric of movement with appearance information, which solved the problem that a leaf could not be tracked when it was occluded. We built a wide residual network to extract the appearance features used for appearance matching. Because the posture of a leaf detected in the first frame of the monitoring video was not necessarily ideal, we tracked each leaf over a period of time and cropped the image from the bounding box when the posture was best. To recognize the same leaf in sequential images across days, SIFT (scale-invariant feature transform) based matching was performed: if the matching rate of two leaves exceeded a predetermined threshold and was the highest among all leaf pairs, the two images were considered to show the same leaf. For the image sequence of a leaf, disease detection was then carried out to determine whether disease was present; the disease detector also adopted the Faster R-CNN framework. Cropping the frontal leaf removed most of the background, so the detection accuracy improved remarkably. When the detector output a bounding box indicating a disease scab, an automatic segmentation process based on graph cut was applied to segment the scab from the image; the goal of this process was to estimate the area of the scab in the image. If the detector asserted that disease existed, we further compared the area and the number of scabs on the images of the same leaf. If the area or the number increased over time, the assertion was confirmed; otherwise, we regarded it as a misrecognition. We conducted experiments to evaluate the performance of our method. For leaf tracking, the results showed that the proposed tracking algorithm achieved an average multiple-object tracking accuracy (MOTA) of 73.6% and a multiple-object tracking precision (MOTP) of 74.6%, surpassing the algorithm used for comparison. For leaf matching, the accuracy of our SIFT-based method reached 90.9%, which meets the requirements of practical use.
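To make the cross-day leaf matching step more concrete, the following is a minimal sketch of SIFT-based matching with OpenCV in Python. It is not the paper's implementation: the ratio-test constant, the threshold `MATCH_RATE_THRESHOLD`, and the helper names `sift_match_rate` and `find_same_leaf` are illustrative assumptions.

```python
import cv2

# Assumed match-rate threshold; the paper's actual value is not stated in the abstract.
MATCH_RATE_THRESHOLD = 0.25

def sift_match_rate(img_a, img_b):
    """Fraction of SIFT keypoints in img_a that find a distinctive match in img_b."""
    gray_a = cv2.cvtColor(img_a, cv2.COLOR_BGR2GRAY)
    gray_b = cv2.cvtColor(img_b, cv2.COLOR_BGR2GRAY)
    sift = cv2.SIFT_create()
    kp_a, des_a = sift.detectAndCompute(gray_a, None)
    kp_b, des_b = sift.detectAndCompute(gray_b, None)
    if des_a is None or des_b is None or len(kp_a) == 0:
        return 0.0
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    pairs = matcher.knnMatch(des_a, des_b, k=2)
    # Lowe's ratio test keeps only distinctive correspondences (0.75 is a common default).
    good = [p[0] for p in pairs if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    return len(good) / len(kp_a)

def find_same_leaf(query_img, candidate_imgs):
    """Index of the candidate leaf with the highest match rate, or None if below threshold."""
    if not candidate_imgs:
        return None
    rates = [sift_match_rate(query_img, img) for img in candidate_imgs]
    best = max(range(len(rates)), key=rates.__getitem__)
    return best if rates[best] >= MATCH_RATE_THRESHOLD else None
```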
In short, besides detecting scabs in a static image, our method introduced the time factor to judge the development trend of the scabs across sequential images, which eliminated false alarms and improved the accuracy and robustness of grape disease diagnosis. With the proposed method, online monitoring of grape leaves in a natural environment can be realized. At present, the method can only estimate the scab area of a leaf in a frontal posture. In future work, we will address scab area estimation that is independent of the viewing angle.
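The temporal confirmation rule summarized above can be sketched as follows; the record structure `LeafObservation` and the decision rule (comparing the first and last observations of a leaf) are hypothetical simplifications of the paper's comparison of scab area and scab count over time.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LeafObservation:
    day: int
    scab_count: int    # number of scab boxes reported by the detector
    scab_area: float   # total scab area (pixels) from graph-cut segmentation

def confirm_disease(observations: List[LeafObservation]) -> bool:
    """Confirm a disease assertion only if the scab area or count grows over time.

    A static blob (soil spot, bird dropping, pesticide stain) should stay roughly
    constant, whereas a real scab is expected to spread across days.
    """
    obs = sorted(observations, key=lambda o: o.day)
    if len(obs) < 2:
        return False  # a single observation cannot show a trend
    first, last = obs[0], obs[-1]
    return last.scab_area > first.scab_area or last.scab_count > first.scab_count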