Monitoring of maize phenotypic traits using super-resolution reconstruction and multimodal data fusion
Abstract
High-throughput phenotyping has become an urgent need in plant genetics, physiology, and breeding. Traditional manual measurement cannot meet the demands of high-throughput phenotyping for breeding, as it is time-consuming, labour-intensive, and limited in sample size. Alternatively, Unmanned Aerial Vehicle (UAV) remote sensing is expected to serve as an important tool for acquiring crop phenotypic parameters, owing to its high temporal and spatial resolution, fast image acquisition, easy operation, portability, and relatively low cost. However, a trade-off between flight height and image resolution (and thus measurement accuracy) is inevitable during image acquisition. Efficient techniques are therefore urgently needed to reconstruct high-resolution images without losing measurement accuracy, while improving spatial resolution and acquisition efficiency. In this study, maize phenotypic traits were effectively monitored using super-resolution reconstruction and multimodal data fusion. UAV image sequences of maize were captured at the seedling, 6th-leaf, 12th-leaf, tasseling, and milk stages. Super-resolution images were then reconstructed by combining the wavelet transform with bicubic interpolation. The reconstructed images showed higher quality and less distortion, with a peak signal-to-noise ratio of 21.5, a structural similarity of 0.81, and a mean absolute error ratio of 6.4%. Lower errors were also achieved for plant height and biomass estimation, with root mean square errors of 3.9 cm and 0.19 kg, respectively. The Ground Sampling Distance (GSD) of the reconstructed image at a flight height of 60 m was similar to that of the original image at a flight height of 30 m. Consequently, the UAV at a flight height of 60 m scanned 0.2 hm² more field area per minute than at 30 m. Plant height, canopy coverage, and vegetation indices were extracted from the original and reconstructed images, while the leaf area index (LAI) was calculated from the point cloud reconstructed by oblique photography. The point cloud was compressed with 3-D voxel filtering for higher efficiency while preserving its original shape. The estimated LAI correlated well with the measured LAI, with a slope of 0.72 and a root mean square error of 0.14. The canopy structure, spectral, and population structure parameters were then used to construct estimation models of above-ground biomass from single characteristic parameters and from multimodal data. Multimodal data fusion achieved higher estimation accuracy of above-ground biomass than any single parameter, with a coefficient of determination of 0.83 and a root mean square error of 0.19 kg. Therefore, combining image super-resolution reconstruction with multimodal data fusion can mitigate canopy saturation, improve spatial resolution and estimation accuracy, and meet the demand for higher-throughput data acquisition. The findings provide an effective and novel solution for the estimation of above-ground biomass. More importantly, the approach can support analysis of the correlation between genotype and phenotype and the breeding of high-quality maize varieties suitable for mechanized production.
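The super-resolution step combines the wavelet transform with bicubic interpolation. The following is a minimal sketch of one common way to realise such a scheme, assuming a single-channel 8-bit UAV frame and using OpenCV and PyWavelets; the Haar basis and the subband-substitution strategy are illustrative choices, not necessarily the exact pipeline used in the study.

```python
# Hedged sketch: wavelet + bicubic super-resolution for a grayscale UAV frame.
# Library choices (OpenCV, PyWavelets) and the LL-substitution scheme are assumptions.
import cv2
import numpy as np
import pywt

def wavelet_bicubic_sr(lr: np.ndarray, scale: int = 2) -> np.ndarray:
    """Upscale a low-resolution frame `lr` by `scale` using bicubic interpolation
    refined with discrete-wavelet-transform subbands."""
    h, w = lr.shape
    # Bicubic upscaling gives the coarse high-resolution estimate.
    bicubic = cv2.resize(lr, (w * scale, h * scale), interpolation=cv2.INTER_CUBIC)
    # Decompose into approximation (LL) and detail (LH, HL, HH) subbands.
    ll, (lh, hl, hh) = pywt.dwt2(bicubic.astype(float), "haar")
    # Substitute the LL subband with the measured low-resolution frame, so the
    # low-frequency content comes from the measurement and the details from
    # interpolation. The factor 2 compensates for the orthonormal Haar scaling.
    ll_meas = cv2.resize(lr, (ll.shape[1], ll.shape[0]), interpolation=cv2.INTER_CUBIC)
    sr = pywt.idwt2((2.0 * ll_meas.astype(float), (lh, hl, hh)), "haar")
    return np.clip(sr, 0, 255).astype(np.uint8)
```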
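Reconstruction quality is reported as peak signal-to-noise ratio, structural similarity, and mean absolute error ratio. The sketch below computes these three metrics with scikit-image, assuming a co-registered 8-bit reference image; the specific functions and the `data_range` setting are assumptions rather than details from the paper.

```python
# Hedged sketch of the image-quality metrics quoted in the abstract (PSNR, SSIM, MAE ratio).
import numpy as np
from skimage.metrics import peak_signal_noise_ratio, structural_similarity

def quality_report(reference: np.ndarray, reconstructed: np.ndarray) -> dict:
    """Compare a reconstructed frame against a co-registered reference frame."""
    psnr = peak_signal_noise_ratio(reference, reconstructed, data_range=255)
    ssim = structural_similarity(reference, reconstructed, data_range=255)
    # Mean absolute error expressed as a ratio of the reference mean intensity.
    diff = np.abs(reference.astype(float) - reconstructed.astype(float))
    mae_ratio = diff.mean() / reference.astype(float).mean()
    return {"PSNR": psnr, "SSIM": ssim, "MAE_ratio": mae_ratio}
```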
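Before LAI calculation, the oblique-photography point cloud is compressed with 3-D voxel filtering so that its overall shape is preserved while the point count drops. A minimal sketch using Open3D is shown below; the file names and the 5 cm voxel size are hypothetical.

```python
# Hedged sketch: 3-D voxel downsampling of the canopy point cloud with Open3D.
import open3d as o3d

pcd = o3d.io.read_point_cloud("maize_canopy.ply")      # hypothetical input file
downsampled = pcd.voxel_down_sample(voxel_size=0.05)   # 5 cm voxel grid (assumed)
o3d.io.write_point_cloud("maize_canopy_voxel.ply", downsampled)
print(f"{len(pcd.points)} -> {len(downsampled.points)} points")
```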
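Above-ground biomass is finally estimated by fusing canopy structure, spectral, and population structure parameters. The sketch below illustrates such a multimodal regression on synthetic per-plot features with a multiple linear model; the estimator, feature ranges, and sample size are assumptions, since the abstract does not specify the exact model.

```python
# Hedged sketch: multimodal biomass regression on synthetic per-plot features.
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score

rng = np.random.default_rng(0)
n_plots = 60                                          # hypothetical number of plots
plant_height = rng.uniform(0.3, 2.8, n_plots)         # m, e.g. from the UAV surface model
canopy_coverage = rng.uniform(0.1, 0.95, n_plots)     # fraction of plot area
vegetation_index = rng.uniform(0.2, 0.9, n_plots)     # spectral feature
lai = rng.uniform(0.5, 5.0, n_plots)                  # from the voxel-filtered point cloud
biomass = 0.4 * plant_height + 0.6 * lai + rng.normal(0, 0.1, n_plots)  # synthetic target, kg

# Fuse the four modalities into one feature matrix and fit a linear estimator.
X = np.column_stack([plant_height, canopy_coverage, vegetation_index, lai])
model = LinearRegression().fit(X, biomass)
pred = model.predict(X)
rmse = np.sqrt(mean_squared_error(biomass, pred))
print(f"R2 = {r2_score(biomass, pred):.2f}, RMSE = {rmse:.3f} kg")
```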