DU Jinzhi, LI Shuqin. A kiwifruit low light flower image enhancement model based on improved GAN[J]. Transactions of the Chinese Society of Agricultural Engineering (Transactions of the CSAE), 2024, 41(24): 1-7. DOI: 10.11975/j.issn.1002-6819.202405156
    A kiwifruit low light flower image enhancement model based on improved GAN

Abstract: Kiwifruit flowers growing in natural environments frequently occlude one another, so the collected flower images suffer from uneven brightness and noise, which are the main causes of reduced recognition rate and accuracy in subsequent vision tasks. This paper studies low-light image enhancement and proposes a low-light kiwifruit flower image enhancement model based on an improved GAN. Flowers at the Meixian Kiwifruit Experimental Station of Northwest A&F University were taken as the research object; digital cameras and mobile phones were used to capture images of flowers in different opening states and at different times, and all collected images were resized to a uniform pixel resolution to build the dataset. The improved enhancement model involves the following steps. First, the model is optimised and improved on the basis of a GAN. A hybrid attention module is designed in the generator, which applies a residual connection over channel attention and spatial attention. The channel attention branch uses ECANet to extract image features, helping the network capture the brightness distribution of specific regions of the image more accurately; the spatial attention branch uses SAM to extract image features and relies on lightweight computation to focus the model on specific regions, achieving local attention. This structure helps the generator extract the brightness distribution of low-light kiwifruit flower images, so that the generator can produce images conditioned on that distribution (a minimal sketch of such a module is given after this abstract). Second, a Swin Transformer block is added at the connection between the downsampling and upsampling paths of the generator. The Swin Transformer block evolves from the standard multi-head self-attention of the original Transformer; by partitioning the input image and computing the dependencies among the resulting windows, it exploits the Transformer's global modelling capability and improves the generator's extraction of global structure and image detail. Finally, in order to adaptively enhance local regions while improving global illumination, to avoid the mode collapse encountered with a single discriminator, and to give the generator the required adaptive adjustment capability, a dual-discriminator mechanism is adopted (see the second sketch below), which strengthens the perception of image details, improves the accuracy of image evaluation, and helps the generator network produce clearer images. On the self-built dataset, the PSNR of the proposed method is 7.09 and the NIQE is 10.36. The method was also compared with five other methods: RetinexNet, EnlightenGAN, Zero-DCE, CycleGAN, and Diffusion Low Light. In peak signal-to-noise ratio, the proposed method is 0.11 dB higher than EnlightenGAN; in the natural image quality evaluation (NIQE), it improves on Zero-DCE by 12.41%. The enhanced images therefore show good quality, low distortion, good naturalness, and small colour deviation, achieving a good enhancement effect. The model can be applied in practical production, supporting the development of computer vision and the application of smart agriculture.
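Of the three modifications, the hybrid attention module is described most concretely, so a minimal PyTorch-style sketch is given here: an ECA-style channel attention branch (global pooling followed by a 1-D convolution across channels), a SAM/CBAM-style spatial attention branch (mean and max maps fused by a single convolution), and a residual connection over the two. The kernel sizes, the composition order, and all layer names are illustrative assumptions, not the paper's exact configuration.

```python
# Minimal sketch of a hybrid attention block (ECA channel attention + spatial
# attention + residual connection). Hyper-parameters are assumptions.
import torch
import torch.nn as nn

class ECAChannelAttention(nn.Module):
    """ECA-style channel attention: global pooling + 1-D conv across channels."""
    def __init__(self, kernel_size: int = 3):
        super().__init__()
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        self.conv = nn.Conv1d(1, 1, kernel_size, padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        # (B, C, H, W) -> (B, C, 1, 1) channel descriptor
        y = self.avg_pool(x)
        # 1-D conv over the channel dimension models local cross-channel interaction
        y = self.conv(y.squeeze(-1).transpose(-1, -2)).transpose(-1, -2).unsqueeze(-1)
        return x * self.sigmoid(y)

class SpatialAttention(nn.Module):
    """SAM/CBAM-style spatial attention: channel-wise mean/max maps + one conv."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x):
        avg_map = torch.mean(x, dim=1, keepdim=True)
        max_map, _ = torch.max(x, dim=1, keepdim=True)
        attn = self.sigmoid(self.conv(torch.cat([avg_map, max_map], dim=1)))
        return x * attn

class HybridAttention(nn.Module):
    """Channel then spatial attention, combined with the input by a residual connection."""
    def __init__(self):
        super().__init__()
        self.channel = ECAChannelAttention()
        self.spatial = SpatialAttention()

    def forward(self, x):
        out = self.spatial(self.channel(x))
        return x + out  # residual connection preserves the original features

if __name__ == "__main__":
    feats = torch.randn(1, 64, 128, 128)   # dummy generator feature map
    print(HybridAttention()(feats).shape)  # torch.Size([1, 64, 128, 128])
```

In a U-Net-style generator such a block would typically be inserted after encoder or decoder stages, so that brightness-related features are re-weighted before being passed on through skip connections; where exactly the paper places it is not specified in the abstract.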
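The dual-discriminator mechanism is only named in the abstract, so the following is a hedged sketch of one common realisation (as used in EnlightenGAN-style models): a global discriminator that scores the whole enhanced image and a local discriminator that scores randomly cropped patches, with the generator trained to fool both. The network depth, patch size, and loss form are assumptions for illustration, not details from the paper.

```python
# Sketch of a global + local (dual) discriminator setup; all sizes are assumptions.
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 4, stride=2, padding=1),
        nn.LeakyReLU(0.2, inplace=True),
    )

class PatchDiscriminator(nn.Module):
    """Small PatchGAN-style discriminator reused for the global and local branches."""
    def __init__(self, in_ch=3, base=32):
        super().__init__()
        self.net = nn.Sequential(
            conv_block(in_ch, base),
            conv_block(base, base * 2),
            conv_block(base * 2, base * 4),
            nn.Conv2d(base * 4, 1, 3, padding=1),  # per-patch real/fake scores
        )

    def forward(self, x):
        return self.net(x)

def random_crops(img, size=32, n=4):
    """Sample n random local patches for the local discriminator."""
    _, _, h, w = img.shape
    crops = []
    for _ in range(n):
        top = torch.randint(0, h - size + 1, (1,)).item()
        left = torch.randint(0, w - size + 1, (1,)).item()
        crops.append(img[:, :, top:top + size, left:left + size])
    return torch.cat(crops, dim=0)

if __name__ == "__main__":
    d_global, d_local = PatchDiscriminator(), PatchDiscriminator()
    fake = torch.rand(2, 3, 128, 128)  # enhanced image produced by the generator
    adv = nn.BCEWithLogitsLoss()
    # Generator adversarial loss: fool both the global and the local discriminator
    g_out, l_out = d_global(fake), d_local(random_crops(fake))
    g_loss = adv(g_out, torch.ones_like(g_out)) + adv(l_out, torch.ones_like(l_out))
    print(g_loss.item())
```

The intent of the two branches follows the abstract's description: the global discriminator pushes the generator toward correct overall illumination, while the local discriminator penalises patches whose brightness or detail remains poor, which is one way to obtain the adaptive local enhancement the authors describe.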