Abstract
Accurate, real-time detection of meat adulteration has been in ever-increasing demand in the food industry in recent years. However, the presence of mutton flavor essence and dye makes detection more difficult than before. In this study, a residual network (ResNet) model combining the Convolutional Block Attention Module (CBAM) with the inverted residual structure (Invert) was proposed to classify mutton adulteration. An application was also developed to realize rapid and accurate classification on smartphones. Firstly, original images of mutton, pork from three cuts, and adulterated mutton were collected with a mobile phone. Hough circle detection was then used to remove the image backgrounds, and data augmentation (rotation, offset, and mirroring) was applied to expand the sample set. A total of 6 800 images were acquired, two-thirds of which were used as the training and testing datasets, with the training dataset three times the size of the testing one; the rest served as the independent validation dataset. Secondly, the original residual structure of the ResNet framework was replaced by the Invert structure to reduce the number of network parameters and accelerate convergence. At the same time, the CBAM was introduced into the Invert structure, strengthening feature differences by redistributing feature weights across the spatial and channel dimensions. The resulting convolutional neural network (CBAM-Invert-ResNet) was then trained on the sample data. MobileNet and ResNet50 models were also trained on the same data to compare convergence speed and accuracy. Finally, the CBAM-Invert-ResNet model was deployed to mobile phones using the TensorFlow Lite framework and the Android Studio development environment, realizing real-time classification on the mobile terminal.
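The CBAM reweighting described above applies channel attention (a shared two-layer MLP over average- and max-pooled channel descriptors) followed by spatial attention (a convolution over stacked channel-wise average and max maps). The following is a minimal NumPy sketch of that idea, not the paper's implementation: the weights `w1`, `w2` and the 7×7 `kernel` are placeholder parameters, and a single feature map in (H, W, C) layout stands in for a network activation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def channel_attention(x, w1, w2):
    # x: (H, W, C). A shared MLP (w1: C -> C/r, w2: C/r -> C) is applied to
    # the global average-pooled and max-pooled descriptors; the two outputs
    # are summed and squashed into per-channel weights.
    avg = x.mean(axis=(0, 1))                       # (C,)
    mx = x.max(axis=(0, 1))                         # (C,)
    att = sigmoid(avg @ w1 @ w2 + mx @ w1 @ w2)     # (C,)
    return x * att                                  # broadcast over H, W

def spatial_attention(x, kernel):
    # Channel-wise average and max maps are stacked and convolved
    # (CBAM uses a 7x7 kernel; here a naive same-padding convolution).
    avg = x.mean(axis=2)                            # (H, W)
    mx = x.max(axis=2)                              # (H, W)
    stacked = np.stack([avg, mx], axis=0)           # (2, H, W)
    k = kernel.shape[-1]
    pad = k // 2
    padded = np.pad(stacked, ((0, 0), (pad, pad), (pad, pad)))
    H, W = avg.shape
    out = np.zeros((H, W))
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(padded[:, i:i + k, j:j + k] * kernel)
    return x * sigmoid(out)[..., None]              # broadcast over C
```

In the paper's network these two steps are inserted into each Invert block, so the feature weights are redistributed in both the channel and spatial dimensions before the residual connection.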
The results showed that the CBAM greatly enhanced the feature differences among categories, while the Invert structure significantly reduced the parameters and size of the network, accelerating convergence. The Invert-ResNet50 model had 9.85×10⁶ parameters and a size of 18.66 MB, reductions of 58.25% and 58.43%, respectively, compared with the ResNet50 model. The CBAM-Invert-ResNet50 model had 1.002×10⁷ parameters and a size of 19.11 MB, reductions of 61.64% and 61.59%, respectively, compared with the CBAM-ResNet50 model. The CBAM-Invert-ResNet50 model also converged much faster than the ResNet50 model. Feature visualization showed more distinct color differences among mutton, adulterated mutton, and pork with the CBAM-Invert-ResNet50 model. On the validation dataset, the classification accuracies of the CBAM-Invert-ResNet50 model for pork adulteration on the loin, hind shank, fore shank, and mixed-cut datasets were 95.19%, 94.29%, 95.81%, and 92.96%, respectively, improvements of 6.08, 2.62, 14.70, and 4.23 percentage points over the Invert-ResNet50 model. Compared with MobileNet, the accuracies were improved by 12.44, 9.6, 13.73, and 4.87 percentage points, respectively. Moreover, application software embedding the CBAM-Invert-ResNet50 model was developed to quickly and accurately classify mutton, pork, and mutton adulterated with different ratios of pork. The detection time was about 0.3 s per image on the mobile terminal. Rapid and accurate classification of mutton adulterated with pork was thus realized even under the influence of mutton flavor essence and dye. The findings can provide technical support for maintaining market order and food safety.
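The parameter reductions reported above stem largely from the inverted residual design, which replaces full 3×3 convolutions with a 1×1 expansion, a 3×3 depthwise convolution, and a 1×1 projection. A minimal sketch of that arithmetic follows, using an illustrative channel width of 256 rather than the paper's exact layer sizes; biases and batch-norm parameters are ignored.

```python
def standard_conv_params(k, c_in, c_out):
    # weight count of a full k x k convolution
    return k * k * c_in * c_out

def depthwise_separable_params(k, c_in, c_out):
    # k x k depthwise (one filter per input channel) + 1x1 pointwise
    return k * k * c_in + c_in * c_out

# Illustrative 256-in / 256-out layer:
full = standard_conv_params(3, 256, 256)        # 589 824 weights
sep = depthwise_separable_params(3, 256, 256)   # 67 840 weights
reduction = 1 - sep / full                      # roughly 88% fewer weights
```

Applied across the network, and partly offset by the 1×1 expansion layers, this is what yields the roughly 58%–62% overall reductions in parameters and model size reported for the Invert-ResNet50 and CBAM-Invert-ResNet50 models.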