Abstract:
In feed processing, incoming feedstuffs are typically classified and then conveyed into silos for storage, so accurate and rapid identification of feedstuffs is a critical step. However, manual inspection cannot keep pace with the large-scale, real-time intake of incoming feedstuffs, and timely feedback is required to achieve high efficiency, low labor intensity, and cost savings. This study aimed to realize online sampling and identification of feedstuffs to raise the automation level of feed processing. A multi-channel sampling device and a machine-vision-based online identification system were developed, comprising three units: sampling, sample conveying, and image acquisition. According to actual production conditions, the structural parameters of the automatic sampler, conveyor, and image acquisition device were determined, including a sampler installation angle of 60°, a sampling time of 3 s, and a conveyor belt speed of 9 cm/s. The online identification software for feedstuffs was developed in the PyQt5 environment, consisting of upper-computer human-computer interaction software and a lower-computer automatic control system. Each function, such as command sending, real-time image display, and image processing and identification, was implemented as an independent module to facilitate program maintenance. An Arduino Uno was selected as the control core of the lower-computer system, the control process and circuit were designed, and the control programs were written in the Arduino IDE. Meanwhile, secondary development of the industrial camera was carried out to realize its real-time control via software triggering. The upper-computer software communicated with the lower-computer controller through the serial port to achieve automatic control of the online sampling and identification device. A CAM-ResNet18 model based on convolutional neural networks (CNN) was constructed to identify the feedstuff types. Experimental results showed that the identification accuracy of the model reached 99.4% on the test set, while the recall, F1 score, and specificity were 99.4%, 99.4%, and 99.9%, respectively.
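The control chain summarized above (PyQt5 upper computer, serial link, Arduino Uno lower computer, soft-triggered industrial camera, CNN classifier) can be illustrated with a minimal sketch. This is not the authors' implementation: the command bytes, serial parameters, port and device names, the OpenCV capture call standing in for the vendor camera SDK's software trigger, and the plain torchvision ResNet-18 standing in for the paper's CAM-ResNet18 are all assumptions for illustration.

```python
# Minimal sketch of one online sampling-and-identification cycle.
# All command bytes, device names, and file paths are hypothetical; the plain
# ResNet-18 below is only a stand-in for the paper's CAM-ResNet18 model.
import cv2                      # placeholder for the industrial camera's SDK
import serial                   # pyserial: link to the Arduino Uno lower computer
import torch
from torchvision import models, transforms

# The ten feedstuff classes reported in this study.
CLASSES = ["rice", "soybean meal", "bran", "flour", "cottonseed meal",
           "puffed corn", "fish meal", "corn", "peanut meal", "wheat"]

# Hypothetical single-byte commands understood by the Arduino control program.
CMD_SAMPLE = b"S"               # rotate the sampler into the feed stream (~3 s)
CMD_CONVEY = b"C"               # run the conveyor belt (~9 cm/s)
CMD_STOP = b"T"                 # stop sampler and conveyor


def load_classifier(weight_path: str) -> torch.nn.Module:
    """Load a ResNet-18 with a 10-class head (stand-in for CAM-ResNet18)."""
    model = models.resnet18(weights=None)
    model.fc = torch.nn.Linear(model.fc.in_features, len(CLASSES))
    model.load_state_dict(torch.load(weight_path, map_location="cpu"))
    return model.eval()


def identify_once(port: str, camera_index: int, weight_path: str) -> str:
    """One cycle: sample, convey, acquire an image, classify the feedstuff."""
    model = load_classifier(weight_path)
    preprocess = transforms.Compose([
        transforms.ToTensor(),
        transforms.Resize((224, 224)),
    ])
    with serial.Serial(port, 9600, timeout=2) as link:
        link.write(CMD_SAMPLE)          # lower computer drives the sampler
        link.write(CMD_CONVEY)          # then moves the sample under the camera
        camera = cv2.VideoCapture(camera_index)
        ok, frame = camera.read()       # stands in for the SDK's software trigger
        camera.release()
        link.write(CMD_STOP)
    if not ok:
        raise RuntimeError("image acquisition failed")
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    with torch.no_grad():
        logits = model(preprocess(rgb).unsqueeze(0))
    return CLASSES[int(logits.argmax(dim=1))]


if __name__ == "__main__":
    print(identify_once("/dev/ttyACM0", 0, "cam_resnet18.pth"))
```

In the actual system, the PyQt5 interface would wrap such a cycle in its command-sending and image-display modules, and the acquisition step would use the camera manufacturer's soft-trigger API rather than the OpenCV placeholder shown here.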
The model thus performed excellently in identifying the feedstuffs. After the model was embedded through system integration, the basic functions, identification accuracy, and identification time were tested. The system ran normally and reliably, intelligently performing automatic sampling, image acquisition, type identification, result feedback, and one-click alarm as feedstuffs entered the silo. The performance test showed an overall identification accuracy of 98%, with 100% accuracy for eight feedstuffs: rice, soybean meal, bran, flour, cottonseed meal, puffed corn, fish meal, and corn. The identification accuracies of peanut meal and wheat were 90%; the image processing can be further optimized to improve the identification accuracy of these individual feedstuffs. The online sampling and identification time was 10.13 s. The system fully meets the requirements of online sampling and identification of incoming feedstuffs. These findings can provide technical support for the automatic identification of feedstuffs in feed processing.