Abstract:
Localization is crucial for mobile robot systems. Various localization methods have been proposed, such as light detection and ranging (LIDAR), ultrasonic sensing, WiFi, ultra-wideband (UWB), and visual odometry. Vision-based localization has been intensively researched in recent years. Many researchers focus on visual odometry, including monocular and stereo odometry, but its most obvious weakness is the cumulative error, which grows as the displacement increases. To address this problem, artificial landmarks can be adopted for absolute localization, but they require prior installation and calibration. Based on these observations, a mobile robot localization algorithm using monocular vision and natural landmarks is proposed. An ordinary camera and an omnidirectional camera are adopted in this method: the ordinary camera extracts features from the ceiling for visual odometry, while the omnidirectional camera extracts features from the surrounding environment and the ceiling; a natural landmark base is then built with the Harris-SIFT (scale-invariant feature transform) method. When no natural landmark is available, localization is performed from the change of the tracked feature point between two neighboring images; when natural landmarks are available, they are used to localize the robot. During visual odometry, a feature point is extracted from the ceiling image and tracked along the whole trajectory. The rotation angle and translation matrix are determined from the coordinates of the feature point in the image and in the world coordinate system. If the feature point drifts far from the image center, it is reselected so that it lies close to the image center. A local search for the feature point is adopted to accelerate localization: the search is performed only in the neighborhood of the feature point found in the previous frame. When natural landmarks are available, an orientation filter is used to estimate the orientation angles. This filter is an indirect Kalman filter that takes the orientation error space as its state and fuses the visual odometry data with the inertial measurement unit data to compute the orientation angles. The Harris-SIFT algorithm is then used to determine the feature point for localization. To fully remove outliers, cluster analysis is applied to the matching points, and the optimal match is used to determine the feature point. Localization is finally performed from the orientation angles and the feature point obtained above. The algorithm combines the real-time performance of relative localization with the accuracy of absolute localization. An indoor experiment was carried out. The average errors of the proposed method along the x axis and the y axis were 38.66 and 31.54 mm, respectively; compared with visual odometry, the errors were reduced by 32.53% and 68.68%, respectively. In terms of computational efficiency, the runtime of localization based only on natural landmarks was almost 6 times that of the proposed method, so the proposed method can achieve real-time performance. The indoor experimental results demonstrate that, compared with visual odometry, the proposed method is much more accurate and more extensible.
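
As a rough illustration of the visual-odometry tracking step, the sketch below tracks a single ceiling feature point by searching only a local window around its location in the previous frame, and reselects a point near the image center once the tracked point drifts away. It assumes OpenCV; the patch size, search window, and drift tolerance are illustrative values, not parameters from the paper.

```python
import cv2

PATCH = 21          # side length of the template around the tracked point (assumed)
SEARCH = 61         # side length of the local search window (assumed)
CENTER_TOL = 0.35   # reselect once the point drifts beyond this fraction of the half-size

def track_point(prev_gray, cur_gray, pt):
    """Search for the ceiling feature point only near its previous location.

    Assumes the point stays well inside the image, which the center-based
    reselection below is meant to guarantee.
    """
    x, y = int(round(pt[0])), int(round(pt[1]))
    h, w = prev_gray.shape
    tp, sp = PATCH // 2, SEARCH // 2
    tmpl = prev_gray[y - tp:y + tp + 1, x - tp:x + tp + 1]
    x0, y0 = max(0, x - sp), max(0, y - sp)
    roi = cur_gray[y0:min(h, y + sp + 1), x0:min(w, x + sp + 1)]
    res = cv2.matchTemplate(roi, tmpl, cv2.TM_CCOEFF_NORMED)
    _, _, _, loc = cv2.minMaxLoc(res)
    return (x0 + loc[0] + tp, y0 + loc[1] + tp)

def maybe_reselect(gray, pt):
    """Reselect a Harris corner near the image center once the point drifts away."""
    h, w = gray.shape
    if abs(pt[0] - w / 2) < CENTER_TOL * w / 2 and abs(pt[1] - h / 2) < CENTER_TOL * h / 2:
        return pt
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=50, qualityLevel=0.01,
                                      minDistance=10, useHarrisDetector=True)
    if corners is None:
        return pt
    return tuple(min(corners.reshape(-1, 2),
                     key=lambda c: (c[0] - w / 2) ** 2 + (c[1] - h / 2) ** 2))
```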
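
The natural-landmark step could be sketched as follows: Harris corners are described with SIFT descriptors, matched against the landmark base, and the matches are clustered to reject outliers before the optimal match is chosen. The paper does not specify the clustering method; DBSCAN over match displacement vectors (scikit-learn) is used here purely as a stand-in, and the ratio-test and clustering thresholds are assumptions.

```python
import cv2
import numpy as np
from sklearn.cluster import DBSCAN

def harris_sift_features(gray):
    """Detect Harris corners and describe them with SIFT descriptors."""
    corners = cv2.goodFeaturesToTrack(gray, maxCorners=500, qualityLevel=0.01,
                                      minDistance=8, useHarrisDetector=True, k=0.04)
    kps = [cv2.KeyPoint(float(x), float(y), 16) for x, y in corners.reshape(-1, 2)]
    return cv2.SIFT_create().compute(gray, kps)        # -> (keypoints, descriptors)

def match_landmark(query_gray, landmark_kps, landmark_des):
    """Match the query image against the landmark base and keep the dominant cluster."""
    kps, des = harris_sift_features(query_gray)
    pairs = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des, landmark_des, k=2)
    good = [m for m, n in (p for p in pairs if len(p) == 2)
            if m.distance < 0.75 * n.distance]          # Lowe ratio test (assumed)
    if len(good) < 4:
        return None
    # Cluster the query-to-landmark displacement vectors; keep the largest cluster.
    disp = np.array([np.subtract(landmark_kps[m.trainIdx].pt, kps[m.queryIdx].pt)
                     for m in good])
    labels = DBSCAN(eps=10.0, min_samples=3).fit(disp).labels_
    core = labels[labels >= 0]
    if core.size == 0:
        return None
    keep = labels == np.bincount(core).argmax()
    cluster = [m for m, k in zip(good, keep) if k]
    # "Optimal match": the smallest descriptor distance inside the dominant cluster.
    return min(cluster, key=lambda m: m.distance)
```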
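
The indirect Kalman filter for orientation can be illustrated, in one dimension, by an error-state filter whose state is the yaw error: the gyro propagates the nominal yaw and the yaw observed by the visual odometry serves as the correction measurement. This is only a conceptual sketch with placeholder noise parameters, not the paper's full multi-axis formulation.

```python
class YawErrorKF:
    """1-D indirect (error-state) Kalman filter: the filter state is the yaw error."""

    def __init__(self, q=1e-4, r=1e-2):
        self.yaw = 0.0            # nominal yaw, propagated by the gyro [rad]
        self.d = 0.0              # estimated yaw error (the filter state) [rad]
        self.P = 1.0              # error covariance
        self.q, self.r = q, r     # process / measurement noise (placeholders)

    def predict(self, gyro_z, dt):
        """Integrate the gyro rate into the nominal yaw; the error covariance grows."""
        self.yaw += gyro_z * dt
        self.P += self.q * dt

    def update(self, vo_yaw):
        """Correct with the yaw observed by the visual odometry."""
        innov = (vo_yaw - self.yaw) - self.d     # measured error minus predicted error
        k = self.P / (self.P + self.r)           # Kalman gain
        self.d += k * innov
        self.P *= (1.0 - k)
        self.yaw += self.d                       # inject the error estimate ...
        self.d = 0.0                             # ... and reset the error state
        return self.yaw
```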