Preprint Article, Version 1. Preserved in Portico. This version is not peer-reviewed.

Research on a Visual/UWB Tightly Coupled Fusion Localization Algorithm

Version 1: Received: 4 January 2024 / Approved: 5 January 2024 / Online: 5 January 2024 (10:15:49 CET)

A peer-reviewed article of this Preprint also exists.

Jiang, P.; Hu, C.; Wang, T.; Lv, K.; Guo, T.; Jiang, J.; Hu, W. Research on a Visual/Ultra-Wideband Tightly Coupled Fusion Localization Algorithm. Sensors 2024, 24, 1710.

Abstract

Precise positioning is crucial for the autonomous navigation of mobile robots. In forest environments with weak satellite signals, or in sites affected by complex surroundings, satellite positioning accuracy often fails to meet the requirements of autonomous robot navigation. This article proposes a binocular vision SLAM/UWB tightly coupled localization method. The core of the algorithm is an Extended Kalman Filter (EKF) that fuses the displacement increments output by binocular ORB-SLAM with the positioning coordinates computed from UWB ranging, using both as measurement information. An experimental platform was built to collect images and UWB ranging data in outdoor environments, and the combined positioning method was validated experimentally. The results show that this algorithm outperforms either UWB or visual SLAM localization alone in positioning accuracy. Compared with traditional single-sensor positioning algorithms, it exhibits better practicality and stability, and effectively suppresses the impact of UWB non-line-of-sight (NLOS) errors. The average error under line-of-sight conditions is 0.058 m with a root mean square error (RMSE) of 0.080 m; under non-line-of-sight conditions, the average error is 0.081 m with an RMSE of 0.084 m, and no significant jumps or drift occur. Through the tightly coupled processing of binocular vision SLAM and UWB positioning, more reliable and accurate mobile robot positioning can be achieved, providing a useful reference for the autonomous navigation of mobile robots in complex environments.
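To make the fusion scheme concrete, the sketch below shows a minimal 2D EKF of the kind the abstract describes: visual SLAM displacement increments drive the prediction step, and UWB ranges to known anchors drive the update. The class name, the noise parameters, and the innovation gate used to reject NLOS-corrupted ranges are illustrative assumptions, not the authors' exact formulation.

import numpy as np

class VisualUwbEkf:
    """Minimal 2D EKF sketch: SLAM displacement increments drive the
    prediction; UWB anchor ranges drive the update (tightly coupled)."""

    def __init__(self, x0, anchors, p0=1.0, q=0.01, r=0.05):
        self.x = np.asarray(x0, dtype=float)       # state: [px, py]
        self.P = np.eye(2) * p0                    # state covariance
        self.anchors = np.asarray(anchors, float)  # anchor positions, shape (n, 2)
        self.Q = np.eye(2) * q                     # process noise on SLAM increments
        self.R = np.eye(len(anchors)) * r          # UWB range measurement noise

    def predict(self, delta_xy):
        """Propagate with one displacement increment from visual SLAM."""
        self.x = self.x + np.asarray(delta_xy, float)
        self.P = self.P + self.Q                   # F = I for a pure translation model

    def update(self, ranges, gate=3.0):
        """Correct with UWB ranges to the known anchors."""
        ranges = np.asarray(ranges, float)
        diff = self.x - self.anchors               # vectors anchor -> state, (n, 2)
        pred = np.linalg.norm(diff, axis=1)        # predicted ranges h(x)
        H = diff / pred[:, None]                   # Jacobian dh/dx, shape (n, 2)
        y = ranges - pred                          # innovation
        # Per-range innovation gate: one common way to discard ranges
        # inflated by NLOS propagation (an illustrative assumption here).
        s = np.einsum('ij,jk,ik->i', H, self.P, H) + np.diag(self.R)
        keep = np.abs(y) < gate * np.sqrt(s)
        if not np.any(keep):
            return                                 # all ranges rejected; skip update
        H, y = H[keep], y[keep]
        R = self.R[np.ix_(keep, keep)]
        S = H @ self.P @ H.T + R                   # innovation covariance
        K = self.P @ H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ H) @ self.P

A hypothetical usage, with an assumed square anchor layout:

anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0), (10.0, 10.0)]
ekf = VisualUwbEkf(x0=(1.0, 1.0), anchors=anchors)
ekf.predict(delta_xy=(0.10, 0.02))           # one ORB-SLAM displacement increment
ekf.update(ranges=(1.55, 8.95, 9.05, 12.6))  # one set of UWB anchor ranges
print(ekf.x)                                 # fused position estimate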

Keywords

Extended Kalman Filtering; multi-sensor fusion; visual SLAM; ultra-wideband positioning

Subject

Computer Science and Mathematics, Other
