Preprint Article, Version 1 (this version is not peer-reviewed)

Real-world Comparison of Visual Odometry Methods

Version 1 : Received: 11 May 2020 / Approved: 13 May 2020 / Online: 13 May 2020 (04:45:54 CEST)

A peer-reviewed article of this Preprint also exists.

Alapetite, A.; Wang, Z.; Hansen, J.P.; Zajączkowski, M.; Patalan, M. Comparison of Three Off-The-Shelf Visual Odometry Systems. Robotics 2020, 9, 56.

Journal reference: Robotics 2020, 9, 56
DOI: 10.3390/robotics9030056

Abstract

Positioning is an essential aspect of robot navigation, and visual odometry is an important technique for continuously updating a robot's internal position estimate, especially indoors where GPS is unavailable. Visual odometry uses one or more cameras to detect visual clues and estimate robot movements in 3D, relative to a starting point. Recent progress has been made, especially with fully integrated systems such as the Intel RealSense T265, which is the focus of this article. We compare three visual odometry systems and one wheel odometry system against each other on a ground robot. We do so in eight scenarios, varying the speed, the number of visual features, and the presence of humans walking in the field of view. We continuously measure the position error in translation and rotation using a ground-truth positioning system. Our results show that all odometry systems are challenged, but in different ways. On average, ORB-SLAM2 has the poorest results, while the RealSense T265 and the ZED Mini have comparable performance. In conclusion, a single odometry system may still not be sufficient, so multiple instances and sensor fusion approaches remain necessary while waiting for additional research and further improved products.
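The abstract mentions continuously measuring position error in translation and rotation against a ground-truth positioning system. A minimal sketch of such a per-frame comparison is shown below; the pose format (x, y, z, yaw) and the function names are illustrative assumptions, not taken from the paper's actual evaluation pipeline.

```python
import math

def translation_error(est, gt):
    """Euclidean distance between estimated and ground-truth positions.

    Poses are assumed to be (x, y, z, yaw) tuples; only the
    positional part (x, y, z) is used here.
    """
    return math.sqrt(sum((e - g) ** 2 for e, g in zip(est[:3], gt[:3])))

def rotation_error(est_yaw, gt_yaw):
    """Smallest absolute angular difference, wrapped to [0, pi]."""
    d = (est_yaw - gt_yaw) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

# Hypothetical example: odometry estimate vs. ground-truth pose
est_pose = (1.0, 2.0, 0.0, math.radians(10))
gt_pose = (1.1, 1.9, 0.0, math.radians(350))
print(round(translation_error(est_pose, gt_pose), 3))                    # 0.141
print(round(math.degrees(rotation_error(est_pose[3], gt_pose[3])), 1))   # 20.0
```

Wrapping the angular difference avoids spuriously large errors when the estimated and ground-truth headings straddle the 0/360 degree boundary, as in the example above.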

Subject Areas

odometry; camera; positioning; navigation; indoor; robot


