Preprint Article, Version 1. This version is not peer-reviewed. Preserved in Portico.

DM-SLAM: Monocular SLAM in Dynamic Environments

Version 1: Received: 11 January 2020 / Approved: 12 January 2020 / Online: 12 January 2020 (15:12:52 CET)

How to cite: Lu, X.; Wang, H.; Tang, S.; Huang, H.; Li, C. DM-SLAM: Monocular SLAM in Dynamic Environments. Preprints 2020, 2020010123. https://doi.org/10.20944/preprints202001.0123.v1

Abstract

Many classic monocular visual SLAM systems have been developed over the past decades; however, most of them fail when dynamic objects dominate the scene. We propose DM-SLAM, built on ORB-SLAM, to handle dynamic objects in the environment. The article concentrates on two aspects. First, we propose DLRSAC, which extracts static features from a dynamic scene by exploiting the intrinsic difference between moving and static points, and we integrate it into the initialization of DM-SLAM. Second, we design a candidate map point selection mechanism based on neighborhood mutual exclusion to balance the accuracy of camera pose tracking against system robustness in dynamic scenes. Finally, we conduct experiments on a public dataset and compare DM-SLAM with ORB-SLAM. The experiments verify the superiority of DM-SLAM.
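
To illustrate the general idea behind the first point, the sketch below shows a generic RANSAC-style separation of static from dynamic feature matches using epipolar consistency. This is not the paper's DLRSAC algorithm, whose details appear only in the full text; the function name, thresholds, and the use of OpenCV are illustrative assumptions.

    # Minimal sketch (Python/OpenCV), assuming ORB keypoints already matched
    # between two frames. Matches consistent with a single fundamental matrix
    # are treated as static; the rest are candidate dynamic points.
    import numpy as np
    import cv2

    def split_static_matches(pts_prev, pts_curr, reproj_thresh=1.0):
        # pts_prev, pts_curr: (N, 2) float32 arrays of matched pixel coordinates.
        F, mask = cv2.findFundamentalMat(pts_prev, pts_curr,
                                         cv2.FM_RANSAC, reproj_thresh, 0.99)
        if F is None or mask is None:
            # Degenerate geometry: fall back to keeping all matches.
            return np.ones(len(pts_prev), dtype=bool)
        # True = static (epipolar inlier), False = candidate dynamic point.
        return mask.ravel().astype(bool)

A real dynamic-SLAM front end would feed only the static subset into initialization and pose estimation, which is the role DLRSAC plays in DM-SLAM.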

Keywords

static features extraction; dynamic environments; 3D reconstruction; monocular SLAM

Subject

Computer Science and Mathematics, Computer Science

