Version 1: Received: 1 May 2024 / Approved: 1 May 2024 / Online: 1 May 2024 (09:39:14 CEST)
How to cite:
Qu, S.; Cui, J.; Fu, Y.; Cao, Z.; Qiao, Y.; Men, X. Position Estimation Method for Small Drones Based on Fusion of Multisource Multimodal Data and Digital Twins. Preprints 2024, 2024050072. https://doi.org/10.20944/preprints202405.0072.v1
APA Style
Qu, S., Cui, J., Fu, Y., Cao, Z., Qiao, Y., & Men, X. (2024). Position Estimation Method for Small Drones Based on Fusion of Multisource Multimodal Data and Digital Twins. Preprints. https://doi.org/10.20944/preprints202405.0072.v1
Chicago/Turabian Style
Qu, S., Cui, J., Fu, Y., Cao, Z., Qiao, Y., and Men, X. 2024. "Position Estimation Method for Small Drones Based on Fusion of Multisource Multimodal Data and Digital Twins." Preprints. https://doi.org/10.20944/preprints202405.0072.v1
Abstract
Sensor noise and accumulated motion errors lead to low positioning accuracy and insufficient robustness when small unmanned aerial vehicles (UAVs) perform large maneuvers or continuous flight in complex environments. To address this, this paper proposes a multisource, multimodal data fusion method. First, it fuses multimodal data from multiple sensors, including GPS (Global Positioning System), IMU (Inertial Measurement Unit), and visual sensors, so that the strengths of each hardware component offset the weaknesses of the others, eliminating motion errors and enhancing accuracy. To mitigate the impact of sudden changes in sensor data, the method introduces a digital twin of the UAV as an additional data source. Using the Extended Kalman Filter algorithm, it fuses data from the real UAV and its digital twin, and the filtered positional information is fed back into the real UAV's control system. This enables real-time correction of positional deviations caused by sensor noise and environmental disturbances. The proposed multisource multimodal fusion Kalman filter method significantly improves UAV positioning accuracy in complex scenarios and the overall stability of the system. It is valuable for maintaining high-precision positioning in variable environments and has important practical implications for enhancing UAV navigation and application efficiency.
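The abstract's core fusion step can be illustrated in miniature. The sketch below is an assumption-laden toy, not the paper's implementation: it uses the linear special case of the Extended Kalman Filter over a hypothetical 2D constant-velocity state, and performs sequential measurement updates with two position sources standing in for the real UAV's sensors and its digital twin. All noise covariances, measurement values, and variable names are illustrative choices.

```python
import numpy as np

dt = 0.1
# State [x, y, vx, vy] under a constant-velocity motion model (assumed).
F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])
Q = 0.01 * np.eye(4)            # process noise covariance (assumed)
H = np.array([[1, 0, 0, 0],     # both sources observe position only
              [0, 1, 0, 0]])

def predict(x, P):
    """Propagate state and covariance through the motion model."""
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, R):
    """Standard Kalman measurement update with measurement z, noise R."""
    y = z - H @ x                          # innovation
    S = H @ P @ H.T + R                    # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

# One fusion cycle: predict, then update with each data source in turn.
x, P = np.zeros(4), np.eye(4)
R_real = 0.5 * np.eye(2)                   # noisier real-sensor position
R_twin = 0.1 * np.eye(2)                   # smoother digital-twin position

x, P = predict(x, P)
x, P = update(x, P, np.array([1.0, 2.0]), R_real)  # real UAV measurement
x, P = update(x, P, np.array([1.1, 1.9]), R_twin)  # digital-twin measurement
```

Weighting the twin's measurement with a smaller covariance pulls the fused estimate toward the twin when the real sensors spike, which mirrors the abstract's use of the twin to damp sudden sensor changes; the fused position would then be fed back to the real UAV's controller.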
Keywords
UAV; Multisource Multimodal Data Fusion; Digital Twins; Position Estimation
Subject
Computer Science and Mathematics, Robotics
Copyright:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.