Submitted: 20 June 2025
Posted: 24 June 2025
Abstract
Keywords:
1. Introduction
2. Related Work
2.1. Multi-Sensor Fusion in Aerospace
2.2. Deep Learning Fusion Models
2.3. Hardware-Software Co-Design
3. Methodology
3.1. Fusion Algorithm Overview
3.2. Cross-Attention Encoding
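As a minimal sketch of the cross-attention encoding step this section names, the snippet below assumes standard scaled dot-product attention in which query tokens from one sensor stream attend over key/value tokens from another; the token counts, feature dimension, and NumPy formulation are illustrative assumptions, not the paper's implementation:

```python
import numpy as np

def cross_attention(queries, keys, values):
    """Scaled dot-product cross-attention: each query token produces a
    convex combination of value tokens from the other modality."""
    d_k = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d_k)          # (Nq, Nk) similarities
    scores -= scores.max(axis=-1, keepdims=True)      # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)    # row-wise softmax
    return weights @ values, weights                  # fused features, attn map

# Toy example: 4 GNSS-derived tokens attending over 6 LiDAR tokens, d = 8
rng = np.random.default_rng(0)
q = rng.standard_normal((4, 8))
kv = rng.standard_normal((6, 8))
fused, attn = cross_attention(q, kv, kv)
print(fused.shape)                        # (4, 8)
print(np.allclose(attn.sum(axis=-1), 1))  # True: each row is a distribution
```

In a full encoder this step would be preceded by learned linear projections of each modality and followed by latent self-attention and feedforward layers.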
3.3. Latent Self-Attention and Feedforward
3.4. Decoding and Output Generation
3.5. Determinism and Quantization for Real-Time
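As a hedged sketch of the kind of quantization that supports deterministic real-time inference, the snippet below applies symmetric per-tensor int8 post-training quantization with a scale fixed offline; the scheme and scale choice are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np

def quantize_int8(w, scale=None):
    """Symmetric per-tensor int8 quantization. Fixing the scale offline
    makes inference arithmetic integer-only and bit-reproducible."""
    if scale is None:
        scale = np.abs(w).max() / 127.0   # map the largest weight to +/-127
    q = np.clip(np.round(w / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    return q.astype(np.float32) * scale

rng = np.random.default_rng(1)
w = rng.standard_normal((16, 16)).astype(np.float32)
q, s = quantize_int8(w)
max_err = np.abs(w - dequantize(q, s)).max()
print(q.dtype)                    # int8
print(max_err <= 0.5 * s + 1e-6)  # True: rounding error bounded by scale/2
```

Because the scale is frozen at design time, the quantization error bound can be stated analytically, which is the property certification evidence typically needs.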
3.6. Certifiability Measures
3.7. Hardware-Software Architecture
3.8. Hardware Certifiability Considerations
3.9. Software Certifiability Considerations
4. Results
4.1. Simulation Setup
4.2. Accuracy and Latency

4.3. Robustness Tests
4.4. Adversarial Inputs
4.5. Confidence Interval Stability
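One way to quantify the confidence-interval stability this section evaluates is a percentile bootstrap over per-frame position errors; the synthetic error distribution and resample count below are illustrative assumptions, not the paper's logged data:

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic per-frame position errors (m) standing in for logged fusion output
errors = rng.normal(loc=0.4, scale=0.1, size=500)

# Percentile bootstrap: resample with replacement, recompute the mean error
boot_means = np.array([
    rng.choice(errors, size=errors.size, replace=True).mean()
    for _ in range(2000)
])
ci_low, ci_high = np.percentile(boot_means, [2.5, 97.5])
print(ci_low < errors.mean() < ci_high)  # True: CI brackets the sample mean
```

A stable estimator yields a narrow interval whose endpoints change little across repeated runs or scenario replays.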
4.6. Case Study: Multi-UAM Conflict Scenario
5. Discussion
5.1. Advantages of Transformer-Based Fusion
5.2. Comparison with BART and Other Models
5.3. Certifiability and Safety Considerations
5.4. Cybersecurity and Adversarial Defense
6. Limitations and Future Work
6.1. Model Size vs. Determinism
6.2. Certification Process
6.3. Use Case Extension
7. Conclusion
Appendix A: Code Implementation
import os

import matplotlib.pyplot as plt
import numpy as np
import seaborn as sns
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401 (registers the 3D projection)

# Create output directory
output_dir = "/content/fusion_eval_figures"
os.makedirs(output_dir, exist_ok=True)

# ------------------------------
# FIGURE 1: Attention Heatmap
# ------------------------------
attention_weights = np.random.rand(10, 10)
plt.figure(figsize=(6, 5))
sns.heatmap(attention_weights, cmap='viridis', annot=False,
            cbar_kws={"label": "Attention Weight"})
plt.title('Figure 1: Attention Heatmap', fontsize=14)
plt.xlabel('Key', fontsize=12)
plt.ylabel('Query', fontsize=12)
plt.tight_layout()
plt.savefig(f"{output_dir}/figure1_attention_heatmap.png", dpi=300)
plt.show()

# ------------------------------
# FIGURE 2: UAV 2D Trajectory
# ------------------------------
x = list(range(8))
y_true = [0.0, 0.4, 0.8, 1.2, 1.6, 2.0, 2.4, 2.8]
y_pred = [0.0, 0.5, 1.5, 1.0, 1.8, 2.0, 2.5, 3.0]
plt.figure(figsize=(6, 5))
plt.plot(x, y_true, 'k--o', label='Ground Truth', linewidth=1.5)
plt.plot(x, y_pred, 'b-s', label='Transformer Estimate', linewidth=1.5)
for i in range(len(x)):
    plt.annotate(str(i), (x[i] + 0.1, y_true[i] + 0.1), fontsize=9)
plt.title('Figure 2: UAV Y-Position over Waypoints', fontsize=14)
plt.xlabel('Waypoint Index', fontsize=12)
plt.ylabel('Y Position (m)', fontsize=12)
plt.legend()
plt.grid(True, linestyle='--', linewidth=0.5)
plt.tight_layout()
plt.savefig(f"{output_dir}/figure2_trajectory_2d.png", dpi=300)
plt.show()

# ------------------------------
# FIGURE 3: UAV 3D Trajectory
# ------------------------------
z = [0, 0.2, 0.4, 0.6, 0.8, 1.0, 1.2, 1.4]
fig = plt.figure(figsize=(8, 6))
ax = fig.add_subplot(111, projection='3d')
ax.plot(x, y_true, z, marker='o', color='blue', linewidth=1.5)
for i in range(len(x)):
    ax.text(x[i] + 0.1, y_true[i] + 0.1, z[i] + 0.1, str(i), fontsize=9)
ax.set_title("Figure 3: UAV 3D Trajectory with Waypoints", fontsize=14, pad=20)
ax.set_xlabel("X Position (m)", fontsize=12, labelpad=10)
ax.set_ylabel("Y Position (m)", fontsize=12, labelpad=10)
ax.set_zlabel("Z Position (m)", fontsize=12, labelpad=10)
fig.subplots_adjust(left=0.1, right=0.9, bottom=0.1, top=0.88)
plt.savefig(f"{output_dir}/figure3_trajectory_3d.png", dpi=300)
plt.show()

# ------------------------------
# FIGURE 4: GNSS Dropout Error
# ------------------------------
t = np.linspace(0, 20, 81)
error_trans = np.zeros_like(t)
error_ekf = np.zeros_like(t)
for i, ti in enumerate(t):
    if ti < 10:
        error_trans[i] = 0.3
        error_ekf[i] = 0.3
    elif ti < 15:
        error_trans[i] = 0.3 + (ti - 10) * 0.2
        error_ekf[i] = 0.3 + (ti - 10) * 1.0
    else:
        error_trans[i] = max(0.3, 1.3 - (ti - 15) * 0.5)
        if ti == 15:
            error_ekf[i] = 5.3
        else:
            error_ekf[i] = 1.0 + 0.5 * np.sin(2 * (ti - 15)) + 0.2 * (ti - 15)
plt.figure(figsize=(7, 4.5))
plt.plot(t, error_trans, 'b-', label='Transformer Fusion', linewidth=1.8)
plt.plot(t, error_ekf, 'r--', label='Legacy EKF', linewidth=1.8)
plt.axvspan(10, 15, color='gray', alpha=0.3, label='GNSS Dropout')
plt.title('Figure 4: Position Error Under GNSS Dropout Scenario', fontsize=14)
plt.xlabel('Time (s)', fontsize=12)
plt.ylabel('Position Error (m)', fontsize=12)
plt.legend()
plt.grid(True, linestyle='--', linewidth=0.5)
plt.tight_layout()
plt.savefig(f"{output_dir}/figure4_gnss_dropout_error.png", dpi=300)
plt.show()

# ------------------------------
# FIGURE 6: MSE vs Latency (with std dev + annotation)
# ------------------------------
models = ['BART', 'CNN-RNN', 'Transformer']
mse = [0.92, 0.65, 0.38]
mse_std = [0.08, 0.05, 0.02]
latency = [180, 90, 105]
latency_std = [15, 10, 8]
x_vals = np.arange(len(models))
width = 0.35
fig, ax1 = plt.subplots(figsize=(8, 5))
# MSE bars (skyblue)
bar1 = ax1.bar(x_vals - width / 2, mse, width, yerr=mse_std, capsize=5,
               label='MSE', color='skyblue', ecolor='skyblue')
ax1.set_ylabel('Mean Squared Error', fontsize=12)
ax1.set_xlabel('Model Architecture', fontsize=12)
ax1.set_xticks(x_vals)
ax1.set_xticklabels(models)
ax1.set_ylim(0, max(mse[i] + mse_std[i] for i in range(3)) + 0.1)
# Latency bars (salmon)
ax2 = ax1.twinx()
bar2 = ax2.bar(x_vals + width / 2, latency, width, yerr=latency_std, capsize=5,
               label='Latency (ms)', color='salmon', ecolor='salmon')
ax2.set_ylabel('Latency (ms)', fontsize=12)
ax2.set_ylim(0, max(latency[i] + latency_std[i] for i in range(3)) + 20)
# Add std dev values as text (optional clarity)
for i in range(len(models)):
    ax1.text(x_vals[i] - width / 2, mse[i] + mse_std[i] + 0.03,
             f"±{mse_std[i]:.2f}", ha='center', fontsize=9, color='blue')
    ax2.text(x_vals[i] + width / 2, latency[i] + latency_std[i] + 5,
             f"±{latency_std[i]}", ha='center', fontsize=9, color='darkred')
# Title and legend
ax1.set_title('Figure 6: Performance Comparison of Sensor Fusion Models', fontsize=14)
fig.legend(loc='upper right', bbox_to_anchor=(1, 1), bbox_transform=ax1.transAxes)
plt.tight_layout()
plt.savefig(f"{output_dir}/figure6_comparison_styled.png", dpi=300)
plt.show()

# ------------------------------
# FIGURE 7: 3D Trajectory under Sensor Dropout
# ------------------------------
t = np.linspace(0, 2 * np.pi, 100)
x = np.cos(t)
y = np.sin(t)
z = t
eo_y = y + 0.2 * np.sin(3 * t)
lidar_y = y - 0.2 * np.cos(2 * t)
fig = plt.figure(figsize=(9, 6))
ax = fig.add_subplot(111, projection='3d')
ax.plot(x, y, z, label='Transformer Fusion', color='orange', linewidth=2)
ax.plot(x, eo_y, z, label='EO Only', linestyle='--', color='red')
ax.plot(x, lidar_y, z, label='LiDAR Only', linestyle=':', color='purple')
ax.set_xlabel('X Position (m)', fontsize=12, labelpad=10)
ax.set_ylabel('Y Position (m)', fontsize=12, labelpad=10)
ax.set_zlabel('Altitude (m)', fontsize=12, labelpad=10)
ax.set_title('Figure 7: eVTOL 3D Trajectory Recovery under Sensor Dropout', fontsize=14, pad=20)
ax.legend()
fig.subplots_adjust(left=0.1, right=0.9, bottom=0.1, top=0.88)
plt.savefig(f"{output_dir}/figure7_dropout_trajectory.png", dpi=300)
plt.show()






Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).