Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Multimodal Fusion with Multiple Attention Mechanisms for 3D Target Detection Algorithm

Version 1: Received: 26 July 2023 / Approved: 27 July 2023 / Online: 28 July 2023 (12:49:30 CEST)

How to cite: Zhang, X.; He, L.; Chen, J.; Wang, B.; Wang, Y.; Zhou, Y. Multimodal Fusion with Multiple Attention Mechanisms for 3D Target Detection Algorithm. Preprints 2023, 2023071956. https://doi.org/10.20944/preprints202307.1956.v1

Abstract

This paper proposes a multimodal fusion 3D target detection algorithm based on attention mechanisms to improve 3D target detection performance. The algorithm uses both point cloud data and camera images. For image feature extraction, a ResNet50+FPN backbone extracts features at four levels. For point cloud feature extraction, a voxel-based method and an FCN extract point-wise and voxel-wise features. Image and point cloud features are fused through regional point fusion and regional voxel fusion. After fusion, the Coordinate Attention mechanism and the SimAM attention mechanism extract fused features at a deeper level. Performance is evaluated on the DAIR-V2X dataset. The results show that, compared to the Part-A2 algorithm, the proposed algorithm improves mAP by 7.9% in the BEV view and 7.8% in the 3D view at IoU = 0.5 (cars) and IoU = 0.25 (pedestrians and cyclists). At IoU = 0.7 (cars) and IoU = 0.5 (pedestrians and cyclists), it improves mAP over the SECOND algorithm by 5.4% in the BEV view and 4.3% in the 3D view.
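The abstract names two published attention mechanisms: Coordinate Attention (Hou et al., CVPR 2021) and SimAM (Yang et al., ICML 2021). Below is a minimal PyTorch sketch of both, following their published formulations; the module names, reduction ratio, and lambda value are illustrative assumptions, and how this paper wires the modules into its fusion pipeline is not specified by the abstract.

```python
import torch
import torch.nn as nn


class CoordinateAttention(nn.Module):
    """Coordinate Attention: factorizes spatial attention into two 1D
    encodings, one along the height axis and one along the width axis."""

    def __init__(self, channels: int, reduction: int = 32):
        super().__init__()
        mid = max(8, channels // reduction)
        self.conv1 = nn.Conv2d(channels, mid, kernel_size=1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.ReLU(inplace=True)  # the original paper uses h-swish
        self.conv_h = nn.Conv2d(mid, channels, kernel_size=1)
        self.conv_w = nn.Conv2d(mid, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        x_h = x.mean(dim=3, keepdim=True)                  # (B, C, H, 1): pool along width
        x_w = x.mean(dim=2, keepdim=True).transpose(2, 3)  # (B, C, W, 1): pool along height
        y = self.act(self.bn(self.conv1(torch.cat([x_h, x_w], dim=2))))  # shared transform
        y_h, y_w = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(y_h))                   # (B, C, H, 1) attention
        a_w = torch.sigmoid(self.conv_w(y_w.transpose(2, 3)))   # (B, C, 1, W) attention
        return x * a_h * a_w                                    # broadcast over both axes


class SimAM(nn.Module):
    """SimAM: parameter-free attention derived from a closed-form
    neuron energy function; lam is the lambda regularizer from the paper."""

    def __init__(self, lam: float = 1e-4):
        super().__init__()
        self.lam = lam

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, h, w = x.shape
        n = h * w - 1
        d = (x - x.mean(dim=(2, 3), keepdim=True)).pow(2)  # squared deviation per position
        v = d.sum(dim=(2, 3), keepdim=True) / n            # channel-wise variance estimate
        e_inv = d / (4 * (v + self.lam)) + 0.5             # inverse of the minimal energy
        return x * torch.sigmoid(e_inv)                    # scale features by importance


if __name__ == "__main__":
    feats = torch.randn(2, 64, 32, 32)  # stand-in for a fused feature map
    out = SimAM()(CoordinateAttention(64)(feats))
    print(out.shape)  # torch.Size([2, 64, 32, 32])
```

Note that SimAM adds no learnable parameters, which is one plausible reason to stack it after Coordinate Attention on fused features; the ordering shown above is an assumption, not the paper's stated design.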

Keywords

Multimodal fusion; Attention mechanism; 3D target detection; Deep learning

Subject

Engineering, Automotive Engineering
