Preprint Article, Version 1. This version is not peer-reviewed.

A Novel Method to Generate Auto-Labeled Datasets for 3D Vehicle Identification Using a New Contrast Model

Version 1: Received: 25 February 2023 / Approved: 27 February 2023 / Online: 27 February 2023 (04:12:52 CET)

A peer-reviewed article of this Preprint also exists.

Gutierrez-Cabello, G.S.; Talavera, E.; Iglesias, G.; Clavijo, M.; Jiménez, F. A Novel Method to Generate Auto-Labeled Datasets for 3D Vehicle Identification Using a New Contrast Model. Appl. Sci. 2023, 13, 4334.

Abstract

Auto-labeling is one of the main challenges in 3D vehicle detection. Auto-labeled datasets can be used to identify objects in LiDAR data, a challenging task given the large volume of point-cloud data involved. In this work, we propose a novel methodology to generate new 3D auto-labeled datasets with a different point-of-view setup than the one used in the most widely recognized datasets (KITTI, Waymo, etc.). The performance of the methodology is further demonstrated through the development of our own dataset with the auto-generated labels, tested under boundary conditions from a fixed position on a bridge. The proposed methodology is based on the YOLO model trained with the KITTI dataset. Using camera-LiDAR sensor fusion, new datasets are auto-labeled while maintaining the consistency of the ground truth. The main contribution of this work is a novel methodology to auto-label autonomous driving datasets using YOLO as the main labeling system. The performance of this approach is measured by retraining the contrast models of the KITTI benchmark.
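To make the camera-LiDAR fusion step of such an auto-labeling pipeline concrete, the sketch below illustrates one common way it can be done: projecting LiDAR points into the camera image with the calibration matrices and assigning each YOLO 2D detection the points that fall inside its bounding box. This is a minimal illustration under assumed conventions, not the authors' implementation; the function names, the dictionary format for detections, and the simple inside-the-box association rule are all assumptions made for the example.

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """Project Nx3 LiDAR points into image pixel coordinates.

    points_lidar : (N, 3) XYZ points in the LiDAR frame.
    T_cam_lidar  : (4, 4) homogeneous LiDAR-to-camera extrinsic matrix.
    K            : (3, 3) camera intrinsic matrix.
    Returns (N, 2) pixel coordinates and a mask of points in front of the camera.
    """
    n = points_lidar.shape[0]
    pts_h = np.hstack([points_lidar, np.ones((n, 1))])   # homogeneous coordinates
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]            # transform into camera frame
    in_front = pts_cam[:, 2] > 0.1                        # discard points behind the camera
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]                           # perspective divide to pixels
    return uv, in_front

def label_points_with_yolo_boxes(uv, in_front, detections):
    """Assign each 2D detection the LiDAR points projecting inside its box.

    detections : list of dicts such as {"class": "car", "box": (x1, y1, x2, y2)},
                 e.g. the output of a YOLO model converted to this format.
    Returns a list of (class_name, point_indices) pairs, one per detection.
    """
    labeled = []
    for det in detections:
        x1, y1, x2, y2 = det["box"]
        inside = (
            in_front
            & (uv[:, 0] >= x1) & (uv[:, 0] <= x2)
            & (uv[:, 1] >= y1) & (uv[:, 1] <= y2)
        )
        labeled.append((det["class"], np.where(inside)[0]))
    return labeled
```

In practice, the per-detection point sets returned by a routine like this could then be fitted with 3D bounding boxes to produce the auto-generated labels; how that fitting and the ground-truth consistency checks are performed is described in the full article rather than in this sketch.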

Keywords

Auto-labeling; LiDAR; point of view; deep learning

Subject

Engineering, Automotive Engineering
