Submitted: 04 June 2024
Posted: 05 June 2024
Abstract
Keywords:
1. Summary
2. Materials and Methods
2.1. Related Work
2.1.1. Lasers and LiDARs on Bicycles
2.1.2. Autonomous Driving Data Sets
2.1.3. 3D Object Detection
2.2. Data Collection
2.3. Data Preparation
2.4. Metadata Generation
2.5. 3D Object Detection Benchmark
3. Results
4. Discussion
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Meaning |
|---|---|
| LiDAR | Light Detection and Ranging |
| NDS | nuScenes Detection Score |
| mAP | Mean Average Precision |
| VISTA | Dual Cross-VIew SpaTial Attention |
| FOV | Field of View |
| CBGS | Class-balanced Grouping and Sampling |
3. For detailed description see: https://www.nuscenes.org/nuscenes#data-format
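The nuScenes format referenced in footnote 3 stores each metadata table as a JSON array of records, each carrying a unique token. A minimal, stdlib-only sketch of loading one such table and indexing it by token (the file name and record contents here are illustrative, not taken from SaBi3d):

```python
import json
import os
import tempfile

def load_table(path: str) -> dict:
    """Load a nuScenes-style JSON table and index its records by token."""
    with open(path) as f:
        return {rec["token"]: rec for rec in json.load(f)}

# Tiny illustrative table, written to a temp file so the sketch is self-contained.
tmpdir = tempfile.mkdtemp()
path = os.path.join(tmpdir, "category.json")
with open(path, "w") as f:
    json.dump([{"token": "abc123", "name": "vehicle.car", "description": ""}], f)

table = load_table(path)
print(table["abc123"]["name"])  # prints "vehicle.car"
```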
References
- Lindsay, G.; Macmillan, A.; Woodward, A. Moving urban trips from cars to bicycles: impact on health and emissions. Australian and New Zealand Journal of Public Health 2011, 35, 54–60. [Google Scholar] [CrossRef] [PubMed]
- Buehler, R.; Pucher, J.R. (Eds.) Cycling for sustainable cities; Urban and industrial environments, The MIT Press: Cambridge, Massachusetts London, England, 2021. [Google Scholar]
- European Union. European Declaration on Cycling, 2023.
- Bundesministerium für Klimaschutz, Umwelt, Energie, Mobilität, Innovation und Technologie. Radverkehrsförderung in Österreich [Cycling promotion in Austria]. Technical report, 2022.
- Wegman, F.; Zhang, F.; Dijkstra, A. How to make more cycling good for road safety? Accident Analysis & Prevention 2012, 44, 19–29. [Google Scholar] [CrossRef] [PubMed]
- European Transport Safety Council. How safe is walking and cycling in Europe? PIN Flash Report 38. Technical report, 2020.
- Beck, B.; Perkins, M.; Olivier, J.; Chong, D.; Johnson, M. Subjective experiences of bicyclists being passed by motor vehicles: The relationship to motor vehicle passing distance. Accident Analysis & Prevention 2021, 155, 106102. [Google Scholar] [CrossRef] [PubMed]
- Merk, J.; Eckart, J.; Zeile, P. Subjektiven Verkehrsstress objektiv messen – ein EmoCycling-Mixed-Methods-Ansatz [Objectively measuring subjective traffic stress: an EmoCycling mixed-methods approach]. CITIES 20.50 – Creating Habitats for the 3rd Millennium: Smart – Sustainable – Climate Neutral. Proceedings of REAL CORP 2021, 26th International Conference on Urban Development, Regional Planning and Information Society, 2021, pp. 767–778.
- Beck, B.; Chong, D.; Olivier, J.; Perkins, M.; Tsay, A.; Rushford, A.; Li, L.; Cameron, P.; Fry, R.; Johnson, M. How Much Space Do Drivers Provide When Passing Cyclists? Understanding the Impact of Motor Vehicle and Infrastructure Characteristics on Passing Distance. Accident Analysis & Prevention 2019, 128, 253–260. [Google Scholar] [CrossRef] [PubMed]
- López, G.; Pérez-Zuriaga, A.M.; Moll, S.; García, A. Analysis of Overtaking Maneuvers to Cycling Groups on Two-Lane Rural Roads using Objective and Subjective Risk. Transportation Research Record: Journal of the Transportation Research Board 2020, 2674, 148–160. [Google Scholar] [CrossRef]
- Moll, S.; López, G.; Rasch, A.; Dozza, M.; García, A. Modelling Duration of Car-Bicycles Overtaking Manoeuvres on Two-Lane Rural Roads Using Naturalistic Data. Accident Analysis & Prevention 2021, 160, 106317. [Google Scholar] [CrossRef] [PubMed]
- Dozza, M.; Schindler, R.; Bianchi-Piccinini, G.; Karlsson, J. How Do Drivers Overtake Cyclists? Accident Analysis & Prevention 2016, 88, 29–36. [Google Scholar] [CrossRef] [PubMed]
- Rasch, A. Modelling Driver Behaviour in Longitudinal Vehicle-Pedestrian Scenarios. Master’s Thesis, Chalmers University of Technology, Göteborg, 2018. [Google Scholar]
- Rasch, A.; Boda, C.N.; Thalya, P.; Aderum, T.; Knauss, A.; Dozza, M. How Do Oncoming Traffic and Cyclist Lane Position Influence Cyclist Overtaking by Drivers? Accident Analysis & Prevention 2020, 142, 105569. [Google Scholar] [CrossRef] [PubMed]
- Rasch, A.; Dozza, M. Modeling Drivers’ Strategy When Overtaking Cyclists in the Presence of Oncoming Traffic. IEEE Transactions on Intelligent Transportation Systems 2022, 23, 2180–2189. [Google Scholar] [CrossRef]
- Caesar, H.; Bankiti, V.; Lang, A.H.; Vora, S.; Liong, V.E.; Xu, Q.; Krishnan, A.; Pan, Y.; Baldan, G.; Beijbom, O. nuScenes: A multimodal dataset for autonomous driving, 2020. arXiv:1903.11027 [cs, stat].
- Jeon, W.; Rajamani, R. A novel collision avoidance system for bicycles. 2016 American Control Conference (ACC); IEEE: Boston, MA, USA, 2016; pp. 3474–3479. [Google Scholar] [CrossRef]
- Jeon, W.; Rajamani, R. Active Sensing on a Bicycle for Accurate Tracking of Rear Vehicle Maneuvers. Proceedings of the ASME 2016 Dynamic Systems and Control Conference, Volume 2; American Society of Mechanical Engineers: Minneapolis, Minnesota, USA, 2016; p. V002T31A004. doi:10.1115/DSCC2016-9772.
- Jeon, W.; Rajamani, R. Rear Vehicle Tracking on a Bicycle Using Active Sensor Orientation Control. IEEE Transactions on Intelligent Transportation Systems 2018, 19, 2638–2649. [Google Scholar] [CrossRef]
- Jeon, W.; Rajamani, R. Active Sensing on a Bicycle for Simultaneous Search and Tracking of Multiple Rear Vehicles. IEEE Transactions on Vehicular Technology 2019, 68, 5295–5308. [Google Scholar] [CrossRef]
- Jeon, W.; Xie, Z.; Craig, C.; Achtemeier, J.; Alexander, L.; Morris, N.; Donath, M.; Rajamani, R. A Smart Bicycle That Protects Itself: Active Sensing and Estimation for Car-Bicycle Collision Prevention. IEEE Control Systems Magazine 2021, 41, 28–57. [Google Scholar] [CrossRef]
- Xie, Z.; Rajamani, R. On-Bicycle Vehicle Tracking at Traffic Intersections Using Inexpensive Low-Density Lidar. 2019 American Control Conference (ACC); IEEE: Philadelphia, PA, USA, 2019; pp. 593–598. doi:10.23919/ACC.2019.8814442.
- Xie, Z.; Jeon, W.; Rajamani, R. Low-Density Lidar Based Estimation System for Bicycle Protection. IEEE Transactions on Intelligent Vehicles 2021, 6, 67–77. [Google Scholar] [CrossRef]
- Van Brummelen, J.; Emran, B.; Yesilcimen, K.; Najjaran, H. Reliable and Low-Cost Cyclist Collision Warning System for Safer Commute on Urban Roads. 2016 IEEE International Conference on Systems, Man, and Cybernetics (SMC). IEEE, 2016, pp. 003731–003735.
- Muro, S.; Matsui, Y.; Hashimoto, M.; Takahashi, K. Moving-Object Tracking with Lidar Mounted on Two-wheeled Vehicle. Proceedings of the 16th International Conference on Informatics in Control, Automation and Robotics; SCITEPRESS—Science and Technology Publications: Prague, Czech Republic, 2019; pp. 453–459. doi:10.5220/0007948304530459.
- Niedermüller, A.; Beeking, M. Transformer Based 3D Semantic Segmentation of Urban Bicycle Infrastructure. Journal of Location Based Services 2024, pp. 1–23. [CrossRef]
- Vogt, J.; Ilic, M.; Bogenberger, K. A Mobile Mapping Solution for VRU Infrastructure Monitoring via Low-Cost LiDAR-sensors. Journal of Location Based Services 2023, pp. 1–23. [CrossRef]
- Yoshida, N.; Yamanaka, H.; Matsumoto, S.; Hiraoka, T.; Kawai, Y.; Kojima, A.; Inagaki, T. Development of Safety Measures of Bicycle Traffic by Observation with Deep-Learning, Drive Recorder Data, Probe Bicycle with LiDAR, and Connected Simulators. 2022.
- Geiger, A.; Lenz, P.; Urtasun, R. Are we Ready for Autonomous Driving? The KITTI Vision Benchmark Suite. 2012 IEEE Conference on Computer Vision and Pattern Recognition; IEEE: Providence, RI, 2012; pp. 3354–3361. [Google Scholar] [CrossRef]
- Sun, P.; Kretzschmar, H.; Dotiwalla, X.; Chouard, A.; Patnaik, V.; Tsui, P.; Guo, J.; Zhou, Y.; Chai, Y.; Caine, B.; Vasudevan, V.; Han, W.; Ngiam, J.; Zhao, H.; Timofeev, A.; Ettinger, S.; Krivokon, M.; Gao, A.; Joshi, A.; Zhao, S.; Cheng, S.; Zhang, Y.; Shlens, J.; Chen, Z.; Anguelov, D. Scalability in Perception for Autonomous Driving: Waymo Open Dataset, 2020. arXiv:1912.04838 [cs, stat].
- Everingham, M.; Van Gool, L.; Williams, C.K.I.; Winn, J.; Zisserman, A. The Pascal Visual Object Classes (VOC) Challenge. International Journal of Computer Vision 2010, 88, 303–338. [Google Scholar] [CrossRef]
- Mao, J.; Shi, S.; Wang, X.; Li, H. 3D Object Detection for Autonomous Driving: A Review and New Outlooks, 2022. arXiv:2206.09474 [cs].
- Wu, Y.; Wang, Y.; Zhang, S.; Ogai, H. Deep 3D Object Detection Networks Using LiDAR Data: A Review. IEEE Sensors Journal 2021, 21, 1152–1171. [Google Scholar] [CrossRef]
- Liang, W.; Xu, P.; Guo, L.; Bai, H.; Zhou, Y.; Chen, F. A survey of 3D object detection. Multimedia Tools and Applications 2021, 80, 29617–29641. [Google Scholar] [CrossRef]
- Zamanakos, G.; Tsochatzidis, L.; Amanatiadis, A.; Pratikakis, I. A comprehensive survey of LIDAR-based 3D object detection methods with deep learning for autonomous driving. Computers & Graphics 2021, 99, 153–181. [Google Scholar] [CrossRef]
- Fernandes, D.; Silva, A.; Névoa, R.; Simões, C.; Gonzalez, D.; Guevara, M.; Novais, P.; Monteiro, J.; Melo-Pinto, P. Point-cloud based 3D object detection and classification methods for self-driving applications: A survey and taxonomy. Information Fusion 2021, 68, 161–191. [Google Scholar] [CrossRef]
- Deng, S.; Liang, Z.; Sun, L.; Jia, K. VISTA: Boosting 3D Object Detection via Dual Cross-VIew SpaTial Attention, 2022. arXiv:2203.09704 [cs].
- Graham, B. Spatially-sparse convolutional neural networks, 2014. arXiv:1409.6070 [cs].
- Graham, B. Sparse 3D convolutional neural networks, 2015. arXiv:1505.02890 [cs].
- Graham, B.; van der Maaten, L. Submanifold Sparse Convolutional Networks, 2017. arXiv:1706.01307 [cs].
- Chen, Q.; Sun, L.; Cheung, E.; Yuille, A.L. Every View Counts: Cross-View Consistency in 3D Object Detection with Hybrid-Cylindrical-Spherical Voxelization. Advances in Neural Information Processing Systems; Larochelle, H.; Ranzato, M.; Hadsell, R.; Balcan, M.F.; Lin, H., Eds.; Curran Associates, Inc., 2020; Vol. 33, pp. 21224–21235.
- Chen, Q.; Sun, L.; Wang, Z.; Jia, K.; Yuille, A. Object as Hotspots: An Anchor-Free 3D Object Detection Approach via Firing of Hotspots, 2020. arXiv:1912.12791 [cs].
- Livox Tech. Livox Horizon: User Manual v1.0. Technical report, 2019.
- Li, E.; Wang, S.; Li, C.; Li, D.; Wu, X.; Hao, Q. SUSTech POINTS: A Portable 3D Point Cloud Interactive Annotation Platform System. 2020 IEEE Intelligent Vehicles Symposium (IV); IEEE: Las Vegas, NV, USA, 2020; pp. 1108–1115. [Google Scholar] [CrossRef]
- Zhu, B.; Jiang, Z.; Zhou, X.; Li, Z.; Yu, G. Class-balanced Grouping and Sampling for Point Cloud 3D Object Detection, 2019. arXiv:1908.09492 [cs].
| Table Name | Content in nuScenes | Changes in SaBi3d |
|---|---|---|
| attribute | Possible properties of instances, e.g. being parked or moving. | None; only one entry is used (vehicle.moving). |
| category | Object categories and subcategories. | None; only one entry is used (vehicle.car). |
| visibility | Fraction of annotation that is visible. | None; only one entry is used (4, corresponding to 80–100% visibility). |
| calibrated_sensor | Definition of sensors, including their orientation. | Position of sensor adjusted to correct height. |
| sensor | List of sensors. | Sensor name changed. |
| log | Information about the log files of the recording. | Empty file since no corresponding log files were recorded by the sensors. |
| map | File paths to the respective map images. | Corresponding file changed to solid black PNG since no map data was recorded. Was used only for rendering images and not needed for object detection. |
| ego_pose | Ego vehicle positions with respect to a global coordinate system. | All values set to zero since no ego location was recorded. Positions of detections are therefore relative to the bicycle and not to a global coordinate system. |
| sample | References to the frames that were annotated at 2 Hz (sampling frequency: 10 Hz). | Adjusted to the data. Every frame is annotated at a sampling frequency of 5 Hz. |
| sample_data | Paths to data files of the samples (LiDAR, Radar, image). | Adjusted to the data; only paths to LiDAR files. |
| scene | One entry for every scene. | Adjusted to the data. |
| instance | One entry for every unique vehicle (a particular vehicle might appear in multiple frames). | Adjusted to the data. |
| sample_annotation | Cuboid bounding boxes indicating the position and properties of objects. | Adjusted to the data by transforming the output of the labeling tool appropriately. Rotations were converted from Euler angles to quaternions. Visibility was not annotated and was set to 4 for every annotation. The number of LiDAR points contained in each cuboid was not annotated and was set to a reasonable average of 1000. |
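The Euler-to-quaternion conversion mentioned for `sample_annotation` can be sketched as follows. This assumes rotation about the vertical axis only (yaw), which suffices for upright bounding boxes; the function name and the w-first element order follow common convention and are not taken from the SaBi3d tooling:

```python
import math

def yaw_to_quaternion(yaw: float) -> list:
    """Convert a rotation about the vertical axis (yaw, in radians)
    into a [w, x, y, z] quaternion as used by nuScenes annotations."""
    half = yaw / 2.0
    return [math.cos(half), 0.0, 0.0, math.sin(half)]

# Example: a box rotated 90 degrees about the z-axis.
q = yaw_to_quaternion(math.pi / 2)
```

For full three-axis Euler angles, the conversion composes one such quaternion per axis; yaw-only is shown here because it is the common case for road users on a flat ground plane.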
| Model Training | mAP (car) |
|---|---|
| Provided checkpoint [37] | 85.0 |
| Replicated, without resampling | 53.6 |
| Replicated, with resampling | 85.5 |
| Cars only | 58.1 |
| Model Training | mAP (car) |
|---|---|
| Provided checkpoint [37] | 0.3 |
| Fine-tuning checkpoint on nuScenes cars | 12.6 |
| Fine-tuning checkpoint on SaBi3d | 80.2 |
| Training on SaBi3d | 79.1 |
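For context on the mAP values above: average precision integrates precision over recall for detections ranked by descending confidence. The sketch below shows a simplified version of that computation; the actual nuScenes protocol additionally matches boxes by center distance at four thresholds and clips the precision-recall curve, so the names and details here are illustrative only:

```python
def average_precision(is_tp: list, num_gt: int) -> float:
    """Average precision for detections sorted by descending confidence.
    is_tp[i] marks whether detection i matched a ground-truth box
    (in nuScenes, a match means center distance below a threshold)."""
    if num_gt == 0:
        return 0.0
    tp = 0
    precisions = []
    for i, hit in enumerate(is_tp, start=1):
        if hit:
            tp += 1
            precisions.append(tp / i)  # precision at each new recall step
    return sum(precisions) / num_gt

# Example: 3 of 4 ranked detections are true positives, 4 ground-truth boxes.
ap = average_precision([True, False, True, True], num_gt=4)
```

mAP then averages this quantity over classes; since SaBi3d annotates only cars, the tables report mAP for the car class alone.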
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).