Submitted:
18 March 2025
Posted:
19 March 2025
Abstract

Keywords:
1. Introduction
- a) Virtual sensors that depend only on data from physical sensors. ESC (Electronic Stability Control) uses physical sensors such as gyroscopes, accelerometers, and wheel speed sensors, together with virtual sensors that estimate the yaw rate and slip angle, allowing the vehicle to maintain control in low-grip conditions or dangerous turns;
- b) Virtual sensors that depend entirely on information from other virtual sensors. In the case of FCW (Forward Collision Warning) and AEB (Automatic Emergency Braking), a virtual sensor is used to predict the trajectory of the vehicle and to evaluate the distance to other vehicles;
- c) Virtual sensors that depend on data from both physical and virtual sensors. This configuration is found in the DMS (Driver Monitoring System), which uses physical sensors such as a video camera and/or pressure sensors in the steering wheel and/or seat, together with virtual sensors that estimate the driver's level of attention and detect the intention to leave the lane.
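The three dependency configurations above can be sketched in code. The following is an illustrative Python sketch, not an implementation from the paper: the sensor names, the track width, and the simple kinematic yaw-rate formula are assumptions chosen purely for demonstration of case (a).

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Sensor:
    """A sensor exposes a read() callable returning its current value."""
    name: str
    read: Callable[[], float]

def make_virtual_sensor(name: str, inputs: List[Sensor],
                        fuse: Callable[[Dict[str, float]], float]) -> Sensor:
    """A virtual sensor estimates a quantity by fusing its input sensors.
    The inputs may be physical sensors (case a), other virtual sensors
    (case b), or a mix of both (case c)."""
    def read() -> float:
        return fuse({s.name: s.read() for s in inputs})
    return Sensor(name, read)

# Case (a): a yaw-rate estimate from physical wheel-speed sensors only.
# Stub readings in m/s; a real system would sample hardware here.
wheel_left = Sensor("wheel_left", lambda: 20.0)
wheel_right = Sensor("wheel_right", lambda: 21.0)
TRACK_WIDTH = 1.6  # m, assumed vehicle track width

yaw_rate = make_virtual_sensor(
    "yaw_rate", [wheel_left, wheel_right],
    lambda v: (v["wheel_right"] - v["wheel_left"]) / TRACK_WIDTH)

print(round(yaw_rate.read(), 3))  # simple kinematic estimate, rad/s
```

A case (c) sensor would simply list both physical sensors and previously built virtual sensors in its `inputs`.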
2. Classification of the Virtual Sensor
2.1. Virtual Sensor Model
- Camera sensors generate synthetic data on the recognition and classification of objects in the surrounding area [30,31,32], as well as on the vehicle's position and orientation relative to nearby objects, and support V2V (Vehicle-to-Vehicle) communication [33] based on the VLC (Visible Light Communication) principle [34]. The advantages of camera sensors include the ability to provide data in real time, low latency in data acquisition and processing, adaptability to extreme lighting conditions (low lighting, bright lighting), accurate estimation of object position and orientation, and low production and implementation costs. The constraints of camera sensors include the need for a direct view of surrounding objects, susceptibility to unexpected changes in lighting conditions, and the need for greater computing capacity due to the large quantities of data generated continuously;
- Radar sensors generate data based on the ToF (Time of Flight) of reflected radio waves when detecting nearby target vehicles [35,36], use ML methods to estimate the current and future positions of nearby vehicles [37], and use DL (Deep Learning) methods to avoid collisions [38]. The benefits of radar sensors include the capacity to provide the location of target vehicles in real time, robustness to severe weather conditions (rain, snow, fog), and low manufacturing and installation costs. The constraints of radar sensors include the requirement for increased computing capacity due to the massive volumes of data generated continuously, as well as a reliance on additional hardware and software systems;
- Lidar sensors generate a point cloud through 2D and 3D laser scanning for real-time localization of static and dynamic objects in proximity [39,40] and apply the YOLO (You Only Look Once) image segmentation technique [41]. The advantages of lidar sensors include the ability to precisely localize static and moving objects in proximity. The disadvantages of lidar sensors include the need for greater computing power due to the large quantity of data generated continuously, sensitivity to bad weather conditions (rain, snow, fog), and high manufacturing and implementation costs.
- Sensor fidelity can be classified as high, medium, or low;
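Both radar and lidar ranging rest on the ToF principle mentioned above: the round-trip travel time of a reflected wave maps directly to target range. A minimal sketch (the function name and the example timing are illustrative, not drawn from any cited tool):

```python
# Round-trip time-of-flight to range, as used conceptually by radar/lidar.
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_to_range(t_round_trip_s: float) -> float:
    """Target range from a round-trip ToF measurement: the wave travels
    to the target and back, so the one-way distance is c*t/2."""
    return C * t_round_trip_s / 2.0

# A reflection arriving 400 ns after emission corresponds to roughly 60 m.
print(round(tof_to_range(400e-9), 2))
```

The same relation underlies ultrasonic parking sensors, with the speed of sound in place of the speed of light.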
Method for collecting information from the environment:
- a) A deterministic strategy, based on the simulation application's mathematical apparatus, which involves the use of a vast volume of input parameters to represent the ideal behavior and response of the virtual sensor as accurately as possible;
- b) A statistical technique, based on statistical distribution functions such as the normal, binomial, Poisson, or exponential distribution;
- c) An electromagnetic field propagation approach, which simulates electromagnetic wave propagation using Maxwell's equations.
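The statistical technique (b) can be illustrated by perturbing an ideal reading with additive Gaussian noise. The measured quantity, noise level, sample count, and seed below are hypothetical values chosen for demonstration:

```python
import random

def simulate_readings(true_value: float, sigma: float, n: int,
                      seed: int = 42) -> list:
    """Statistical sensor model (method b): each reading is the ideal
    value plus additive Gaussian noise drawn from N(0, sigma^2)."""
    rng = random.Random(seed)
    return [true_value + rng.gauss(0.0, sigma) for _ in range(n)]

# Simulate 10,000 noisy readings of an ideal value of 50.0 units.
readings = simulate_readings(true_value=50.0, sigma=0.5, n=10_000)
mean = sum(readings) / len(readings)
print(round(mean, 1))  # the sample mean converges toward the ideal value
```

Swapping `rng.gauss` for another distribution (e.g. `rng.expovariate`) models the other distribution families listed above.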
- The objective of using sensors is to characterize a vehicle's operating mode based on observed metrics and to perform diagnostics using AI-based maintenance techniques that define the smart maintenance regime.
2.1.1. Ideal Sensors
2.1.2. Hi-Fi Sensors
2.1.3. RSI Sensors
2.2. Virtual Vehicle Model
2.3. Virtual Environmental Model
3. Characteristics of the Virtual Sensor
3.1. Characteristics of Ideal Sensor
3.1.1. Slip Angle Sensor
3.1.2. Inertial Sensor
3.1.3. Object Sensor
- Nearest object: the closest visible object that is considered a relevant target;
- Nearest object in the path: the closest object within an interval of the estimated vehicle trajectory;
- Object ID: a name or code used to identify an object;
- Path direction (reference and closest point);
- Relative distance and velocity (between the reference and the nearest positions);
- Relative orientation about the x-y-z axes (at the reference point);
- Sensor frame’s x-y-z distances (between the reference and the nearest point);
- Sensor frame’s x-y-z velocity (between the reference and the nearest point);
- Flag indicating that an object has been detected (in the sensor viewing area);
- Flag indicating that an object has been detected (in the observation area);
- Incidence angles between the sensor beam and the object detected in proximity;
- Width, height, length of the object, and height above the ground.
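Several of the listed outputs, notably the relative distance and velocity between the reference and nearest points, follow directly from the object and ego-vehicle states. A 2D sketch under assumed coordinates (the positions and velocities below are made up for illustration):

```python
import math

def relative_kinematics(p_ref, p_obj, v_ref, v_obj):
    """Relative distance and closing speed between the sensor reference
    point and the nearest point of a detected object (2D sketch)."""
    dx = [o - r for o, r in zip(p_obj, p_ref)]
    dist = math.hypot(*dx)
    dv = [vo - vr for vo, vr in zip(v_obj, v_ref)]
    # Closing speed: negative projection of the relative velocity onto
    # the line of sight (positive when the gap is shrinking).
    closing = -sum(d * v for d, v in zip(dx, dv)) / dist
    return dist, closing

# Ego at the origin moving at 20 m/s; object at (30, 40) m moving at 10 m/s.
dist, closing = relative_kinematics((0, 0), (30, 40), (20, 0), (10, 0))
print(round(dist, 1), round(closing, 2))
```

An ideal object sensor outputs these quantities directly from simulation ground truth; Hi-Fi and RSI models would add noise and detection effects on top.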
3.1.4. Free Space Sensor
3.1.5. Traffic Sign Sensor
3.1.6. Line Sensor
3.1.7. Road Sensor
3.1.8. Object by Line Sensor
3.2. Characteristics of Hi-Fi Sensor
3.2.1. Camera Sensor
3.2.2. Global Navigation Sensor
3.2.3. Radar Sensor
3.3. Characteristics of RSI Sensor
3.3.1. Lidar RSI Sensor
- Diffuse: reflected laser rays are distributed uniformly, regardless of the direction of the incident ray (Lambertian reflection), with the intensity of the reflected ray decreasing with the cosine of the angle between the incident ray and the normal of the reflective surface;
- Retroreflective: incident laser rays are reflected back in the direction of the emitter, with the intensity of the reflected ray reduced according to the reflectance parameters and the incident angle;
- Specular: the incident and reflected laser rays form equal angles with the normal of the reflective surface and lie in the same plane;
- Transmissive: the incident laser photons keep their course but are attenuated by transmission.
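The diffuse case above can be sketched numerically. The cosine fall-off and reflectance scaling follow the standard Lambert cosine model; the specific input values are illustrative, not parameters of any particular lidar RSI model:

```python
import math

def lambertian_intensity(i_incident: float, theta_rad: float,
                         reflectance: float) -> float:
    """Diffuse (Lambertian) reflection: the returned intensity falls off
    with the cosine of the angle between the incident ray and the
    surface normal, scaled by the surface reflectance."""
    return i_incident * reflectance * max(0.0, math.cos(theta_rad))

# Normal incidence returns the full reflectance-scaled intensity;
# oblique incidence returns proportionally less.
print(round(lambertian_intensity(1.0, 0.0, 0.8), 2))               # 0.8
print(round(lambertian_intensity(1.0, math.radians(60), 0.8), 2))  # 0.4
```

Retroreflective and specular surfaces would replace the cosine term with their own angular response functions.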
3.3.2. Ultrasonic RSI Sensor
- Direct Echo: the acoustic pressure wave is reflected once by an object in close proximity and received by the emitting sensor;
- Indirect Echo: the acoustic pressure wave is reflected at least twice by objects or surfaces in the vicinity and received by the emitting sensor;
- Repeated Echo: the emitting sensor receives the same acoustic pressure wave several times after it has been reflected repeatedly between the vehicle and nearby objects or surfaces;
- Cross Echo: the reflected acoustic pressure wave is received by a sensor other than the emitting sensor;
- Road Clutter occurs when the acoustic pressure wave is reflected by bumps or irregularities in the roadway.
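The direct and cross echo modes differ in how a measured ToF maps to geometry, which a short sketch makes concrete (the speed of sound is assumed at 343 m/s for air at roughly 20 °C; the timing values are illustrative):

```python
# Direct vs. cross echo ranging for an ultrasonic array (illustrative).
C_SOUND = 343.0  # assumed speed of sound in air, m/s

def direct_echo_range(t_s: float) -> float:
    """Direct echo: the wave travels out and back to the same sensor,
    so the object distance is c*t/2."""
    return C_SOUND * t_s / 2.0

def cross_echo_path(t_s: float) -> float:
    """Cross echo: the wave is emitted by one sensor and received by
    another, so c*t gives the total path emitter -> object -> receiver,
    which constrains the object to an ellipse with the two sensors as
    its foci."""
    return C_SOUND * t_s

print(round(direct_echo_range(0.01), 3))  # 10 ms round trip -> 1.715 m
```

Combining one direct and one cross echo measurement is what lets a sensor pair localize an object laterally rather than only by range.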
4. Virtual Sensor Parametrization
4.1. Ideal Sensor Parametrization
4.2. Hi-Fi Sensor Parametrization
4.3. RSI Sensor Parametrization
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| ABS | Antilock Braking System |
| ADAS | Advanced Driver Assistance Systems |
| ACC | Adaptive Cruise Control |
| AD | Autonomous Driving |
| AEB | Automatic Emergency Braking |
| AI | Artificial Intelligence |
| AWD | All-Wheel Drive |
| BiSeNet | Bilateral Segmentation Network |
| CNN | Convolutional Neural Network |
| D-GNSS | Differential-Global Navigation Satellite System |
| CPU | Central Processing Unit |
| DL | Deep Learning |
| DMS | Driver Monitoring System |
| DPM | Deformable Part Model |
| DTw | Digital Twin |
| DVM | Double Validation Metric |
| EBA | Emergency Brake Assist |
| ECEF | Earth Centered Earth Fixed |
| EDF | Empirical cumulative Distribution Function |
| EM | Energy Management |
| ESC | Electronic Stability Control |
| FC | Fuel Consumption |
| FCW | Forward Collision Warning |
| FET | Field-Effect Transistor |
| FoV | Field-Of-View |
| GCS | Geographic Coordinate System |
| GDOP | Geometric Dilution of Precision |
| GNN | Global Nearest Neighbor |
| GNSS | Global Navigation Satellite System |
| GPS | Global Positioning System |
| GPU | Graphics Processing Unit |
| HD | High-Definition |
| Hi-Fi | High Fidelity |
| HiL | Hardware-in-the-Loop |
| HV | High Voltage |
| ID | IDentifier |
| ILA | Intelligent Light Assist |
| IMU | Inertial Measurement Unit |
| IoT | Internet of Things |
| LDW | Lane Departure Warning |
| Lidar | Light detection and ranging |
| LK | Lane Keeping |
| LKA | Lane Keeping Assist |
| LV | Low Voltage |
| MEMS | Micro-ElectroMechanical Systems |
| ML | Machine Learning |
| MMS | Mobile Mapping System |
| OTA | Over The Air |
| PA | Park Assist |
| PPHT | Progressive Probabilistic Hough Transform |
| POI | Point Of Interest |
| PS | Physical Sensors |
| PT | Powertrain |
| QTC | Quantum Tunnelling Composite |
| RCS | Radar Cross Section |
| RSI | Raw Signal Interface |
| RTK | Real-Time Kinematic |
| SAS | Smart Airbag System |
| SD | Sign Detection |
| SiL | Software-in-the-Loop |
| SLAM | Simultaneous Localization And Mapping |
| SNR | Signal-to-Noise Ratio |
| SPA | Sound Pressure Amplitude |
| SRTM | Shuttle Radar Topography Mission |
| ToF | Time of Flight |
| TSA | Traffic Sign Assist |
| V2I | Vehicle-to-Infrastructure |
| V2V | Vehicle-to-Vehicle |
| VDM | Vehicle Dynamic Model |
| VLC | Visible Light Communication |
| VS | Virtual Sensors |
| WLD | Wheel Lifting Detection |
| YOLO | You Only Look Once |
References
- Martin, D.; Kühl, N.; Satzger, G. Virtual Sensors. Bus. Inf. Syst. Eng. 2021, 63, 315–323. [CrossRef]
- Dahiya, R.; Ozioko, O.; Cheng, G. Sensory Systems for Robotic Applications, Publisher: MIT Press, Cambridge, Massachusetts, 2022. [CrossRef]
- Šabanovič, E.; Kojis, P.; Šukevičius, Š.; Shyrokau, B.; Ivanov, V.; Dhaens, M.; Skrickij, V. Feasibility of a Neural Network-Based Virtual Sensor for Vehicle Unsprung Mass Relative Velocity Estimation. Sensors 2021, 21, 7139. [CrossRef]
- Persson, J.A.; Bugeja, J.; Davidsson, P.; Holmberg, J.; Kebande, V.R.; Mihailescu, R.-C.; Sarkheyli-Hägele, A.; Tegen, A. The Concept of Interactive Dynamic Intelligent Virtual Sensors (IDIVS): Bridging the Gap between Sensors, Services, and Users through Machine Learning. Appl. Sci. 2023, 13, 6516. [CrossRef]
- Ambarish, P.; Mitradip, B.; Ravinder, D. Solid-State Sensors (IEEE Press Series on Sensors), Publisher: Wiley-IEEE Press, 2023. [CrossRef]
- Shin, H.; Kwak, Y. Enhancing digital twin efficiency in indoor environments: Virtual sensor-driven optimization of physical sensor combinations, Automat. Constr. 2024, 161, 105326. [CrossRef]
- Stanley, M.; Lee, J. Sensor Analysis for the Internet of Things, Publisher: Morgan & Claypool Publishers, Arizona State University, 2018.
- Stetter, R. A Fuzzy Virtual Actuator for Automated Guided Vehicles. Sensors 2020, 20, 4154. [CrossRef]
- Xie, J.; Yang, R.; Gooi, H.B.; Nguyen, H. PID-based CNN-LSTM for accuracy-boosted virtual sensor in battery thermal management system, Appl. Energ. 2023, 331, 120424. [CrossRef]
- Fabiocchi, D.; Giulietti, N.; Carnevale, M.; Giberti, H. AI-Driven Virtual Sensors for Real-Time Dynamic Analysis of Mechanisms: A Feasibility Study. Machines 2024, 12, 257. [CrossRef]
- Kabadayi, S.; Pridgen, A.; Julien, C. Virtual sensors: Abstracting data from physical sensors. In IEEE International Symposium on a World of Wireless, Mobile and Multimedia Networks, United States, Buffalo-Niagara Falls, NY, 26.06.2006-29.06.2006 (26 June 2006). [CrossRef]
- Compredict, Available online: https://compredict.ai/virtual-sensors-replacing-vehicle-hardware-sensors/ (Accessed February, 6 2025).
- Prokhorov, D. Virtual Sensors and Their Automotive Applications, In 2005 International Conference on Intelligent Sensors, Sensor Networks and Information Processing, Melbourne, VIC, Australia, 05-08 December 2005. [CrossRef]
- Forssell, U.; Ahlqvist, S.; Persson, N.; Gustafsson, F. Virtual Sensors for Vehicle Dynamics Applications. In: Krueger, S., Gessner, W. (eds) Advanced Microsystems for Automotive Applications 2001. VDI-Buch. Springer, Berlin, Heidelberg. [CrossRef]
- Hu, X.H.; Cao, L.; Luo, Y.; Chen, A.; Zhang, E.; Zhang, W. A Novel Methodology for Comprehensive Modeling of the Kinetic Behavior of Steerable Catheters. In IEEE/ASME Transactions on Mechatronics, August 2019. [CrossRef]
- Cummins, Available online: https://www.cummins.com/news/2024/03/18/virtual-sensors-and-their-role-energy-future (Accessed February, 6 2025).
- Bucaioni, A.; Pelliccione, P.; Mubeen, S. Modelling centralised automotive E/E software architectures, Adv. Eng. Inform. 2024, 59, 102289. [CrossRef]
- Zhang, Q.; Shen, S.; Li, H.; Cao, W.; Tang, W.; Jiang, J.; Deng, M.; Zhang, Y.; Gu, B.; Wu, K.; Zhang, K.; Liu, S. Digital twin-driven intelligent production line for automotive MEMS pressure sensors, Adv. Eng. Inform. 2022, 54, 101779. [CrossRef]
- Ida, N. Sensors, Actuators, and Their Interfaces: A multidisciplinary introduction, 2nd Ed. Publisher: The Institution of Engineering and Technology, 2020. [CrossRef]
- Masti, D.; Bernardini, D.; Bemporad, A. A machine-learning approach to synthesize virtual sensors for parameter-varying systems, Eur. J. Control. 2021, 61, 40-49. [CrossRef]
- Paepae, T.; Bokoro, P.N.; Kyamakya, K. From fully physical to virtual sensing for water quality assessment: A comprehensive review of the relevant state-of-the-art. Sensors 2021, 21(21), 6971. [CrossRef]
- Tihanyi, V.; Tettamanti, T.; Csonthó, M.; Eichberger, A.; Ficzere, D.; Gangel, K.; Hörmann, L.B.; Klaffenböck, M.A.; Knauder, C.; Luley, P.; et al. Motorway Measurement Campaign to Support R&D Activities in the Field of Automated Driving Technologies. Sensors 2021, 21(6), 2169. [CrossRef]
- Tactile Mobility. Available online: https://tactilemobility.com/ (Accessed February, 6 2025).
- Compredict-Virtual Sensor Platform. Available online: https://compredict.ai/virtual-sensor-platform/ (Accessed February, 6 2025).
- Mordor Intelligence. Available online: https://www.mordorintelligence.com/industry-reports/virtual-sensors-market (Accessed February, 6 2025).
- Iclodean, C.; Varga, B.O.; Cordoș, N. Autonomous Driving Technical Characteristics. In: Autonomous Vehicles for Public Transportation, Green Energy and Technology, Publisher: Springer, 2022, pp. 15-68. [CrossRef]
- SAE. Available online: https://www.sae.org/standards/content/j3016_202104/ (Accessed February, 6 2025).
- Muhovič, J.; Perš, J. Correcting Decalibration of Stereo Cameras in Self-Driving Vehicles. Sensors 2020, 20, 3241. [CrossRef]
- Hamidaoui, M.; Talhaoui, M.Z.; Li, M.; Midoun, M.A.; Haouassi, S.; Mekkaoui, D.E.; Smaili, A.; Cherraf, A.; Benyoub, F.Z. Survey of Autonomous Vehicles’ Collision Avoidance Algorithms. Sensors 2025, 25, 395. [CrossRef]
- Cabon, Y.; Murray, N.; Humenberger, M. Virtual KITTI 2. arXiv e-prints 2020, Art. no. arXiv:2001.10773. [CrossRef]
- Mallik, A.; Gaopande, M.L.; Singh, G.; Ravindran, A.; Iqbal, Z.; Chao, S.; Revalla, H.; Nagasamy, V. Real-time Detection and Avoidance of Obstacles in the Path of Autonomous Vehicles Using Monocular RGB Camera. SAE Int. J. Adv. Curr. Pract. Mobil. 2022, 5, 622–632. [CrossRef]
- Zhe, T.; Huang, L.; Wu, Q.; Zhang, J.; Pei, C.; Li, L. Inter-Vehicle Distance Estimation Method Based on Monocular Vision Using 3D Detection. IEEE Trans. Veh. Technol. 2020, 69, 4907–4919. doi.org/10.1109/tvt.2020.2977623.
- Rill, R.A.; Faragó, K.B. Collision avoidance using deep learning-based monocular vision. SN Comput. Sci. 2021, 2, 375. [CrossRef]
- He, J.; Tang, K.; He, J.; Shi, J. Effective vehicle-to-vehicle positioning method using monocular camera based on VLC. Opt. Express 2020, 28, 4433–4443. [CrossRef]
- Choi, W.Y.; Yang, J.H.; Chung, C.C. Data-Driven Object Vehicle Estimation by Radar Accuracy Modeling with Weighted Interpolation. Sensors 2021, 21, 2317. [CrossRef]
- Muckenhuber, S.; Museljic, E.; Stettinger, G. Performance evaluation of a state-of-the-art automotive radar and corresponding modeling approaches based on a large labeled dataset. J. Intell. Transport. S. 2022, 26, 655–674. [CrossRef]
- Sohail, M.; Khan, A.U.; Sandhu, M.; Shoukat, I.A.; Jafri, M.; Shin, H. Radar sensor based Machine Learning approach for precise vehicle position estimation. Sci. Rep. 2023, 13, 13837. [CrossRef]
- Srivastav, A.; Mandal, S. Radars for autonomous driving: A review of deep learning methods and challenges. IEEE Access 2023, 11, 97147–97168. [CrossRef]
- Poulose, A.; Baek, M.; Han, D.S. Point cloud map generation and localization for autonomous vehicles using 3D lidar scans. In Proceedings of the 2022 27th Asia Pacific Conference on Communications (APCC), Jeju, Republic of Korea, 19–21 October 2022; pp. 336–341. [CrossRef]
- Saha, A.; Dhara, B.C. 3D LiDAR-based obstacle detection and tracking for autonomous navigation in dynamic environments. Int. J. Intell. Robot. Appl. 2024, 8, 39–60. [CrossRef]
- Dazlee, N.M.A.A.; Khalil, S.A.; Rahman, S.A.; Mutalib, S. Object detection for autonomous vehicles with sensor-based technology using YOLO. Int. J. Intell. Syst. Appl. Eng. 2022, 10, 129–134. [CrossRef]
- Guan, L.; Chen, Y.; Wang, G.; Lei, X. Real-time vehicle detection framework based on the fusion of LiDAR and camera. Electronics 2020, 9, 451. [CrossRef]
- Kotur, M.; Lukić, N.; Krunić, M.; Lukač, Ž. Camera and LiDAR sensor fusion for 3d object tracking in a collision avoidance system. In Proceedings of the 2021 Zooming Innovation in Consumer Technologies Conference (ZINC), Novi Sad, Serbia, 26–27 May 2021; pp. 198–202. [CrossRef]
- Choi, W.Y.; Kang, C.M.; Lee, S.H.; Chung, C.C. Radar accuracy modeling and its application to object vehicle tracking. Int. J. Control. Autom. Syst. 2020, 18, 3146–3158. [CrossRef]
- Simcenter. Available online: https://blogs.sw.siemens.com/simcenter/the-sense-of-virtual-sensors/ (Accessed February, 6 2025).
- Kim, J.; Kim, Y.; Kum, D. Low-level sensor fusion network for 3D vehicle detection using radar range-azimuth heatmap and monocular image. In Proceedings of the Asian Conference on Computer Vision, Kyoto, Japan, 30 November–4 December 2020. [CrossRef]
- Lim, S.; Jung, J.; Lee, B.H.; Choi, J.; Kim, S.C. Radar sensor-based estimation of vehicle orientation for autonomous driving. IEEE Sensors J. 2022, 22, 21924–21932. [CrossRef]
- Caesar, H.; Bankiti, V.; Lang, A.H.; Vora, S.; Liong, V.E.; Xu, Q.; Krishnan, A.; Pan, Y.; Baldan, G.; Beijbom, O. nuScenes: A multimodal dataset for autonomous driving. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020, pp. 11621–11631. [CrossRef]
- Robsrud, D.N.; Øvsthus, Ø.; Muggerud, L.; Amendola, J.; Cenkeramaddi, L.R.; Tyapin, I.; Jha, A. Lidar-mmW Radar Fusion for Safer UGV Autonomous Navigation with Collision Avoidance. In Proceedings of the 2023 11th International Conference on Control, Mechatronics and Automation (ICCMA), Grimstad, Norway, 1–3 November 2023; pp. 189–194. [CrossRef]
- Wang, Y.; Jiang, Z.; Gao, X.; Hwang, J.N.; Xing, G.; Liu, H. RODnet: Radar object detection using cross-modal supervision. In Proceedings of the IEEE/CVF Winter Conference on Applications of Computer Vision, Virtual, 5–9 January 2021; pp. 504–513. [CrossRef]
- Rövid, A.; Remeli, V.; Paufler, N.; Lengyel, H.; Zöldy, M.; Szalay, Z. Towards Reliable Multisensory Perception and Its Automotive Applications. Period. Polytech. Transp. Eng. 2020, 48(4), 334-340. [CrossRef]
- IPG, CarMaker. Available online: https://www.ipg-automotive.com/en/products-solutions/software/carmaker/ (Accessed February, 6 2025).
- Yeong, D.J.; Velasco-Hernandez, G.; Barry, J.; Walsh, J. Sensor and Sensor Fusion Technology in Autonomous Vehicles: A Review. Sensors 2021, 21, 2140. [CrossRef]
- Liu, X.; Baiocchi, O. A comparison of the definitions for smart sensors, smart objects and Things in IoT. In 2016 IEEE 7th Annual Information Technology, Electronics and Mobile Communication Conference (IEMCON), Vancouver, BC, 2016, pp. 1-4. [CrossRef]
- Peinado-Asensi, I.; Montés, N.; García, E. Virtual Sensor of Gravity Centres for Real-Time Condition Monitoring of an Industrial Stamping Press in the Automotive Industry. Sensors 2023, 23, 6569. [CrossRef]
- Stetter, R.; Witczak, M.; Pazera, M. Virtual Diagnostic Sensors Design for an Automated Guided Vehicle. Appl. Sci. 2018, 8, 702. [CrossRef]
- Lengyel, H.; Maral, S.; Kerebekov, S.; Szalay, Z.; Török, Á. Modelling and simulating automated vehicular functions in critical situations—application of a novel accident reconstruction concept. Vehicles 2023, 5(1), 266-285. [CrossRef]
- Dörr, D. Using Virtualization to Accelerate the Development of ADAS & Automated Driving Functions. IPG Automotive, GTC Europe München, 28 September 2017.
- Kim, J.; Park, S.; Kim, J.; Yoo, J. A Deep Reinforcement Learning Strategy for Surrounding Vehicles-Based Lane-Keeping Control. Sensors 2023, 23, 9843. [CrossRef]
- Pannagger, P.; Nilac, D.; Orucevic, F.; Eichberger, A.; Rogic, B. Advanced Lane Detection Model for the Virtual Development of Highly Automated Functions. arXiv:2104.07481, 2021. [CrossRef]
- IPG Guide-User’s Guide Version 12.0.1 CarMaker, IPG Automotive, 2023.
- Iclodean, C.; Varga, B.O.; Cordoș, N. Virtual Model. In: Autonomous Vehicles for Public Transportation, Green Energy and Technology, Publisher: Springer, 2022, pp. 195-335. [CrossRef]
- Schäferle, S. Choosing the correct sensor model for your application. IPG Automotive 2019. Available online: https://www.ipg-automotive.com/uploads/tx_pbfaqtickets/files/98/SensorModelLevels.pdf (Accessed February, 6 2025).
- Magosi, Z.F.; Wellershaus, C.; Tihanyi, V.R.; Luley, P.; Eichberger, A. Evaluation Methodology for Physical Radar Perception Sensor Models Based on On-Road Measurements for the Testing and Validation of Automated Driving. Energies 2022, 15, 2545. [CrossRef]
- Reference Manual Version 12.0.1 CarMaker, IPG Automotive, 2023.
- Iclodean, C. Introducere în sistemele autovehiculelor, Publisher: Risoprint, Romania, 2023.
- Renard, D.; Saddem, R.; Annebicque, D.; Riera, B. From Sensors to Digital Twins toward an Iterative Approach for Existing Manufacturing Systems. Sensors 2024, 24, 1434. [CrossRef]
- Brucherseifer, E.; Winter, H.; Mentges, A.; Mühlhäuser, M.; Hellmann, M. Digital Twin conceptual framework for improving critical infrastructure resilience. at-Automatisierungstechnik 2021, 69(12), 1062-1080. [CrossRef]
- Grieves, M.; Vickers, J. Digital twin: Mitigating unpredictable, undesirable emergent behavior in complex systems. In Transdisciplinary perspectives on complex systems: New findings and approaches, Publisher: Springer, pp. 85-113. [CrossRef]
- Kritzinger, W.; Karner, M.; Traar, G.; Henjes, J.; Sihn, W. Digital Twin in manufacturing: A categorical literature review and classification. Ifac-PapersOnline 2018, 51(11), 1016-1022. [CrossRef]
- Shoukat, M.U.; Yan, L.; Yan, Y.; Zhang, F.; Zhai, Y.; Han, P.; Nawaz, S.A.; Raza, M.A.; Akbar, M.W.; Hussain, A. Autonomous driving test system under hybrid reality: The role of digital twin technology. Internet Things 2024, 27, 101301. [CrossRef]
- Iclodean, C.; Cordos, N.; Varga, B.O. Autonomous Shuttle Bus for Public Transportation: A Review. Energies 2020, 13, 2917. [CrossRef]
- Navya - Brochure-Autonom-Shuttle-Evo. Available online: https://navya.tech/wp-content/uploads/documents/Brochure-Autonom-Shuttle-Evo-EN.pdf (Accessed February, 6 2025).
- Navya - Self-Driving Shuttle for Passenger Transportation. Available online: https://www.navya.tech/en/solutions/moving-people/self-driving-shuttle-for-passenger-transportation/ (Accessed February, 6 2025).
- Patentimage. Available online: https://patentimages.storage.googleapis.com/12/0f/d1/33f8d2096f49f6/US20180095473A1.pdf (Accessed February, 6 2025).
- AVENUE Autonomous Vehicles to Evolve to a New Urban Experience Report. Available online: https://h2020-avenue.eu/wp-content/uploads/2023/03/Keolis-LyonH2020-AVENUE_Deliverable_7.6_V2-not-approved.pdf (Accessed February, 6 2025).
- EarthData Search. Available online: https://search.earthdata.nasa.gov/search?q=SRTM (Accessed February, 6 2025).
- GpsPrune. Available online: https://activityworkshop.net/software/gpsprune/download.html (Accessed February, 6 2025).
- InfoFile Description Version 12.0.1 IPGRoad, IPG Automotive, 2023.
- User Manual Version 12.0.1 IPGDriver, IPG Automotive, 2023.
- Piyabongkarn, D.N.; Rajamani, R.; Grogg, J.A.; Lew, J.Y. Development and Experimental Evaluation of a Slip Angle Estimator for Vehicle Stability Control. IEEE Trans. Control. Syst. Technol. 2009, 17, 78-88. [CrossRef]
- CarMaker Reference Manual 12.0.2 CarMaker, IPG Automotive, 2023.
- Pacejka, H.B. Tyre and Vehicle Dynamics. 2nd Edition. Publisher: Elsevier’s Science and Technology, 2006.
- Salminen, H. Parametrizing tyre wear using a brush tyre model. Master Thesis, Royal Institute of Technology, Stockholm, Sweden, 15 December 2014. https://kth.diva-portal.org/smash/get/diva2:802101/FULLTEXT01.pdf.
- Pacejka, H.B.; Besselink, I.J.M. Magic Formula Tyre Model with Transient Properties. Veh. Syst. Dyn. 1997, 27(sup001), 234–249. [CrossRef]
- Pacejka, H.B. Chapter 4 - Semi-Empirical Tire Models. In Tire and Vehicle Dynamics (Third Edition); Editor Pacejka, H.B.; Butterworth-Heinemann, 2012, pp. 149-209. [CrossRef]
- Guo, Q.; Xu, Z.; Wu, Q.; Duan, J. The Application of in-the-Loop Design Method for Controller. In 2nd IEEE Conference on Industrial Electronics and Applications, Harbin, China, 23-25 May 2007, pp. 78-81. [CrossRef]
- Chen, T.; Chen, L.; Xu, X.; Cai, Y.; Jiang, H.; Sun, X. Sideslip Angle Fusion Estimation Method of an Autonomous Electric Vehicle Based on Robust Cubature Kalman Filter with Redundant Measurement Information. World Electr. Veh. J. 2019, 10, 34. [CrossRef]
- Jin, L.; Xie, X.; Shen. C.; Wang, F.; Wang, F; Ji, S.; Guan, X.; Xu, J. Study on electronic stability program control strategy based on the fuzzy logical and genetic optimization method. Adv. Mech. Eng. 2017, 9(5), 1-13. [CrossRef]
- Zhao, Z.; Chen, H.; Yang, J.; Wu, X.; Yu, Z. Estimation of the vehicle speed in the driving mode for a hybrid electric car based on an unscented Kalman filter. Proc. Inst. Mech. Eng. Part D J. Automob. Eng. 2014, 229(4), 437-456. [CrossRef]
- Li, Q.; Chen, L.; Li, M.; Shaw, S.-L.; Nuchter, A. A Sensor-Fusion Drivable-Region and Lane-Detection System for Autonomous Vehicle Navigation in Challenging Road Scenarios. IEEE Trans. Veh. Technol. 2013, 63(2), 540-555. [CrossRef]
- Rana, M.M. Attack Resilient Wireless Sensor Networks for Smart Electric Vehicles. IEEE Sens. Lett. 2017, 1(2), 5500204. [CrossRef]
- Xia, X.; Xiong, L.; Huang, Y.; Lu, Y.; Gao, L.; Xu, N.; Yu, Z. Estimation on IMU yaw misalignment by fusing information of automotive onboard sensors. Mech. Syst. Signal Process. 2022, 162, 107993. [CrossRef]
- Sieberg, P.M.; Schramm, D. Ensuring the Reliability of Virtual Sensors Based on Artificial Intelligence within Vehicle Dynamics Control Systems. Sensors 2022, 22, 3513. [CrossRef]
- Xiong, L.; Xia, X.; Lu, Y.; Liu, W.; Gao, L.; Song, S.; Han, Y.; Yu, Z. IMU-Based Automated Vehicle Slip Angle and Attitude Estimation Aided by Vehicle Dynamics. Sensors 2019, 19, 1930. [CrossRef]
- Ess, A.; Schindler, K.; Leibe, B.; Van Gool, L. Object detection and tracking for autonomous navigation in dynamic environments. Int. J. Robot. Res. 2010, 29, 1707-1725. [CrossRef]
- Bewley, A.; Ge, Z.; Ott, L.; Ramos, F.; Upcroft, B. Simple online and realtime tracking. In 2016 IEEE International Conference on Image Processing (ICIP), Phoenix, AZ, USA, 25-28 September 2016. [CrossRef]
- Banerjee, S.; Serra, J.G.; Chopp, H.H.; Cossairt, O.; Katsaggelos, A.K. An Adaptive Video Acquisition Scheme for Object Tracking. In 27th European Signal Processing Conference (EUSIPCO), A Coruna, Spain, 02-06 September 2019. [CrossRef]
- Ning, C.; Menglu, L.; Hao, Y.; Xueping, S.; Yunhong, L. Survey of pedestrian detection with occlusion. Complex Intell. Syst. 2021, 7, 577–587. [CrossRef]
- Liu, Z.; Chen, W.; Wu, X. Salient region detection using high level feature. In 13th International Conference on Control Automation Robotics & Vision (ICARCV), Singapore, 10-12 December 2014. [CrossRef]
- Felzenszwalb, P.; Girshick, R.; McAllester, D.; Ramanan, D. Visual object detection with deformable part models. Commun. ACM. 2013, 56(9), 97-105. [CrossRef]
- Kato, S.; Takeuchi, E.; Ishiguro, Y.; Ninomiya, Y.; Takeda, K.; Hamada, T. An Open Approach to Autonomous Vehicles. IEEE Micro 2015, 35(6), 60-68. [CrossRef]
- Broggi, A.; Cattani, S.; Patander, M.; Sabbatelli, M.; Zani, P. A full-3D voxel-based dynamic obstacle detection for urban scenario using stereo vision. In 16th International IEEE Conference on Intelligent Transportation Systems (ITSC 2013), The Hague, Netherlands, 06-09 October 2013, pp. 71–76. [CrossRef]
- Patra, S.; Maheshwari, P.; Yadav, S.; Arora, C.; Banerjee, S. A Joint 3D-2D based Method for Free Space Detection on Roads. arXiv:1711.02144 2018. [CrossRef]
- Vitor, G.B.; Lima, D.A.; Victorino, A.C.; Ferreira, J.V. A 2D/3D vision based approach applied to road detection in urban environments. In IEEE Intelligent Vehicles Symposium (IV 2013), Australia, Jun 2013, pp.952-957.
- Heinz, L. CarMaker Tips & Tricks No. 3-011 Detect Traffic Lights. IPG Automotive, 2019. Available online: https://www.ipg-automotive.com/uploads/tx_pbfaqtickets/files/100/DetectTrafficLights.pdf (Accessed February, 6 2025).
- Zhang, P.; Zhang, M.; Liu, J. Real-time HD map change detection for crowdsourcing update based on mid-to-high-end sensors. Sensors 2021, 21, 2477. [CrossRef]
- Bahlmann, C.; Zhu, Y.; Ramesh, V.; Pellkofer, M.; Koehler, T. A System for Traffic Sign Detection, Tracking, and Recognition Using Color, Shape, and Motion Information. In IEEE Proceedings of Intelligent Vehicles Symposium, Las Vegas, 6-8 June 2005, 255-260. [CrossRef]
- Fazekas, Z.; Gerencsér, L.; Gáspár, P. Detecting Change between Urban Road Environments along a Route Based on Static Road Object Occurrences. Appl. Sci. 2021, 11, 3666. [CrossRef]
- Liu, C.; Tao, Y.; Liang, J.; Li, K.; Chen, Y. Object detection based on YOLO network. In Proceedings of the 2018 IEEE 4th Information Technology and Mechatronics Engineering Conference (ITOEC), Chongqing, China, 14-16 December 2018, pp. 799-803. [CrossRef]
- Nuthong, C.; Charoenpong, T. Lane Detection using Smoothing Spline. In 3rd International Congress on Image and Signal Processing, Yantai, China, 16-18 October 2010, pp. 989-993. [CrossRef]
- Dou, J.; Li; J. Robust object detection based on deformable part model and improved scale invariant feature transform. Optik 2013, 124(24), 6485-6492. [CrossRef]
- Lindenmaier, L.; Aradi, S.; Bécsi, T.; Törő, O.; Gáspár, P. Object-Level Data-Driven Sensor Simulation for Automotive Environment Perception. IEEE Trans. Intell. Veh. 2023, 8(10), 4341-4356. [CrossRef]
- Bird, J.; Bird, J. Higher Engineering Mathematics, 5th edition; London, Routledge, 2006. [CrossRef]
- Ainsalu, J.; Arffman, V.; Bellone, M.; Ellner, M.; Haapamäki, T.; Haavisto, N.; Josefson, E.; Ismailogullari, A.; Lee, B.; Madland, O.; et al. State of the Art of Automated Buses. Sustainability 2018, 10, 3118. [CrossRef]
- Lian, H.; Li, M.; Li, T.; Zhang, Y.; Shi, Y.; Fan, Y.; Yang, W.; Jiang, H.; Zhou, P.; Wu, H. Vehicle speed measurement method using monocular cameras. Sci. Rep. 2025, 15, 2755 https://doi.org/10.1038/s41598-025-87077-6.
- Vivacqua, R.; Vassallo, R.; Martins, F. A Low Cost Sensors Approach for Accurate Vehicle Localization and Autonomous Driving Application. Sensors 2017, 17, 2359. [CrossRef]
- Xue, L.; Li, M.; Fan, L.; Sun, A.; Gao, T. Monocular Vision Ranging and Camera Focal Length Calibration. Sci. Program. 2021, 2021, 979111. [CrossRef]
- Rosique, F.; Navarro, P.J.; Fernández, C.; Padilla, A. A Systematic Review of Perception System and Simulators for Autonomous Vehicles Research. Sensors 2019, 19, 648. [CrossRef]
- Elster, L.; Staab, J.P.; Peters, S. Making Automotive Radar Sensor Validation Measurements Comparable. Appl. Sci. 2023, 13, 11405. [CrossRef]
- Roy, C.J.; Balch, M.S. A Holistic Approach to Uncertainty Quantification with Application to Supersonic Nozzle Thrust. Int. J. Uncertain. Quantif. 2012, 2, 363–381. [CrossRef]
- Magosi, Z.F.; Eichberger, A. A Novel Approach for Simulation of Automotive Radar Sensors Designed for Systematic Support of Vehicle Development. Sensors 2023, 23, 3227. [CrossRef]
- Maier, M.; Makkapati, V.P.; Horn, M. Adapting Phong into a Simulation for Stimulation of Automotive Radar Sensors. In Proceedings of the 2018 IEEE MTT-S International Conference on Microwaves for Intelligent Mobility (ICMIM), Munich, Germany, 15–17 April 2018; pp. 1–4. [CrossRef]
- Minin, I.V.; Minin, O.V. Lens Candidates to Antenna Array. In Basic Principles of Fresnel Antenna Arrays; Lecture Notes in Electrical Engineering; Springer: Berlin/Heidelberg, Germany, 2008; Volume 19, pp. 71–127. [CrossRef]
- Sensor Partners. LiDAR laser: what is LiDAR and how does it work? Available online (accessed on 6 March 2025).
- García-Gómez, P.; Royo, S.; Rodrigo, N.; Casas, J.R. Geometric Model and Calibration Method for a Solid-State LiDAR. Sensors 2020, 20, 2898. https://doi.org/10.3390/s20102898.
- Kim, G. Performance Index for Extrinsic Calibration of LiDAR and Motion Sensor for Mapping and Localization. Sensors 2022, 22, 106. [CrossRef]
- Schmoll, L.; Kemper, H.; Hagenmüller, S.; Brown, C.L. Validation of an Ultrasonic Sensor Model for Application in a Simulation Platform. ATZelectronics Worldwide 2024, 19, 8–13. https://link.springer.com/content/pdf/10.1007/s38314-024-1853-5.pdf.
- Sen, S.; Husom, E.J.; Goknil, A.; Tverdal, S.; Nguyen, P. Uncertainty-Aware Virtual Sensors for Cyber-Physical Systems. IEEE Software 2024, 41, 77–87. [CrossRef]
- Ying, Z.; Wang, Y.; He, Y.; Wang, J. Virtual Sensing Techniques for Nonlinear Dynamic Processes Using Weighted Probability Dynamic Dual-Latent Variable Model and Its Industrial Applications. Knowl.-Based Syst. 2022, 235, 107642. [CrossRef]
- Yuan, X.; Rao, J.; Wang, Y.; Ye, L.; Wang, K. Virtual Sensor Modeling for Nonlinear Dynamic Processes Based on Local Weighted PSFA. IEEE Sens. J. 2022, 22, 20655–20664. [CrossRef]
- Zheng, T. Algorithmic Sensing: A Joint Sensing and Learning Perspective. In Proceedings of the 21st Annual International Conference on Mobile Systems, Applications and Services; Association for Computing Machinery: New York, NY, USA, 18 June 2023; pp. 624–626. [CrossRef]
- Es-haghi, M.S.; Anitescu, C.; Rabczuk, T. Methods for Enabling Real-Time Analysis in Digital Twins: A Literature Review. Comput. Struct. 2024, 297, 107342. [CrossRef]
| Sensor | Level 1 (2012) | Level 2 (2016) | Level 3 (2018) | Level 4 (2020) | Level 5 (estimated by 2030) |
|---|---|---|---|---|---|
| Ultrasonic | 4 | 8 | 8 | 8 | 10 |
| Radar Long Range | 1 | 1 | 2 | 2 | 2 |
| Radar Short Range | 2 | 4 | 4 | 4 | 4 |
| Camera mono | 1 | 4 | 2 | 3 | 3 |
| Camera stereo | - | - | 1 | 1 | 2 |
| Infra-Red | - | - | 1 | 1 | 2 |
| Lidar 2D/3D | - | - | 1 | 4 | 4 |
| Global Navigation | - | - | 1 | 1 | 1 |
| Total units | 8 | 17 | 20 | 24 | 28 |
| Application | Model | Sensor | Description | Based on |
|---|---|---|---|---|
| Vehicle dynamics | Ideal | Slip Angle | Information about the vehicle's side slip angle | CPU |
| Vehicle dynamics | Ideal | Inertial | Information about inertial body movements | CPU |
| ADAS | Ideal | Object | Detect objects defined as traffic objects | CPU |
| ADAS | Ideal | Free Space | Detect free and occupied spaces between objects defined as traffic objects | CPU |
| ADAS | Ideal | Traffic Sign | Detect traffic signs along the road | CPU |
| ADAS | Ideal | Line | Detect lane lines and other road markings | CPU |
| ADAS | Ideal | Road | Provide road information as digital data | CPU |
| ADAS | Ideal | Collision | Detect contacts of the vehicle with other traffic objects | CPU |
| ADAS | Ideal | Object-by-Line | Detect traffic objects moving along selected road lanes | CPU |
| ADAS | Hi-Fi | Camera | Detect objects defined as traffic objects, traffic signs and traffic lights | CPU |
| ADAS | Hi-Fi | Global Navigation | Simulate GPS (Global Positioning System) satellites and their visibility for the vehicles | CPU |
| ADAS | Hi-Fi | Radar | Detect objects defined as traffic objects based on the SNR (Signal-to-Noise Ratio) | CPU |
| ADAS | RSI | Ultrasonic RSI | Simulate the propagation of sound pressure waves through the virtual environment | GPU |
| ADAS | RSI | Lidar RSI | Simulate the propagation of laser light pulses through the virtual environment | GPU |
| Function | LK | LDW | AD | SD | EM | FC | WLD | PT |
|---|---|---|---|---|---|---|---|---|
| Road curvature | ||||||||
| Longitudinal/lateral slope | ||||||||
| Deviation angle/distance | ||||||||
| Lane information | ||||||||
| Road point position | ||||||||
| Road marker attributes |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
