Evaluation and Testing Platform for Automotive LiDAR Sensors

Abstract: The world is facing a great technological transformation towards fully autonomous vehicles, where optimists predict that by 2030, autonomous vehicles will be sufficiently reliable, affordable, and common to displace most human driving. To cope with these trends, reliable perception systems must enable vehicles to hear and see all their surroundings, with light detection and ranging (LiDAR) sensors being a key instrument for recreating a 3D visualization of the world in real time. However, perception systems must rely on accurate measurements of the environment. Thus, sensors must be calibrated and benchmarked before being placed on the market or assembled in a car. This article presents an Evaluation and Testing Platform for Automotive LiDAR sensors, with the main goal of testing not only commercially available sensors, but also sensor prototypes currently under development in the Bosch Automotive Electronics division. The testing system can benchmark any LiDAR sensor under different conditions, recreating the expected driving environment to which such devices are normally subjected. To characterize and validate the sensor under test, the platform evaluates several parameters, such as the field of view (FoV), the angular resolution, and the sensor's range. This project results from a partnership between the University of Minho and Bosch Car Multimedia Portugal, S.A.

To cope with these revolutionary trends, new solutions at the sensor level must be created to give vehicles the ability to hear and see the surrounding environment. An autonomous vehicle requires reliable sensors to recreate an accurate mapping of the surroundings, which is only possible with multi-sensor perception systems relying on a combination of Radar, Cameras, and light detection and ranging (LiDAR) sensors [7][8][9][10], as illustrated in Figure 1.

LiDAR sensors are emerging as the state-of-the-art technology for perception systems, since they enable a true 3D visualization of the surroundings through a real-time point cloud representation [11][12][13]. Accurate and precise measurements of the surroundings with a LiDAR can assist the perception system in several tasks [9], e.g., obstacle, object, and vehicle detection [14][15][16]; pedestrian recognition and tracking [17,18]; ground segmentation for road filtering [19]; among others [20]. The advances around LiDAR keep improving its measuring and imaging architectures [12,21,22]. Nonetheless, the measurements and the 3D point cloud of a LiDAR sensor can always be corrupted by several noise sources, e.g., internal components [23], mutual interference [24,25], reflectivity issues [26], light [11], and adverse weather conditions [10,27,28], among others [29], making it compulsory to test and analyze all of a sensor's characteristics before it is placed on the market or assembled in a car.

In a high-level overview, a LiDAR system is composed of two main components, a light Emitter (laser) and a Receiver (light detector), as depicted in Figure 2. The laser emits short light pulses with a well-defined time interval (a few to several hundred nanoseconds) and with specific spectral properties into the optical steering system. By regulating the mirrors' angles, the system controls the direction of the light vertically and horizontally, providing multiple-angle detection with just a single beam. Additionally, the optical properties of the beam can be changed by the lens system in order to achieve better performance ratios, e.g., with signal modulation schemes [22,30]. After hitting an object, the signal is reflected back to the sensor, where the receiver collects the photons; it is followed by a subsystem that, depending on the application, filters and selects specific wavelengths or polarizations. The receiver system is also responsible for converting the optical signal into an electrical one, whose intensity is stored in a computing unit.

The Field of View (FoV) is one of the metrics that particularly defines the maximum angle at which a LiDAR sensor is able to detect objects, as shown in Figure 3.

• The Background Light test evaluates the sensor under external illumination, which can be particularly challenging due to solar radiation being a powerful light source present in a wide range of wavelengths [31]. Therefore, it is important to evaluate the sensor's output when exposed to background light in a controlled environment.

• The Power Consumption test aims at monitoring and analyzing the sensor's power consumption.

For the Maximum Range test, the level of the returning signal can be modeled as in Equation (1):

$$S = A \cdot \frac{R_{lab}}{r_{lab}^{2}} \quad (1)$$

where $A$ is a constant, $R_{lab}$ is the target's reflectivity, and $r_{lab}^{2}$ is the square of the target's distance.
If the required minimum level for the returning signal remains the same regardless of the target's reflectivity, then $A \cdot R_{lab}/r_{lab}^{2} = A \cdot R_{sim}/r_{sim}^{2}$, and the maximum distance (for any reflectivity value) can be calculated with Equation (2), where $R_{sim}$ is the target reflectivity to be simulated and $r_{sim}$ is the corresponding target distance calculated for the new reflectivity level:

$$r_{sim} = r_{lab} \cdot \sqrt{\frac{R_{sim}}{R_{lab}}} \quad (2)$$
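As a numeric sketch of this relation, Equation (2) can be applied as follows; the laboratory values below are illustrative only, not measurements from the platform:

```python
import math

def max_range_for_reflectivity(r_lab: float, R_lab: float, R_sim: float) -> float:
    """Scale a maximum range measured in the lab (r_lab, at reflectivity R_lab)
    to a simulated reflectivity R_sim, following Equation (2):
    r_sim = r_lab * sqrt(R_sim / R_lab)."""
    return r_lab * math.sqrt(R_sim / R_lab)

# Illustrative example: a 150 m range measured with a 40% reflectivity target,
# extrapolated to lower reflectivity levels.
r_lab, R_lab = 150.0, 0.40
for R_sim in (0.10, 0.20, 0.40):
    r_sim = max_range_for_reflectivity(r_lab, R_lab, R_sim)
    print(f"R = {R_sim:.0%}: r_sim = {r_sim:.1f} m")
```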

To reduce errors in the estimations, several measurements of the maximum range must be performed, e.g., with targets of 10%, 20%, and 40% reflectivity.

Prior to its utilization, the rail system was calibrated with rangefinder equipment.

Figure 9. Target detection steps. (c) Distance filter and Euclidean clustering applied and tuned.

LiDAR Evaluation and Testing
points that actually belong to the target). The result of applying the distance filter is shown in Figure 9a. This procedure not only removes undesired points, but also helps reduce the computational cost of the subsequent tasks.
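A minimal sketch of such a distance filter, assuming the point cloud is available as an (N, 3) NumPy array in the sensor frame; the function name and the tolerance value are illustrative, not the platform's actual implementation:

```python
import numpy as np

def distance_filter(points: np.ndarray, target_dist: float,
                    tolerance: float = 0.5) -> np.ndarray:
    """Keep only the points whose Euclidean distance to the sensor lies within
    [target_dist - tolerance, target_dist + tolerance] (meters).
    `points` is an (N, 3) array of XYZ coordinates in the sensor frame."""
    dist = np.linalg.norm(points, axis=1)
    mask = np.abs(dist - target_dist) <= tolerance
    return points[mask]

# Illustrative usage: keep points around a target placed 10 m away.
cloud = np.random.uniform(-20, 20, size=(10000, 3))  # placeholder cloud
target_points = distance_filter(cloud, target_dist=10.0)
```

Euclidean clustering is then applied to the remaining points to isolate the cluster belonging to the target (Figure 9c).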

Implementation of the FoV test
The test to determine the sensor's FoV consists of using a target with a well-known size and reflectivity, placed at a known distance on top of the rail system's target holder.

Since the rail system can only provide variable ranges, we can take advantage of this capability to determine the sensor's FoV. This procedure is illustrated in Figure 10. The test starts with a routine that uses the services provided by the Target Detection ROS package (previously explained) to find a target inside the sensor's point cloud data.
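As a sketch of how such a routine might invoke the package's services (the service name and its use of the standard Trigger type are assumptions; the actual interface of the Target Detection package is not detailed here):

```python
import rospy
from std_srvs.srv import Trigger

# Hypothetical service name exposed by the Target Detection package.
DETECT_SRV = "/target_detection/detect"

rospy.init_node("fov_test")
rospy.wait_for_service(DETECT_SRV)
detect_target = rospy.ServiceProxy(DETECT_SRV, Trigger)

resp = detect_target()  # ask the package to find the target in the cloud
if resp.success:
    rospy.loginfo("Target locked: %s", resp.message)
else:
    rospy.logwarn("No target detected in the point cloud")
```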

If the target is detected, the algorithm starts measuring the FoV.
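One way to extract the detection angles from the target's points, assuming an (N, 3) XYZ array with x pointing forward and y to the left; this is a sketch, not the platform's exact routine:

```python
import numpy as np

def horizontal_detection_angles(target_points: np.ndarray) -> tuple:
    """Return the (min, max) azimuth angles, in degrees, spanned by the
    target's points. Azimuth is measured in the XY plane, x forward."""
    azimuth = np.degrees(np.arctan2(target_points[:, 1], target_points[:, 0]))
    return float(azimuth.min()), float(azimuth.max())
```

Sweeping the target towards the edge of the detectable region and repeating this computation until the target is lost would then yield the limit angles of the sensor's horizontal FoV; the vertical FoV can be obtained analogously from the elevation angles.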
In all procedures, to get the maximum and minimum detection angles, the system relies solely on the point cloud data reported by the sensor.

Implementation of the Angular Resolution test

The angular resolution test consists of using a target with a known size ($T_{width} \times T_{height}$) (Figure 12), placed at a known distance ($T_{dist}$), and later converting the measured values to angular resolution using Equation 6.
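Equation 6 is not reproduced in this excerpt, so the sketch below assumes a common definition: the angle subtended by the target's known width at the known distance, divided by the number of point intervals the sensor places across it. Names and values are illustrative.

```python
import math

def angular_resolution_deg(t_width: float, t_dist: float, n_points: int) -> float:
    """Estimate the horizontal angular resolution (degrees) from a target of
    width t_width (m) at distance t_dist (m), over which the sensor reports
    n_points horizontal points."""
    subtended = 2.0 * math.degrees(math.atan(t_width / (2.0 * t_dist)))
    return subtended / (n_points - 1)  # angle per point interval

# Illustrative values: a 1.0 m wide target at 10 m, covered by 30 points.
print(f"~{angular_resolution_deg(1.0, 10.0, 30):.3f} deg")
```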

To validate the evaluation and testing platform and the algorithms developed, the platform was exercised with a Velodyne VLS-128 LiDAR sensor. The error codes reported by the platform are: (2) no node detected in the ROS environment; (3) component not ready; (4) component has internal errors; (5) component not moving after a moving command; (6) component not stopping after a stop command; and (7) component not in the correct position.

Figure 14 depicts all the steps performed to detect and lock the target in the point cloud: (1) Figure 14a shows the raw data sent by the VLS-128; (2) Figure 14b depicts the FoVSF being applied; (3) Figure 14c illustrates the output of the distance filter (DF); and (4) Figure 14d shows only the point cluster that corresponds to the target, visible and locked in the point cloud. Hereafter, we run the Tests package, which is responsible for performing the algorithms previously described in Figures 10 and 11. The gathered results, obtained using the parameters described in Table 2, are summarized in Table 3. All results show the proper operation of the evaluation and testing platform, although some calculated angles present slight deviations from the desired values.

It is important to mention that our measurements are performed from the sensor receiver's perspective, i.e., they are based only on the point cloud data provided by the sensor, which is the same data consumed by the other (high-level) applications within the car's perception system. Therefore, we consider that, at this order of magnitude, these deviations are not critical and the tested sensor parameters can still be validated.

When a more accurate analysis is required, we can also submit the sensor to an end-