Taneski, F.; Gyongy, I.; Al Abbas, T.; Henderson, R.K. Guided Direct Time-of-Flight Lidar Using Stereo Cameras for Enhanced Laser Power Efficiency. Sensors 2023, 23, 8943.
Abstract
Self-driving vehicles demand efficient and reliable depth sensing technologies. Lidar, with its capacity for long-distance, high-precision measurement, is a crucial component in this pursuit. However, conventional mechanical scanning implementations suffer from reliability, cost, and frame rate limitations. Solid-state lidar solutions have emerged as a promising alternative, but the vast amount of photon data processed and stored by conventional direct time-of-flight (dToF) sensors prevents long-distance sensing unless power-intensive partial histogram approaches are used. In this paper, we introduce a ‘guided’ dToF approach, harnessing external guidance from other onboard sensors to narrow down the depth search space for a power- and data-efficient solution. This approach centers around a dToF sensor in which the exposed time window of independent pixels can be dynamically adjusted. We utilize a 64-by-32 macropixel dToF sensor and a pair of vision cameras to provide the guiding depth estimates. Our demonstrator captures a dynamic outdoor scene at 3 fps with distances up to 75 m. Compared to a conventional full histogram approach, on-chip data is reduced by over 25 times, while the total laser cycles in each frame are reduced by at least 6 times compared to any partial histogram approach. The capability of guided dToF to mitigate multipath reflections is also demonstrated. For self-driving vehicles, where a wealth of sensor data is already available, guided dToF opens new possibilities for efficient solid-state lidar.
Keywords
lidar; direct time-of-flight; dToF; flash lidar; SPADs; stereo depth; 3D vision
Subject
Engineering, Electrical and Electronic Engineering
Copyright:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.