Preprint Article Version 1 Preserved in Portico This version is not peer-reviewed

Guided Direct Time-of-Flight Lidar using Stereo Cameras for Enhanced Laser Power Efficiency

Version 1 : Received: 2 October 2023 / Approved: 3 October 2023 / Online: 3 October 2023 (10:13:45 CEST)

A peer-reviewed article of this Preprint also exists.

Taneski, F.; Gyongy, I.; Al Abbas, T.; Henderson, R.K. Guided Direct Time-of-Flight Lidar Using Stereo Cameras for Enhanced Laser Power Efficiency. Sensors 2023, 23, 8943.

Abstract

Self-driving vehicles demand efficient and reliable depth sensing technologies. Lidar, with its capacity for long-distance, high-precision measurement, is a crucial component in this pursuit. However, conventional mechanical scanning implementations suffer from reliability, cost, and frame rate limitations. Solid-state lidar solutions have emerged as a promising alternative, but the vast amount of photon data that must be processed and stored by conventional direct time-of-flight (dToF) sensors prevents long-distance sensing unless power-intensive partial histogram approaches are used. In this paper, we introduce a ‘guided’ dToF approach, harnessing external guidance from other onboard sensors to narrow down the depth search space for a power- and data-efficient solution. This approach centers around a dToF sensor in which the exposed time window of independent pixels can be dynamically adjusted. We utilize a 64-by-32 macropixel dToF sensor and a pair of vision cameras to provide the guiding depth estimates. Our demonstrator captures a dynamic outdoor scene at 3 fps with distances up to 75 m. Compared to a conventional full histogram approach, on-chip data is reduced by over 25 times, while the total laser cycles in each frame are reduced by at least 6 times compared to any partial histogram approach. The capability of guided dToF to mitigate multipath reflections is also demonstrated. For self-driving vehicles, where a wealth of sensor data is already available, guided dToF opens new possibilities for efficient solid-state lidar.
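The core idea of guided dToF described above, using a coarse depth estimate from another sensor to narrow the per-pixel exposure window, can be illustrated with a short sketch. The bin width and window margin below are hypothetical placeholder values, not the parameters of the paper's sensor:

```python
# Minimal sketch of guided-window selection for a dToF pixel.
# Assumptions (not from the paper): 1 ns histogram bins and a
# +/- 1 m uncertainty margin on the stereo depth estimate.
C = 299_792_458.0  # speed of light, m/s


def guided_window(depth_m, margin_m=1.0, bin_width_s=1e-9):
    """Map a stereo depth estimate to a narrowed dToF exposure window.

    Returns (start_bin, end_bin) histogram bin indices covering the
    round-trip times of depths within +/- margin_m of the estimate,
    instead of histogramming the full unambiguous range.
    """
    t_min = 2.0 * max(depth_m - margin_m, 0.0) / C  # earliest return
    t_max = 2.0 * (depth_m + margin_m) / C          # latest return
    return int(t_min // bin_width_s), int(t_max // bin_width_s) + 1


# Example: a 75 m stereo estimate implies a ~500 ns round trip,
# so only a narrow slice of bins around it needs to be exposed.
start, end = guided_window(75.0)
```

Only the bins inside this window need on-chip storage, which is the source of the data reduction the abstract reports; each pixel can receive a different window from its own guide estimate.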

Keywords

lidar; direct time-of-flight; dToF; flash lidar; SPADs; stereo depth; 3D vision

Subject

Engineering, Electrical and Electronic Engineering
