Preprint · Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

A 256 × 256 LiDAR Imaging System Based on a 200 mW SPAD-Based SoC with Microlens Array and Lightweight RGB-Guided Depth Completion Neural Network

Version 1 : Received: 22 June 2023 / Approved: 23 June 2023 / Online: 23 June 2023 (11:14:38 CEST)

A peer-reviewed article of this Preprint also exists.

Wang, J.; Li, J.; Wu, Y.; Yu, H.; Cui, L.; Sun, M.; Chiang, P.Y. A 256 × 256 LiDAR Imaging System Based on a 200 mW SPAD-Based SoC with Microlens Array and Lightweight RGB-Guided Depth Completion Neural Network. Sensors 2023, 23, 6927.

Abstract

Light Detection and Ranging (LiDAR) technology, a cutting-edge capability in mobile devices, enables a range of compelling use cases, including enhancing low-light photography, capturing and sharing 3D images of objects, and improving the augmented reality (AR) experience. However, its widespread adoption has been hindered by the high cost and substantial power consumption of existing implementations. To overcome these obstacles, this paper proposes a low-power, low-cost, SPAD-based system-on-chip (SoC) that packages microlens arrays (MLAs) and pairs the chip with a lightweight RGB-guided sparse depth completion neural network for 3D LiDAR imaging. The proposed SoC integrates an 8 × 8 macropixel array of single-photon avalanche diodes (SPADs) with time-to-digital converters (TDCs) and a charge pump, fabricated in a 180 nm bipolar-CMOS-DMOS (BCD) process. A random-MLA-based homogenizing diffuser efficiently transforms Gaussian beams into flat-topped beams with a 45° field of view (FOV), enabling flash projection at the transmitter. To further enhance resolution and broaden the range of applications, a lightweight RGB-guided sparse depth completion neural network is proposed, expanding the image resolution from 8 × 8 to quarter-video-graphics-array level (QVGA; 256 × 256). Experimental results demonstrate the effectiveness and stability of the hardware, encompassing the SoC and the optical system, as well as the compactness and accuracy of the neural network. This integrated hardware-software solution offers a promising foundation for developing consumer-level 3D imaging applications.
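The abstract's key algorithmic idea, completing an 8 × 8 time-of-flight depth grid into a 256 × 256 map under RGB guidance, can be sketched in code. The paper's actual architecture is not reproduced here; the PyTorch example below is a minimal late-fusion sketch under assumed layer widths, and the names RGBGuidedCompletion and conv_block, as well as the placement of SPAD samples at block centers, are hypothetical.

```python
# A minimal sketch, NOT the authors' published architecture. It only
# illustrates the general idea of RGB-guided sparse depth completion:
# fuse features from a dense RGB frame with features from a sparse depth
# map (8 x 8 SPAD time-of-flight samples scattered into a 256 x 256 grid)
# and regress a dense 256 x 256 depth map. All layer sizes are assumptions.
import torch
import torch.nn as nn


def conv_block(in_ch: int, out_ch: int, stride: int = 1) -> nn.Sequential:
    """3x3 convolution + BatchNorm + ReLU, the basic unit of both branches."""
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, 3, stride=stride, padding=1, bias=False),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )


class RGBGuidedCompletion(nn.Module):
    """Late-fusion encoder-decoder (hypothetical layer widths)."""

    def __init__(self) -> None:
        super().__init__()
        # RGB guidance branch: 3-channel image -> 32 feature maps at half res.
        self.rgb_enc = nn.Sequential(conv_block(3, 16), conv_block(16, 32, stride=2))
        # Depth branch: sparse depth + validity mask (2 channels) -> 32 maps.
        self.depth_enc = nn.Sequential(conv_block(2, 16), conv_block(16, 32, stride=2))
        # Fused decoder: upsample back to full resolution, regress 1-channel depth.
        self.decoder = nn.Sequential(
            conv_block(64, 32),
            nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False),
            conv_block(32, 16),
            nn.Conv2d(16, 1, 1),
        )

    def forward(self, rgb, sparse_depth, mask):
        f_rgb = self.rgb_enc(rgb)
        f_d = self.depth_enc(torch.cat([sparse_depth, mask], dim=1))
        return self.decoder(torch.cat([f_rgb, f_d], dim=1))


# Toy usage: scatter 8 x 8 SPAD returns (depth d = c * t_TOF / 2; random
# stand-in values here) onto the centers of 32 x 32 blocks of a 256 x 256 grid.
rgb = torch.rand(1, 3, 256, 256)
sparse = torch.zeros(1, 1, 256, 256)
mask = torch.zeros_like(sparse)
ys, xs = torch.meshgrid(torch.arange(8), torch.arange(8), indexing="ij")
sparse[0, 0, ys * 32 + 16, xs * 32 + 16] = torch.rand(8, 8)
mask[0, 0, ys * 32 + 16, xs * 32 + 16] = 1.0
dense = RGBGuidedCompletion()(rgb, sparse, mask)
print(dense.shape)  # torch.Size([1, 1, 256, 256])
```

Fusing the two branches at half resolution keeps the parameter count small, in keeping with the abstract's "lightweight" claim; the published network presumably uses a deeper, multi-scale design.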

Keywords

LiDAR; 3D imaging; System on chip; Microlens array; Neural network; RGB-guided; Depth completion

Subject

Computer Science and Mathematics, Hardware and Architecture
