Article
Version 2
This version is not peer-reviewed
CuFusion2: Accurate and Denoised Volumetric 3D Object Reconstruction Using Depth Cameras
Version 1 : Received: 12 December 2018 / Approved: 13 December 2018 / Online: 13 December 2018 (09:41:36 CET)
Version 2 : Received: 8 April 2019 / Approved: 9 April 2019 / Online: 9 April 2019 (12:24:34 CEST)
How to cite: ZHANG, C. CuFusion2: Accurate and Denoised Volumetric 3D Object Reconstruction Using Depth Cameras. Preprints 2018, 2018120165
Abstract
3D object reconstruction from depth image streams using Kinect-style depth cameras has been extensively studied. In this paper, we propose an approach for accurate camera tracking and volumetric dense surface reconstruction, assuming a known cuboid reference object is present in the scene. Our contribution is three-fold. (a) We maintain drift-free camera pose tracking by incorporating the 3D geometric constraints of the cuboid reference object into the image registration process. (b) We reformulate the problem of depth stream fusion as a binary classification problem, enabling high-fidelity surface reconstruction, especially in the concave zones of objects. (c) We further present a surface denoising strategy to mitigate topological inconsistencies (e.g., holes and dangling triangles), which facilitates the generation of a noise-free triangle mesh. We extend our public dataset CU3D with several new image sequences, test our algorithm on these sequences, and quantitatively compare it with other state-of-the-art algorithms. Both our dataset and our algorithm are available as open-source content at https://github.com/zhangxaochen/CuFusion for other researchers to reproduce and verify our results.
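To make the "depth fusion as binary classification" idea concrete, the following is a minimal, hypothetical C++ sketch (not the authors' implementation, and not the actual CuFusion2 code): each voxel collects per-frame votes for "free space" versus "near the measured surface" along the camera ray, and the final label is decided by majority. All names (VoxelVote, addObservation, classifyVoxel) and the margin parameter are illustrative assumptions only.

```cpp
// Hypothetical sketch of per-voxel binary classification for depth fusion.
// Each depth frame casts a vote for a voxel being empty (in front of the
// measured surface) or occupied (near the measured surface).
#include <cstdio>

struct VoxelVote {
    int empty    = 0;  // observations that saw through the voxel (free space)
    int occupied = 0;  // observations that placed the surface at the voxel
};

// 'voxelDepth' is the voxel's distance along the camera ray;
// 'measuredDepth' is the sensor reading on that ray;
// 'margin' is an assumed tolerance band around the measured surface.
void addObservation(VoxelVote& v, float voxelDepth, float measuredDepth, float margin) {
    if (voxelDepth < measuredDepth - margin)      ++v.empty;     // in front of the surface
    else if (voxelDepth < measuredDepth + margin) ++v.occupied;  // near the measured surface
    // voxels far behind the surface are unobserved and cast no vote
}

// Final binary label: occupied iff occupied votes outnumber empty votes.
bool classifyVoxel(const VoxelVote& v) {
    return v.occupied > v.empty;
}

int main() {
    VoxelVote v;
    // Three frames observe the same voxel, lying at 1.00 m along their rays.
    for (float d : {1.02f, 0.99f, 1.01f}) addObservation(v, 1.00f, d, 0.03f);
    std::printf("voxel occupied: %s\n", classifyVoxel(v) ? "yes" : "no");
    return 0;
}
```

This toy vote-counting scheme only illustrates the classification framing; the paper's actual fusion, constraint handling, and denoising are described in the full text.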
Keywords
3D object reconstruction; depth cameras; Kinect sensors; open source; signal denoising; SLAM
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.