Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

IndustNerf: Accurate 3D Industrial Digital Twin based on Integrating Neural Radiance Fields using Unsupervised Learning

Version 1 : Received: 11 May 2024 / Approved: 13 May 2024 / Online: 13 May 2024 (16:24:46 CEST)

How to cite: Zhou, H.; Xu, J.; Lin, H.; Nie, Z.; Zheng, L. IndustNerf: Accurate 3D Industrial Digital Twin based on Integrating Neural Radiance Fields using Unsupervised Learning. Preprints 2024, 2024050879. https://doi.org/10.20944/preprints202405.0879.v1

Abstract

Digital twin technology is revolutionizing traditional manufacturing paradigms, yet its adoption in modern manufacturing systems is hampered by the scarcity of labeled data. Existing supervised machine learning algorithms rely on large volumes of training data, which limits their applicability in real-world production settings. This paper introduces an unsupervised 3D reconstruction approach tailored for industrial applications, aimed at bridging this data gap in the creation of digital twin models. From high-resolution 2D images alone, the proposed model reconstructs precise 3D digital twin models without manual annotations or prior knowledge. Comparisons with multiple baseline models demonstrate the superiority of our method in accuracy, speed, and generalization. This research not only offers an efficient approach to industrial 3D reconstruction but also paves the way for the widespread adoption of digital twin technology in manufacturing.
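The unsupervised character of NeRF-based reconstruction comes from its training signal: the only loss is the photometric error between pixels rendered from the learned radiance field and pixels of the captured 2D images, so no labels are required. Below is a minimal sketch of the volume-rendering core, assuming a PyTorch implementation; the TinyNeRF network, its sizes, and the dummy ray/pixel data are illustrative assumptions, not the authors' IndustNerf code.

```python
# Minimal sketch of NeRF-style volume rendering, the core of unsupervised
# 3D reconstruction from posed 2D images. Illustrative only; not the
# authors' IndustNerf implementation.
import torch
import torch.nn as nn

class TinyNeRF(nn.Module):
    """Maps a 3D point to an RGB colour and a volume density (sigma)."""
    def __init__(self, n_freqs: int = 6, hidden: int = 128):
        super().__init__()
        self.n_freqs = n_freqs
        in_dim = 3 + 3 * 2 * n_freqs          # xyz + sinusoidal positional encoding
        self.mlp = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),              # outputs (r, g, b, sigma)
        )

    def positional_encoding(self, x: torch.Tensor) -> torch.Tensor:
        feats = [x]
        for i in range(self.n_freqs):
            feats += [torch.sin((2.0 ** i) * x), torch.cos((2.0 ** i) * x)]
        return torch.cat(feats, dim=-1)

    def forward(self, pts: torch.Tensor) -> torch.Tensor:
        return self.mlp(self.positional_encoding(pts))

def render_rays(model, origins, directions, near=0.1, far=4.0, n_samples=64):
    """Volume-render a batch of rays: sample points along each ray, query the
    model, and alpha-composite the predicted colours."""
    t = torch.linspace(near, far, n_samples, device=origins.device)        # (S,)
    pts = origins[:, None, :] + directions[:, None, :] * t[None, :, None]  # (R, S, 3)
    raw = model(pts)                                                       # (R, S, 4)
    rgb, sigma = torch.sigmoid(raw[..., :3]), torch.relu(raw[..., 3])
    delta = t[1:] - t[:-1]
    delta = torch.cat([delta, delta[-1:]], dim=0)                          # (S,)
    alpha = 1.0 - torch.exp(-sigma * delta)                                # (R, S)
    trans = torch.cumprod(torch.cat(
        [torch.ones_like(alpha[:, :1]), 1.0 - alpha + 1e-10], dim=1), dim=1)[:, :-1]
    weights = alpha * trans                                                # (R, S)
    return (weights[..., None] * rgb).sum(dim=1)                           # (R, 3)

if __name__ == "__main__":
    model = TinyNeRF()
    origins = torch.zeros(1024, 3)                    # dummy camera-ray origins
    directions = nn.functional.normalize(torch.randn(1024, 3), dim=-1)
    target = torch.rand(1024, 3)                      # pixel colours from a 2D image
    opt = torch.optim.Adam(model.parameters(), lr=5e-4)
    loss = ((render_rays(model, origins, directions) - target) ** 2).mean()
    loss.backward(); opt.step()                       # photometric loss: no labels needed
    print(f"photometric loss: {loss.item():.4f}")
```

In practice the rays would come from calibrated camera poses of the industrial part being digitized, and the trained radiance field would then be queried (e.g. via marching cubes over the density) to extract the 3D digital twin geometry.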

Keywords

industrial digital twin, neural radiance fields, unsupervised learning, 3D reconstruction

Subject

Engineering, Mechanical Engineering
