Preprint, Article, Version 1. Preserved in Portico. This version is not peer-reviewed.
Visualisation of Fossilised Tree Trunks Using Immersive 3D-360 Audio-Visual and Geospatial Digitisation Techniques in XR Derived from UAS and Terrestrial Data, Aided by Machine Learning Computational Photography
Version 1: Received: 25 August 2023 / Approved: 28 August 2023 / Online: 29 August 2023 (09:25:47 CEST)
How to cite:
Psarros, C.; Zouros, N.; Soulakellis, N. Visualisation of Fossilised Tree Trunks Using Immersive 3D-360 Audio-Visual and Geospatial Digitisation Techniques in XR Derived from UAS and Terrestrial Data, Aided by Machine Learning Computational Photography. Preprints 2023, 2023081922. https://doi.org/10.20944/preprints202308.1922.v1
APA Style
Psarros, C., Zouros, N., & Soulakellis, N. (2023). Visualisation of Fossilised Tree Trunks Using Immersive 3D-360 Audio-Visual and Geospatial Digitisation Techniques in XR Derived from UAS and Terrestrial Data, Aided by Machine Learning Computational Photography. Preprints. https://doi.org/10.20944/preprints202308.1922.v1
Chicago/Turabian Style
Psarros, C., Nikolaos Zouros and Nikolaos Soulakellis. 2023. "Visualisation of Fossilised Tree Trunks Using Immersive 3D-360 Audio-Visual and Geospatial Digitisation Techniques in XR Derived from UAS and Terrestrial Data, Aided by Machine Learning Computational Photography." Preprints. https://doi.org/10.20944/preprints202308.1922.v1
Abstract
The aim of this research is to investigate and apply a variety of immersive multisensory media techniques to create convincing digital models of fossilised tree trunks for use in XR. This is made possible through geospatial data derived from sources such as aerial imaging using UAS and terrestrial imaging using cameras, combined with both visual and audio elements for deeper immersion, accessible and explorable in six Degrees of Freedom (6DoF). Immersiveness is a key factor in producing output that is especially engaging to the user. Both conventional and alternative methods are explored and compared, emphasising the advantages made possible by Machine Learning Computational Photography. Material is collected using both UAS and terrestrial camera devices, including a 3D-360º camera with six sensors, whose stitched panoramas serve as sources for photogrammetry processing. Difficulties such as capturing large free-standing objects by terrestrial means were overcome with practical solutions involving mounts and remote streaming. The conclusions indicate that superior fidelity can be achieved with the help of Machine Learning Computational Photography processes, and that higher resolutions and technical specifications of equipment do not necessarily translate to superior outputs.
Keywords
XR; UAS; Geovisualisation; Computational Photography; Geopark; Petrified tree
Subject
Environmental and Earth Sciences, Remote Sensing
Copyright:
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.