Preprint (Article). This version is not peer-reviewed.

Design of Digital Events in the Metaverse, Using Digital Twins

Submitted: 31 March 2025; Posted: 1 April 2025


Abstract
This article explores the possibilities offered by the convergence of virtual reality, the metaverse, digital twins, and the creation of virtual events, highlighting their impact on digital transformation and social interaction. The role of virtual reality as an immersive technology that enables multisensory experiences, and its integration into persistent virtual environments such as the metaverse, is analyzed. Likewise, the concept of digital twins is examined as precise virtual representations of real-world objects, processes, or systems, together with their application in simulation and optimization. In the context of virtual events, the article examines how these technologies facilitate the creation of interactive, personalized, and globally accessible spaces. A digital twin of the Alcoy campus has been created, and the design for the 50th anniversary event of the merger of the Alcoy Industrial School with the Polytechnic University of Valencia has been integrated into it. The virtual space was tested with students and faculty from the campus to evaluate its applications and identify needed improvements. The article concludes by highlighting the opportunities and challenges presented by the implementation of these innovations, as well as their potential to redefine the way people work, learn, and connect in digital environments. This case study offers a comprehensive view of these emerging technologies and their role in shaping the digital future.
Subject: Engineering - Other

1. Introduction

Technological evolution has given rise to an innovative digital ecosystem, where concepts such as virtual reality (VR), the metaverse (MV), digital twins (DT), and the creation of virtual events are profoundly transforming various industrial and social sectors. This article explores the intersection of these technologies, highlighting their applications, benefits, and challenges. VR enables immersive experiences that enhance learning, entertainment, and remote collaboration. For its part, the MV is emerging as a shared digital space, where human interactions are amplified through persistent three-dimensional environments. DT, virtual representations of physical objects or systems, optimize processes and facilitate data-driven decision-making in real time. Finally, the creation of virtual events opens new possibilities in the world of marketing, education, and remote work, offering attractive and globally accessible experiences.
The adoption and development of these technologies have gained unprecedented relevance in contemporary society, redefining the way people interact, learn, and work with their environment. Simultaneously, as these tools evolve, their implications for the economy, culture, and communication become more significant. To better understand the impact of these technologies, it is necessary to analyze their historical background and their development over time. VR has evolved since the 1960s toward modern applications in video games, training simulations, and contemporary collaborative environments. Similarly, the concept of the MV, influenced by science fiction literature and the evolution of digital platforms, has given rise to an immersive ecosystem with virtual economies and advanced interaction models. In areas such as tourism [1], hospitality [2], fashion [3], and virtual museums [4], it has already been implemented and is generating additional revenue for companies.
On the other hand, virtual events gained special relevance after the COVID-19 pandemic, a time when companies, educational institutions, and conference organizers adopted digital solutions to maintain connectivity on a global scale. Technologies such as live streaming, interactive 3D spaces, and the integration of artificial intelligence have enhanced the user experience in virtual environments, allowing for open participation without geographical barriers. In the case of DT, their application in Industry 4.0 is improving efficiency in manufacturing, predictive maintenance, and the optimization of complex systems, through the use of virtual models that faithfully replicate their physical counterparts [5,8]. These digital representations are proving to be very effective in sectors such as healthcare, engineering, and urban infrastructure management, where simulations and predictive analysis can reduce costs and improve decision-making [9,12].
In the field of education, DT are used to create experimental environments in science, engineering, and medicine while avoiding risks [13,14]. They also act as virtual models that predict student performance and propose adaptations of academic content [15,16]; as replicas of machines and industrial processes for training in realistic environments within practical experimentation laboratories [17,18]; and as 3D recreations of historical environments for interactive exploration [19,20]. Finally, DT play a key role in optimizing performance and operational efficiency across various industries. They are being applied in many fields, with notable use in the healthcare sector [21] to optimize hospital processes, forecast service demand, and manage resources. In this sector, they facilitate the planning and design of hospital infrastructures, improving the construction and renovation of facilities. Moreover, they allow for the monitoring of patients’ conditions and the personalization of treatments. Their implementation in medical training through interactive simulations reduces errors in surgeries and emergencies. Institutions such as Mayo Clinic, Cleveland Clinic, Johns Hopkins Medicine, and Partners HealthCare have integrated this technology into the optimization of their services. For example, the digital twin from Philips Healthcare functions as an accurate virtual replica of an organ or biological system, created from data obtained from advanced medical imaging (MRI, CT, ultrasound), genetic tests, biometric sensors, and medical records. Its application allows healthcare professionals to simulate and predict outcomes in order to determine more precise and faster treatments. Additionally, it streamlines the predictive analysis of the patient’s response to various procedures and therapies, improving medical planning and reducing clinical risks. The Living Heart Project by Dassault Systèmes is an advanced application of digital twins in cardiology, creating highly detailed virtual models of the human heart to simulate and optimize treatments, evaluate the efficiency of medical devices, and plan surgeries without direct intervention in patients. Its implementation is improving the understanding and treatment of heart diseases, providing precision tools for medical practice and innovation in cardiovascular health.
DT are becoming fundamental in the transportation industry [22,23]. In the railway sector, Deutsche Bahn has implemented this technology to optimize the operation, maintenance, and management of its railway network, improving operational efficiency, safety, infrastructure planning, and transportation sustainability. In aviation, Boeing uses them to simulate aircraft behavior, optimize design, improve safety, operational efficiency, and reduce costs [24]. Its application in the 787 Dreamliner series stands out, where this technology has revolutionized the design, maintenance, and operation of aircraft. All of this promotes a more efficient and sustainable aviation, and its impact is expected to grow in the aerospace industry. In the aerospace sector, Rolls-Royce has implemented this technology for real-time monitoring of engine performance, such as the Trent XWB used in Airbus A350 aircraft, optimizing its lifecycle and predictive maintenance [25]. In the automotive sphere, BMW has integrated DT in its Regensburg plant (See Figure 1), allowing for the simulation of assembly to reduce production times [26]. Mercedes-Benz applies this technology in its Sindelfingen factory through Internet of Things (IoT), AI, and Big Data, improving efficiency and sustainability, especially in the production of electric vehicles like the EQS. Ford optimizes the manufacturing of batteries and electrical systems for models like the Mustang Mach-E and the F-150 Lightning, ensuring quality and efficiency [27].
For its part, Tesla has developed a digital twin that virtually replicates each vehicle in real time, allowing for performance monitoring, predictive maintenance, and remote software updates. This technology has become key to the evolution of its autonomous driving system, optimizing Autopilot’s performance through advanced simulations [28,30]. Siemens has implemented a digital twin in its Amberg factory in Germany, optimizing the production of industrial control systems through advanced automation, which improves efficiency and reduces costs. General Electric applies this technology in multiple sectors, from manufacturing to energy and healthcare. In renewable energy, DT enable optimal monitoring of wind turbine performance, analyzing data in real time to predict failures and improve operational efficiency. Unilever has integrated a digital twin into its production and supply chain, highlighting its application at the Hamburg plant. This implementation has improved operational efficiency, reduced costs, and promoted sustainability through energy optimization and waste reduction [31,32].
In the field of urban planning, DT are applied in the simulation of urban infrastructures to monitor and improve mobility, traffic, and energy efficiency, contributing to the development of sustainable cities [33]. The city of Singapore has developed a digital twin as part of its Smart Nation (SN) strategy, using advanced technologies to improve in these areas and thereby enhance the quality of life [34]. In Dubai, through its Smart Dubai initiative, a digital twin based on IoT, big data, and artificial intelligence has been implemented, allowing for the simulation and optimization of infrastructures, achieving the same advantages offered by the previous SN strategy. Shenzhen has developed a detailed virtual replica of the city, integrating real-time data through IoT sensors and monitoring systems to improve traffic, energy, public services, and security management, establishing itself as a benchmark in smart cities [35].
The MV has transformed event organization, highlighting its impact on museums and art galleries. Sotheby’s Virtual Gallery is a digital platform that employs advanced technology for the exhibition and commercialization of art, antiques, and collectibles in an immersive virtual environment. This gallery uses high-resolution 3D renderings to represent works with great precision, allowing users to explore pieces in three-dimensional mode and, in some cases, through VR. Its catalog includes modern and contemporary artworks, works by masters of antiquity, non-fungible tokens (NFTs), jewelry, and other historical objects, globally accessible through the internet. Users can interact with the works, obtain detailed information, and participate in acquisitions directly or in online auctions. Sotheby’s also organizes temporary exhibitions and exclusive events, consolidating its role in the digitalization of the art market and democratizing access through technology. One of the most prestigious contemporary art fairs in America is Art Basel Miami Beach, which has recently incorporated virtual editions to respond to recent global challenges. Already in 2020, it launched the Online Viewing Rooms, allowing the digital exhibition of works, with the participation of 282 galleries from 35 countries and more than 230,000 visitors. Subsequently, in December, it presented OVR: Miami Beach, with 255 galleries from 30 countries and nearly 2,500 works of art, integrating new features such as interactive videos and digital exploration tools. These initiatives consolidated a hybrid model of art dissemination and commerce, complementing in-person fairs with digital platforms [36,37].
In the musical realm, the MV has revolutionized the concert experience through interactive events in video games [38,39]. The Fortnite Concert Series has redefined the connection between artists and audiences through immersive shows and dynamic virtual environments. The first major event, with Marshmello in 2019, gathered more than 10 million players, followed by Travis Scott (2020) with 27 million attendees and Ariana Grande (2021) with interactive stages. Other artists like J Balvin, Eminem, and The Kid Laroi have participated, thereby consolidating this format. Likewise, Roblox Virtual Concerts have expanded the music industry in digital environments, offering interactive experiences with missions and collectible items within the game. Lil Nas X started this trend in 2020 (see Figure 2), with 33 million views, followed by Twenty One Pilots, Zara Larsson, and KSI in 2021. A few years later, David Guetta, The Chainsmokers, and Charli XCX solidified these concerts as a new form of artistic expression and massive interaction in virtual spaces [40,41].
Sandbox Game Jams are video game development events where programmers, designers, and creatives collaborate to come up with games within a limited time frame (48-72 hours) on open development platforms like Roblox, Minecraft, The Sandbox, and Garry’s Mod. Their goal is to encourage experimentation and innovation in game design, allowing participants to explore new mechanics and more creative and unrestricted concepts. These events usually include specific themes and promote interdisciplinary teamwork. A recent example is the Discovery Game Jam, announced by The Sandbox on March 14, 2025, in their GMAE Show program, with the purpose of continuing to drive creativity in the MV ecosystem [42]. The Metaverse Summit is an international conference that brings together experts, companies, and developers to analyze the future of the MV. It addresses key topics such as VR, augmented reality (AR), blockchain, NFTs, and AI, exploring technological trends and opportunities. It has been held in cities like Paris and New York, attracting thousands of participants and establishing itself as an essential event for the digital industry [43,44]. In the realm of digital assets, NFT.NYC is a prominent annual conference in the NFT sector, bringing together artists, collectors, and industry leaders to discuss advancements in digital art, blockchain, and commercial applications. The 2025 edition of NFT.NYC will be held from June 25 to 27 in Times Square, New York, addressing topics such as art, AI, video games, legal aspects, and the digital community. Some previous editions have featured the participation of influential figures from companies like T-Mobile, Mastercard, and Lacoste, demonstrating the growing impact of NFTs across various industries [45].
One of the most dynamic virtual social interaction platforms that organizes festivals and events to enhance the user experience is Zepeto, with its Zepeto World Festival. Among the highlighted events are the Zepetomoji Festival, held at Zepeto University, and the Zepeto School Festival 2024, which included contests, challenges, avatar item giveaways, and live broadcasts with influencers. These festivals reflect Zepeto’s commitment to creating immersive experiences for its global community [46]. Animal Crossing: New Horizons, released by Nintendo in 2020, is beginning to be considered a metaverse due to its shared virtual world and social interaction capabilities. During the pandemic, it allowed players to create customized islands, promoting social connection. It is not a complete MV, although it shares certain characteristics, because it lacks integration with multiple platforms and digital experiences [47,48]. In Highrise: Virtual Metaverse, events are fundamental to the user experience, where users can participate in activities such as fashion contests, debates, karaoke nights, and competitions in labyrinthine spaces. These events, organized both by the developers and by the users themselves, stimulate rich and creative social interaction within the platform [49,50]. Meta has promoted virtual events through Horizon Worlds Events, a VR platform where users explore, interact, and create in a shared environment. Among the highlighted events are educational marathons, immersive concerts (such as those by Sabrina Carpenter and The Kid LAROI), and themed experiences like "3D Ocean of Light - Dolphins in VR" and "Space Explorers." Additionally, Horizon Worlds has integrated "Venues," a space dedicated to broadcasting live events such as concerts and sports, expanding entertainment options [51,52].
BKOOL, a virtual cycling simulator, offers immersive training experiences and has hosted notable events such as the competition between professional cyclists Chris Froome and Alberto Contador during the Virtual Giro d’Italia 2024 [53,54]. The Brooklyn Nets’ Netaverse, launched in 2022, uses digital twin and virtual reality technologies for immersive experiences during NBA games, allowing fans to experience the games from new perspectives and access real-time statistics [55,56]. Formula 1 in the MV has developed through various events, integrating NFTs, immersive experiences, virtual simulations, and video games. Among the most relevant events are F1 Delta Time (2019-2022), the F1 Virtual Grand Prix (2020), the collaboration with Roblox in F1® Arcade and Virtual Paddock, and the Monaco Grand Prix in the MV (2023). These events reflect F1’s incorporation into the digital world and its commitment to innovation in the virtual realm. The AIXR XR Awards, formerly known as the Virtual Reality Awards, celebrate the most significant achievements in the field of extended reality (XR), which encompasses VR, AR, and mixed reality (MR). These annual awards recognize technological innovations, immersive experiences, and applications that transform human interaction with technology.
On the other hand, Microsoft has developed Microsoft Mesh, an MR collaboration platform that allows interaction in digital environments through avatars or holograms. It works on devices like HoloLens 2 and smartphones, facilitating immersive meetings, collaborative design, technical training, as well as the creation of virtual events. Based on Microsoft Azure, Mesh uses AI and spatial computing to create realistic and secure experiences, positioning itself as a key tool for collaboration in the MV and for its future integration with services like Microsoft Teams [57]. These platforms reflect the growing diversification of immersive experiences in the MV, offering users interaction, entertainment, and learning in advanced virtual environments [58,59]. Throughout this article, case studies are analyzed that illustrate the use of these technologies and show how leading companies and government agencies have adopted these strategies to maximize their potential. From a socioeconomic perspective, these technologies present a series of challenges and opportunities. While the MV and virtual events democratize access to digital experiences, they also raise questions about security, data privacy, and their impact on employment. The creation of digital economies within the MV could redefine business models and introduce new forms of monetization, although it also carries risks associated with regulation and economic sustainability.
This study focuses on Alcoy, an industrial city with a textile tradition that began in the 19th century, driven by hydraulic energy. The modernization of the city accelerated with the arrival of steam, creating a strong economic and social fabric. In 1828, the first industrial studies derived from the Royal Cloth Factory were founded, and in the mid-19th century, the Elementary Industrial School was established, funded by public and private funds. In 1869, the first title of chemical expert was awarded, and in 1873, that of mechanical expert. At the beginning of the 20th century, the Higher School of Industry was founded, offering specializations in various industrial branches. In 1975, the School became part of the Universitat Politècnica de València (UPV) campus, expanding its educational offerings, which currently include seven degrees: Industrial Design, Computer Science, Business Administration, Electricity, Chemistry, Mechanics, and Robotics.
To commemorate the 50th anniversary of the integration of the Industrial School of Alcoy into the UPV, a comprehensive design effort has been carried out. For this experiment, a DT of the Alcoy campus of the UPV was first designed. Based on several photographs, the Ferrándiz and Carbonell buildings that make up the central part of the campus were modeled in 3D using 3ds Max 2025. Next, a brand image was created by designing the logo. A series of canvases with the commemorative image were modeled in 3D and displayed on the facades of the Ferrándiz and Carbonell buildings, like a stage, just as they are still installed in the urban space. With the Unity game engine, user interactions with the scene were configured (the activation of animations, videos, and other elements) and implemented in the corresponding template for the Spatial.io web portal, in order to develop the digital twin in the metaverse. Spatial is a free application that provides a socialization interface where we can customize our avatar and communicate with other users by chat or voice. In conclusion, the convergence of VR, the MV, DT, and the creation of virtual events is causing a paradigm shift, especially in the way we interact with the digital world. Its impact will continue to expand in the coming years, giving rise to new opportunities for innovation and global connectivity. This article is structured as follows: the proposed digital twin is developed in Section 2; Section 3 presents the results, including the usability of the interface built with Unity and Spatial; and the discussion and conclusions are presented in Section 4 and Section 5.

2. Materials and Methods

2.1. Methodology for Designing Digital Twins in the Metaverse for Events

The methodology used to create the DT of this virtual event is illustrated in Figure 3. In the diagram, the first step is the interpretation of the CAD plans and the taking of photographs of the facades and decorative elements. The campus maintenance service provided plans that only included information about the dimensions of the floors. Apparently, there are no elevation plans of the facades of the Ferrándiz and Carbonell buildings, so previously taken photographs had to be used as a reference. For this, around 300 photographs were collected to define all the important details of the buildings, as well as the vegetation, background images, and all the textures. The next step was to define the brand image for the 50th anniversary event of the union between the Industrial School of Alcoy and the Polytechnic University of Valencia (UPV). Once the logo was defined, the posters were designed and a commemorative video was edited, including an animated 3D morphing depicting the transformation between the UPV logo and the one proposed for the 50th anniversary. This process is detailed in Section 2.2, Study of the Event’s Graphic Design. Once the brand image was defined, the 3D modeling phase began, using 3ds Max 2025, for the Ferrándiz and Carbonell buildings. The plans were imported and, based on their measurements in plan view, their height was extruded. It was necessary to create photomontages of the facades with Adobe Photoshop Version 25 in order to define the height details of the buildings (location of reliefs, windows, doors, etc.). Subsequently, auxiliary elements such as benches, trash cans, and trees were modeled to scale, all based on field photographs. Once the modeling was completed, the images were processed to obtain all the textures. This is detailed in Section 2.3, Modeling and Texturing of the Digital Twin.
For the implementation of the DT in the MV, we use the free tool Spatial.io, a VR and AR platform that allows users to create and explore 3D virtual spaces. It is mainly used for remote collaboration, the design of immersive experiences, and also for exhibitions in the metaverse. This platform provides templates for the implementation of 3D content and animations, using the Unity game development tool. Accordingly, the next step was to export the 3D model to Unity. Within Unity, the 3D models and textures were optimized to ensure smooth navigation for the future user, and the animations and interactions of the scene were defined. All this is explained in detail in Section 2.4, Optimization of Models and Materials with Unity: Creation of Interactions and Animations. Next, the Unity template was exported to Spatial.io, where all the multimedia material was defined: videos and external links. This process is described in more detail in Section 2.5, Spatial.io: Insertion of Multimedia Material. Once our DT was defined on the Spatial platform, it was subjected to validation within the campus. In the first phase, version 1 was evaluated by the Graphic Expression department (testers). Their improvement suggestions were incorporated into version 2, which was presented to 265 students of the Industrial Design Degree enrolled in the subjects of Computer-Aided Design, Presentation Techniques, and Simulation. The students tested the virtual scenario in pairs using an evaluation test. Finally, with these recorded data, the final version of the virtual event was refined and eventually presented at the 50th anniversary gala of the Industrial School of Alcoy within the Polytechnic University of Valencia, held on April 7, 2024. Once a space with four Meta Quest 3 glasses was set up, the attendees of the gala were able to try out the digital twin and the event’s customization. This was complemented by the presentation of a documentary video about the founding of the Industrial School and the 3D animation of the morphing between the current UPV logo and the one designed for the commemorative event. In the following sections, each of these stages is described in more detail, which will help to better understand the process of creating the digital twin of the Ferrándiz and Carbonell buildings on the Alcoy campus for the celebration of the 50th anniversary of the Alcoy Campus at UPV. See Figure 4.

2.2. Study of the Event’s Graphic Design

As indicated, this project is being carried out to celebrate two commemorative events that are part of the recent history of the Alcoy Campus of the Universitat Politècnica de València. The first event corresponds to 2022, marking 50 years since the integration of the Alcoy School of Industrial Technical Engineering into the Universitat Politècnica de València, as a University School of Technical Engineering (Decree 1377/1972, of May 10; BOE, June 7, 1972), and the second event commemorates the 100th anniversary of the laying of the first stone of the Viaducto Building last century. The management of the educational center commissioned Silvia Sempere Ripoll for the artistic direction of the event, which involved the creation of the commemorative logo and all corresponding graphic elements that shaped the visual identity of the event Campus d’Alcoi 1972/2022 UPV. See Figure 5.
Below, the evolutionary sequence of the graphic design development of the commemorative logo is shown, which took as its starting point the original coat of arms of the Universitat Politècnica de València with the emblematic legend EX TECHNICA PROGRESSIO. To achieve this, all the compositional elements such as outlines, circles, and typography were removed, while the crown and the four bars were retained. With the four vertical parallel stripes of the four-barred flag, a formal overlay game is proposed, to which a selective subtraction exercise is applied, integrating it with the number 50, the number of years being commemorated. See Figure 6.
For the crown, a simplification of the elements that compose it is carried out and the design of the shape is reinterpreted. First, its symmetry is eliminated to give it a more dynamic appearance; second, the four small spheres that top the crown are animated. The movement of the animation is designed so that each sphere moves at a different pace - mimicking the bounce of small balls. All of this, without losing the essence of the central identifying element of the official emblem of the educational entity. For the staging, several urban interventions were carried out in public spaces. One of them involved the installation of rigid structures that served as support for the placement of 9 digitally printed color canvases on PVC. As can be seen, the color range of the logo respects the official corporate colors: blue, red, yellow, and black. See Figure 7.
These large-format graphic elements were installed on the facades of the Ferrándiz and Carbonell buildings, which are part of the Alcoy Campus of the Polytechnic University of Valencia. See Figure 8.

2.3. Modeling and Texturing of the Digital Twin

The process of modeling the digital twin begins with a request to the maintenance service of the Alcoy campus for the building plans. The first hurdle appeared when we were informed that they only had floor plans, not elevation drawings of the facades. Consequently, the CAD plans had to be imported into 3ds Max and, based on the known dimensions in plan, the photomontage of each facade was inserted to obtain the height of the building. We relied on real measurements in those areas of the ground floor that were accessible to us. See Figure 9.
From this point on, around 300 photographs were used to define the details of the facades, windows, reliefs, grilles, etc. This allowed us to define the volumes of the Ferrándiz and Carbonell facades using 3D modeling operations such as extrusions, sweeps, and Boolean operations. See Figure 10.
Once the volumes of the facades were defined, it was necessary to work again on the photographs taken to construct the existing decorative objects: benches, trash cans, planters, sculptures, etc. As in the previous case, we relied on taking measurements of those elements we could access. See Figure 11 and Figure 12.
The digital twin modeling was completed with flagpoles, spaces for the 50th-anniversary commemorative banners, and a couple of conference stands with a rear screen to display corporate videos of the 50th-anniversary event and the Alcoy campus. To add realism, a flock of pigeons and a zeppelin with the 50th anniversary logo were added, referencing the Japanese series "Alice in Borderland." Once the modeling was finished, texturing began, cutting out images of posters, manholes, doors, etc., from the photographs taken. In this case, we worked in Adobe Photoshop version 25 to crop each texture part. Alpha channels had to be prepared for those textures with transparent parts, such as backgrounds or grilles. Similarly, normal maps had to be generated to add more realism to those textures with roughness, such as the cement in the square. See Figure 13. To texture the 3D models, the Standard (Legacy) material type was used to place the Diffuse, Opacity, and Bump (Normal) maps. In those 3D models with different textures, the Multi/Sub-Object material type was used to identify the parts of the model and their corresponding textures by ID. See Figure 14. The trees (two elms and two olive trees) were imported from 3D model libraries and already came with opacity maps to define their leaves. To finish the texturing, two planes were created at both entrances to the square, with applied photographs of the Museo Cada and the Salesian school, adjacent to the Ferrándiz and Carbonell buildings. These planes delimit the visitable area of the immersive space. See Figure 15. The appearance of the event’s banners is planned to be animated, so they have been modeled in 3ds Max but will be textured in Unity.

2.4. Optimization of Models and Materials with Unity. Creation of Interactions and Animations

Unity is a real-time rendering engine widely used in the interactive software development industry, standing out for its versatility and ability to generate highly immersive three-dimensional environments. Its relevance in the construction of MV-oriented applications lies in its architecture optimized for the creation and management of interactive virtual spaces, allowing the implementation of immersive experiences on multiple platforms. Unity technology enables the development of dynamic virtual worlds, positioning it as an essential tool in the production of content for video games, VR, AR, and MR, fundamental disciplines in the configuration of the MV. Likewise, its graphics engine and advanced scripting systems allow for the customization of digital experiences with a high degree of interaction and adaptability to various devices and computational environments. In the context of developing immersive experiences and three-dimensional environments, Unity maintains a close relationship with Spatial.io, although both platforms perform different functions within the digital content generation ecosystem. Spatial.io employs Unity as the technological infrastructure for building its virtual environments, leveraging its capability for real-time rendering and the simulation of complex interactions. Thanks to this integration, the environments generated in Spatial.io present a high degree of visual realism and optimized interaction capabilities for social, collaborative, and commercial applications within the MV. While Unity provides the computational foundation for the generation and manipulation of these three-dimensional environments, Spatial.io facilitates their accessibility and use within a multi-user interaction context.
In the implementation of the digital twin on the platform, version 2021.3.21f1 of Unity has been used. For the correct configuration of the environment, it is necessary to import the UI Menu Spatial package into the project. This import provides a series of preconfigured assets that can be used as a reference in the construction of the virtual environment. It is recommended to open the corresponding level and transfer all the elements of the hierarchy to the development project, with the aim of using them as a structural template for the implementation of the final design. For these reasons, the integration of Spatial with Unity has been chosen over other well-known game engines like Unreal or Nvidia Omniverse. For this scenario, a component developed by Spatial has been used to enhance user interactivity with the space: the Quest component, which allows us to create a mission system with as many objectives as we want. In our case, the mission system has been used to provoke the user’s call to action and to encourage the complete exploration of the scenario. See Figure 16.
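To make the mission logic more concrete, the following C# sketch shows a minimal objective tracker of the kind that could sit behind such a system. It is a simplified, hypothetical stand-in for Spatial's Quest component (whose actual API is not reproduced here): objectives are completed in order, and finishing one unlocks the next.

```csharp
using System;
using UnityEngine;

// Minimal, hypothetical mission tracker inspired by the Quest component described above.
// It does NOT reproduce the Spatial Creator Toolkit API; it only sketches the idea of
// sequential objectives that unlock one another.
public class MissionTracker : MonoBehaviour
{
    [Serializable]
    public class Objective
    {
        public string description;   // e.g. "Walk through the illuminated cylinder"
        public bool completed;
    }

    [SerializeField] private Objective[] objectives;
    private int current = 0;

    // Called by trigger zones in the scene when the player reaches a goal.
    public void CompleteCurrentObjective()
    {
        if (current >= objectives.Length) return;

        objectives[current].completed = true;
        Debug.Log($"Objective completed: {objectives[current].description}");
        current++;

        if (current >= objectives.Length)
            Debug.Log("All missions completed - celebration unlocked!");
        else
            Debug.Log($"Next mission unlocked: {objectives[current].description}");
    }

    public bool IsActive(int index) => index == current;
}
```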
Mission 1: The user must pass through the illuminated cylinder to discover the commemorative banners of the 50th anniversary event. To do this, they must activate the collision trigger defined in the scene. Upon collision, it will activate the banner animation and indicate to the mission system that the objective has been completed, thereby unlocking the next mission. See Figure 17.
Mission 2: The user must find the three entrances to the buildings of the Alcoy Campus (1 in the Ferrándiz building and 2 in the Carbonell building). A similar element to the previous one (Trigger Event) is reused to register the collision and increase the mission log counter. See Figure 18.
Mission 3: The user must get on stage to complete the final mission. In this case, a Trigger Event collision zone has been set up on the stage to activate the event that checks if the user has completed the objective. Upon completion, a celebration message will appear. See Figure 19.
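All three missions rely on the same pattern: an invisible collision volume that reacts when the avatar enters it. The sketch below shows one way such a trigger zone could be wired in Unity; the component name, the "Player" tag, the banner Animator trigger, and the link to the hypothetical MissionTracker above are illustrative assumptions, not the project's actual scripts.

```csharp
using UnityEngine;

// Hypothetical trigger zone for the missions described above (illuminated cylinder,
// building entrances, stage). Requires a Collider with "Is Trigger" enabled.
[RequireComponent(typeof(Collider))]
public class MissionTriggerZone : MonoBehaviour
{
    [SerializeField] private MissionTracker tracker;   // the sketch from earlier in this section
    [SerializeField] private Animator bannerAnimator;  // optional: banner reveal (Mission 1)
    [SerializeField] private string animatorTrigger = "RevealBanners";

    private bool alreadyFired = false;

    private void OnTriggerEnter(Collider other)
    {
        // React only once, and only to the avatar (assumed to carry the "Player" tag).
        if (alreadyFired || !other.CompareTag("Player")) return;
        alreadyFired = true;

        // Mission 1: play the banner animation when the cylinder is crossed.
        if (bannerAnimator != null)
            bannerAnimator.SetTrigger(animatorTrigger);

        // Notify the mission log so the next objective is unlocked.
        if (tracker != null)
            tracker.CompleteCurrentObjective();
    }
}
```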
To optimize navigability, all the textures used in the scene were converted to power-of-two textures of 1024 x 1024 pixels. Both modeling and texturing must be handled with video game optimization in mind, to avoid issues later during navigation in the MV. To add more interactivity to the scene, seating areas have been included on benches and planters using the Seat HotSpot component. This adds more realism to the scene and makes it easier, in the case of giving talks or training sessions on the virtual stage, for people to remain seated. See Figure 20.
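The texture cap can also be automated at import time. The editor script below is a small sketch (not part of the published project) that uses Unity's standard AssetPostprocessor to limit every imported texture to 1024 pixels, mirroring the manual optimization described above.

```csharp
using UnityEditor;
using UnityEngine;

// Editor-only sketch: clamps imported textures to 1024 px, enables mipmaps, and compresses them,
// mirroring the manual power-of-two optimization described in the text.
// Place in an "Editor" folder; an illustrative assumption, not the project's actual pipeline.
public class TextureSizeClamp : AssetPostprocessor
{
    private void OnPreprocessTexture()
    {
        var importer = (TextureImporter)assetImporter;
        importer.maxTextureSize = 1024;      // cap at 1024 x 1024
        importer.mipmapEnabled = true;       // smoother rendering at a distance
        importer.textureCompression = TextureImporterCompression.Compressed;
    }
}
```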

2.5. Spatial.io: Insertion of Multimedia Material

In the final phase of the creation process, we export the scene from Unity to Spatial. From the Spatial Portal tab in Unity, we access the Creator Kit menu. From there, we can configure the name of the scene and the thumbnail that will be displayed when it starts in Spatial. Beforehand, we need to be registered with the details of our Spatial account. Using the Publish button, the process of publishing on the network begins. Normally, after 20-30 minutes, we receive a confirmation email indicating that our scenario has been published. See Figure 21.
Spatial.io is an advanced platform that redefines virtual collaboration by creating immersive 3D spaces where users can meet, work, and interact. Leveraging the power of AR and VR, Spatial.io offers a unique and intuitive environment that enhances communication and productivity. It allows the design of personalized virtual spaces where users can create virtual environments tailored to their needs, such as galleries to showcase art, virtual offices for team collaboration, or interactive educational spaces. The platform allows the integration of 3D models and assets from various design programs, making it easier for artists, designers, and architects to incorporate their creations into the virtual environment. Additionally, Spatial.io supports simultaneous collaboration of multiple users in the same virtual space, ideal for brainstorming sessions, joint projects, or collaborative design reviews. Users can build immersive presentations and demonstrations, useful for proposals, educational seminars, or product exhibitions, offering more engaging experiences. Every aspect of the virtual space is customizable, from the layout and design to specific features and functionalities, ensuring that the environment adapts to the unique needs of each project. Spatial.io integrates with various tools and platforms, allowing the incorporation of documents, videos, and interactive widgets into virtual spaces, which enhances the functionality and interactivity of the environment. The platform allows for the hosting of events, workshops, and experiences within virtual spaces, with robust event management features, including spatial audio and attendee management, to ensure smooth and engaging virtual experiences. Virtual spaces on Spatial.io can be easily shared via links, making it simple to invite collaborators, clients, or audiences to explore and interact in the space without complex setup processes. Additionally, Spatial.io offers a wide range of premium templates without the need for coding, allowing creators to design interactive spaces easily. The platform is compatible with VR, AR, web, and mobile devices, ensuring a smooth and accessible experience from any device.
Spatial has been chosen over other platforms like Roblox or Meta Horizon because it is specifically designed to create professional and collaborative environments in the MV, making it ideal for virtual meetings, digital art galleries, exhibitions, and business events. It allows importing high-quality 3D models and uses advanced rendering that provides more realistic graphics compared to the more cartoonish and simplified style of Roblox. Additionally, it is optimized for VR and AR experiences, making it an ideal choice for immersive presentations, virtual tours, and more advanced interactive experiences. Spatial.io allows users to create experiences without the need for programming, thanks to no-code building tools and pre-designed templates. Roblox, for example, requires knowledge of Lua (Roblox Studio) to develop more advanced experiences, which represents a barrier for users without programming experience. Spatial.io is accessible through web browsers, VR devices (Meta Quest, HTC Vive), and mobile devices, without the need to install additional software, whereas Roblox requires installing the specific application for each platform and does not have direct browser support for advanced experiences. Ultimately, we chose Spatial.io because it is the best option for creating our platform for virtual events, collaborative work, exhibitions, and high-quality virtual environments. On the other hand, as a platform for video games or monetization, Roblox would have been a better option. A low-angle view of the initial point of the stage can be seen in Figure 22.
In this final section, we upload all the multimedia content online; in our case, the videos for the two stages: one with the presentation of the Alcoy campus and another with the commemorative video of the 50th anniversary. One inconvenience of working with Spatial is that it does not allow inserting this content from Unity. A component called Empty Frame has to be inserted in those places where we want to add images, PowerPoint presentations, PDFs, or videos, and then, once the scene is published, the content is inserted online. The advantage is that we can edit this content without having to republish the scenario. See Figure 23.
One of the main advantages of Spatial.io is its MV-adapted interface with socialization tools such as text or voice chat, along with the inclusion of emoticons or dance animations for our avatar. From its Share icon, we can make calls to the entire community via email or social media to announce an event. As we have seen in the introduction, many companies have created their MV with a very vertical structure with high implementation and maintenance costs. We believe that Spatial.io is a more horizontal and accessible tool for all types of audiences, allowing any business, entity, or organization to create their own space in the MV and share it with the community.

2.6. Research Design

To validate the versions of the digital twin in this research, Meta Quest 3 glasses, a PC, and an iPhone X were used. Before each test using the VR glasses, a technician conducted a simple tutorial, explaining the purpose of the test. The way to put on the glasses and use the controllers and their buttons was explained, as well as how to navigate through the space and its menus. Specific tasks were established to evaluate the experience, and the tools for socialization and avatar creation were described, as well as those for image and video capture. Users were warned about the possibility of feeling dizzy and were advised to make movements slowly and gently. During the test, the technician virtually accompanied each user with their own avatar, to prevent them from getting lost in the journey and to ensure they knew how to correctly use the Spatial tools. Once users became comfortable, they were allowed to navigate freely through the environment, without any time limit. In this way, three phases were carried out in the test: a first phase with directed tasks, where specific actions were assigned to evaluate key functionalities; a second, where the user was allowed to navigate freely through the environment without restrictions; and a third with real-time feedback, where reactions were observed and verbal comments were recorded. The test concluded the moment the user requested to leave the scenario. The same process was carried out for the application tests from the laptop and the iPhone. They were introduced into the same scenario, so they would coexist with the person who was simultaneously taking the test with the Meta Quest 3. The tests were conducted individually on each of the platforms. For the tests with the glasses and the mobile phone, the campus Wi-Fi network was used, at a speed of 500 Mb/s, and for the PC, the 1 Gb/s LAN network was used. In the first phase (tester version), the digital twin was evaluated by 10 colleagues from the Department of Graphic Expression. With the feedback from our colleagues, a second version (student version) was created, which was evaluated by 265 students from the Industrial Design Degree.

2.7. Data Analysis

To validate the methodology employed in the trials, interviews were conducted with the students and tests were carried out to evaluate the degree of satisfaction with the immersive experience using VR glasses (Case 1), the PC (Case 2), and the iPhone X (Case 3). Each user was given two standard questionnaires: the IPQ (Igroup Presence Questionnaire) and the SUS (System Usability Scale). The IPQ is a questionnaire designed to measure people’s sense of presence in virtual environments, such as VR. Presence refers to the subjective sensation of "being" truly in a virtual environment, beyond simply interacting with a digital simulation. This questionnaire was developed to evaluate this immersive experience and is widely used in VR studies and interactive digital environments. The IPQ is structured into 14 questions that evaluate different aspects of the presence experience: (1) Spatial Presence evaluates the perception of "being physically in another place" while in the virtual environment; (2) Involvement measures the level of concentration and interest in the virtual environment, where greater participation suggests that the user has become emotionally "immersed" in the experience; (3) Experienced Realism captures the perception of realism in the virtual environment and evaluates whether users feel that the environment is "believable" or similar to the real world; and (4) General Presence is a more global measure that complements the other dimensions and allows for an overall view of the presence experience. The IPQ is used to evaluate the effectiveness of virtual environments in generating an immersive experience, to compare different VR or AR technologies to see which ones generate a greater sense of presence, and to improve the design of immersive experiences in video games, training simulations, exposure therapy, and educational applications. The 14 questions of the IPQ questionnaire are detailed in Table 1. The measurement method used for this questionnaire is a 7-point Likert scale. In this article, the variables of each test are analyzed using the mean and standard deviation of the scores of the questions in each test.
For its part, the SUS is a 10-question questionnaire designed to evaluate the usability of systems and technological products, such as websites, applications, and devices. It was developed in 1986 by John Brooke and is widely used because it is quick, easy to administer, and effective in obtaining a general measure of usability. The 10 questions are detailed in Table 2, and users must answer them after interacting with the system or product. The questions alternate between positive and negative statements to reduce bias and cover aspects of system usability and learnability. Each statement is evaluated on a 5-point Likert scale, ranging from 1 (strongly disagree) to 5 (strongly agree). The SUS provides a score from 0 to 100. It does not represent a percentage, but rather allows for the comparison of the relative usability of different systems and determines whether a system is easy or difficult to use (see Figure 24). The formula, in Brooke's standard form, can be expressed mathematically as follows:
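$$ \mathrm{SUS} = 2.5 \left[ \sum_{i \in \{1,3,5,7,9\}} (Q_i - 1) \; + \; \sum_{j \in \{2,4,6,8,10\}} (5 - Q_j) \right] $$

That is, each odd-numbered item contributes its score minus one, each even-numbered item contributes five minus its score, and the sum is scaled by 2.5 to the 0-100 range.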
In this formula, the variables Q1 to Q10 are the answers to the 10 questions. To evaluate the responses of all the tests, the mean and standard deviation are calculated. See Table 2.
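As a worked example of how the scoring and the reported mean and standard deviation could be computed, the short C# sketch below applies the formula above to a set of 10-item responses. It is an illustrative helper, not part of the published application; the class name and the sample data are assumptions.

```csharp
using System;
using System.Linq;

// Illustrative SUS scoring helper (not part of the published application).
// Each response is an array of 10 Likert answers in the range 1..5.
public static class SusScoring
{
    public static double Score(int[] answers)
    {
        double sum = 0;
        for (int i = 0; i < 10; i++)
        {
            // Odd-numbered items (Q1, Q3, ...) contribute (answer - 1);
            // even-numbered items (Q2, Q4, ...) contribute (5 - answer).
            sum += (i % 2 == 0) ? answers[i] - 1 : 5 - answers[i];
        }
        return 2.5 * sum;   // final score on a 0-100 scale
    }

    public static void Main()
    {
        // Hypothetical sample of three respondents.
        int[][] responses =
        {
            new[] { 5, 1, 5, 2, 4, 1, 5, 1, 4, 2 },
            new[] { 4, 2, 4, 1, 5, 2, 4, 2, 5, 1 },
            new[] { 5, 2, 5, 2, 5, 1, 4, 1, 5, 2 }
        };

        double[] scores = responses.Select(Score).ToArray();
        double mean = scores.Average();
        double std = Math.Sqrt(scores.Select(s => (s - mean) * (s - mean)).Average());

        Console.WriteLine($"Mean SUS = {mean:F2}, standard deviation = {std:F2}");
    }
}
```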

3. Results

In this section, we describe the application of the event for the MV generated with the digital twin. In Figure 25, a pair of users can be seen taking the test. The two users previously completed the technician’s tutorial to familiarize themselves with the application. Upon entering the immersive space, users could freely navigate the digital twin, interact with other users, and complete missions. They were connected with the same Spatial.io account so they could interact with each other simultaneously from different devices. Figure 26, Figure 27, Figure 28 and Figure 29 show details of the virtual stage of the digital twin, where the facades of the Ferrándiz and Carbonell buildings can be seen, as well as all the elements of the square: urban furniture, statue, etc. The elements that identify the event are also visible, namely the banners and the zeppelin with the 50th anniversary logo. The figures also show how the avatar can sit on benches or planters, and how it can go about completing missions to uncover the final celebration. It is important to highlight the photorealistic finish of the application, as well as its fluidity of movement. At no point were any stutters or hitches detected during the tests, and the frame rate never dropped below 25 frames per second. The realism and the sense of immersion were the aspects most commented on by the users.
Different behaviors were detected: in the case of the VR glasses user, the sense of immersion was greater, along with a certain dizziness. This user spent more time observing the details of the modeling and textures. In contrast, the PC and iPhone user spent more time experimenting with the running and jumping tools, as well as the socialization features of the Spatial platform: chatting, performing dances, and sending emoticons. The VR user focused more on exploring and moving through the space, discovering what was around them. Between both users, the interaction was smooth. They sent emoticons and text messages, engaging in a conversation about the appearance of the stand, thus enhancing the virtual experience. Difficulties were detected on the part of the VR user when handling some menus and icons, as well as the aforementioned dizziness when navigation was too abrupt.
The analysis was conducted for a total of 265 users. The demographic balance was as follows: 134 indicated they were men and 131 indicated they were women. By age, the group of students was in the range between 20 and 24 years old. 56.98% stated that they used these new technologies daily, for leisure in video games or watching movies, and 18.86% stated that they used them occasionally. Finally, they were asked if they had already had experiences with the MV, and 37.73% responded affirmatively. The data obtained from the IPQ can be appreciated in the following figures. In Figure 30, the mean and standard deviation for each question of the IPQ are indicated, and each subscale of the questionnaire is expressed as a percentage with respect to the mean and the standard deviation obtained in the questionnaires. See Figure 30.
The obtained data indicates that 92.86% of the respondents experienced a General Presence, with a deviation of 0.76%. This indicates that the users had a high sense of immersion within the Visual Environment (VE). Regarding Spatial Presence, a score of 83.16% with a deviation of 0.88% was obtained, indicating that users felt physically present within the digital twin of the stand. 82.91% of the respondents, with a deviation of 0.92%, stated that they had a high level of engagement within the VE. Regarding Experienced Realism, 87.76% of respondents, with a deviation of 1.04%, stated that they felt immersed within the VE. These data support our first objective, to create a digital twin of the Alcoy campus so that students, professors, and visitors can hold workshops, conferences, and events to enhance the dissemination of knowledge. Regarding the data obtained from the SUS questionnaire, the overall perceived usability scored 88.39 out of 100 (range from 74 to 100 with a standard deviation of 8.7), indicating a very high level of usability for the proposed application. In Figure 31, the results of each question from the SUS questionnaire, defined in Table 2, are visually represented. It is worth noting that the majority of users expressed their willingness to use this application frequently, highlighting its ease of use. The tested students indicated that the setting and missions of the virtual event were well integrated, ensuring a coherent and smooth experience. Additionally, the respondents indicated confidence in their interactions with the application. See Figure 31.

4. Discussion

The public presentation of the alpha version of the developed Digital Twin took place on April 7, 2024, at the 50th Anniversary Gala of the union between the Industrial School of Alcoy and the Polytechnic University of Valencia. A small stand was prepared with a PC and Meta Quest 3 glasses, both connected to the Alcoy campus Wi-Fi network. The two devices were connected to the same Spatial account, and the PC’s video output was connected to a 50-inch television screen so that the audience present at the gala could see the same virtual space as the people testing the application. See Figure 32.
A technician informed each participant beforehand about the operation of the glasses and their controls. He entered with the users, using a mobile device (iPhone X), and acted as a guide to explain the functioning of the DT application. Once the technician determined that the user was navigating the interface with ease, they were given complete freedom to test each of the functions. The DT application of the Alcoy campus was presented as a base platform for the dissemination of knowledge by each of the departments through the virtual space. The creation of conferences, workshops, and virtual events facilitates access to this knowledge for any user who connects online to the application from Spatial.io. After the experience, a technical questionnaire was provided to the attendees, with five qualitative questions and two quantitative questions. See Table 3. Forty random users aged between 16 and 70 completed this survey, which aimed to measure their level of satisfaction during the experience and their assessment of the quality of the DT.
Users could respond on a scale from 1 ("Very Unsatisfactory") to 5 ("Very Satisfactory"). The average score obtained was 4.5, with a standard deviation of 0.7, which indicates that users were satisfied with the experience. Only six respondents rated any question with a value of 1, as they encountered difficulties navigating with the VR glasses. 85% of the users expressed satisfaction with the use and navigation of the digital twin and its missions, and only two had problems completing them. This revealing data will be used to improve the explanatory tutorial and to add visual aids within the virtual environment, to help users better manage the internal tools of the application. Overall, the experience for most users was satisfactory, but in discussions following the experience, improvements were suggested, such as voice-over and interactive aids, a virtual tour explaining the application’s functionalities, and graphical enhancements. Initial comments reflected a surprise effect upon entering the application, as users saw a highly detailed replica of a space they knew in reality. As users explored the space, they pointed out options for improvement: "It’s very fun and useful for visualizing the square and the campus buildings in real time, many activities can be organized here, but it makes you a bit dizzy." "Can this type of application be used for other events like concerts or video games?" "I was a bit lost at first, but then I felt completely integrated into the square. It’s a very useful tool." These valuable responses were used for subsequent improvements, especially in the preliminary tutorial. Explanatory text was removed from it and more graphic information was added. Explanatory signs were also inserted to indicate the functions of the keyboard to the user. In Figure 33, some examples of the new explanatory signs added can be seen.
These changes aim to optimize navigation and the user experience within the immersive environment. In the future, the expansion of the number and diversity of users is projected, as well as the use of VR devices, with the purpose of analyzing the interaction between multiple individuals and the possible communicative dynamics that may arise through chat, voice, or other means. Likewise, the updates and improvements implemented in the Spatial Toolkit Creator will be evaluated and, if deemed relevant, incorporated into the corresponding applications. It is worth noting that the applications developed in this study have high potential for the educational field, as they enable the dissemination of knowledge in an interactive and innovative manner [60]. As future lines of research, the industrial application of these DT is foreseen [61,62], as well as their application in sporting events within the University [63,65]. Their application in the field of health, such as coordination with medical leave for students and professors [66,67] and expansion to other campus buildings [68,71]. However, it is essential to consider aspects related to user privacy and data protection. In the current version of the application, the tests have been conducted anonymously; however, in the future, the collection of user data may be required by clients, making compliance with current data protection regulations essential. While this technology promotes global connectivity, accessibility, and inclusion, it is necessary to emphasize that it will not completely replace human interactions in their entirety. On the other hand, the environmental impact and sustainability of display devices in the future must be considered. The evolution of these devices should be oriented towards more eco-friendly and recyclable materials, following a trend similar to that observed in the mobile phone industry, where more compact devices with sustainable materials have been developed. In this regard, Meta has developed a prototype of VR glasses, called Orion, see Figure 34, which feature a lighter, more efficient, and sustainable design compared to the current Meta Quest 3. Although their initial cost is high, they are expected to eventually become everyday devices capable of integrating multiple functionalities, replacing devices such as mobile phones, smartwatches, and headphones, with the incorporation of VR and AR experiences.

5. Conclusions

This article analyzes the application of new technologies in the modernization and optimization of the educational and events sectors. Through the implementation of the Spatial Creator Toolkit v1.60, released in October 2024, its usefulness as a tool for content creation in the MV has been demonstrated, specifically in the generation of a digital twin of the Alcoy campus. The empirical tests conducted have validated the methodology for the reproduction and design of buildings, complemented by immersive simulation tools and technologies typical of video game environments. Moreover, these technologies not only serve as a resource for knowledge dissemination but also enhance participation, immersion, and the overall user experience. The results of this study suggest that they are applicable to various fields, especially educational events, increasing the appeal of subjects and promoting gamification-based learning. Likewise, the use of digital twins contributes to improved usability and user experience, adding value to the developed virtual environments.

Future lines of research will be directed towards the application of this knowledge in other campus departments, such as materials science or electrical engineering, with the aim of further customizing workshops, conferences, and events and optimizing the transmission of scientific information. The potential of these technologies is considerable, and their implementation in various sectors is expected to favor the transition towards Industry 5.0. This new industrial stage emphasizes collaboration and synergy between humans and automated systems, promoting more efficient production and an industry focused on human well-being and sustainability. Unlike Industry 4.0, characterized by automation, digitalization, and artificial intelligence, Industry 5.0 incorporates human creativity and personalization into industrial processes, integrating considerations of social well-being and environmental sustainability.

Another objective of future research is the development of a manual of best practices for content creation in the MV. This guide will establish the fundamental principles for the design of virtual environments, addressing aspects such as composition, color harmony, and usability. Analogous to architectural or industrial design, where factors such as color, materials, ergonomics, and functionality are studied, the construction of spaces in the MV will require the adoption of specific design standards. The application of these guidelines will allow conventional virtual environments to be distinguished from those that provide an exceptional experience.

Author Contributions

Conceptualization, V.J. and S.S.; methodology, V.J.; software, V.J.; validation, V.J., S.S. and S.F.; formal analysis, V.J. and S.F.; investigation, V.J. and S.S.; resources, V.J.; data curation, V.J., S.S. and S.F.; writing—original draft preparation, V.J.; writing—review and editing, V.J., S.S. and S.F.; visualization, V.J.; supervision, S.S. and S.F.; project administration, V.J. All authors have read and agreed to the published version of the manuscript.

Funding

Funding for open access charge: Universitat Politècnica de València.

Institutional Review Board Statement

"Not applicable".

Informed Consent Statement

Not applicable.

Data Availability Statement

Data is contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
DT Digital Twins
MV Metaverse
VR Virtual Reality
AR Augmented Reality
MR Mixed Reality
AI Artificial Intelligence
IoT Internet of Things
NFTs Non-Fungible Tokens
VE Virtual Environment

References

  1. Dayoub, B.; Yang, P.; Omran, S.; Zhang, Q.; Dayoub, A. Digital Silk Roads: Leveraging the Metaverse for Cultural Tourism within the Belt and Road Initiative Framework. Electronics 2024, 13, 2306. [CrossRef]
  2. Buhalis, D.; Lin, M.S.; Leung, D. Metaverse as a driver for customer experience and value co-creation: implications for hospitality and tourism management and marketing. International Journal of Contemporary Hospitality Management 2023, Vol. 35 No. 2, 701–716. [CrossRef]
  3. Xuewei, W.; Xiaokun, Y. Current situation and development in applying metaverse virtual space in field of fashion[J].Journal of Textile Research, 2024, 45(04): 238-245.
  4. Wu, R.; Gao, L.; Lee, H.; Xu, J.; Pan, Y. A Study of the Key Factors Influencing Young Users’ Continued Use of the Digital Twin-Enhanced Metaverse Museum. Electronics 2024, 13, 2303. [CrossRef]
  5. Hashash, O.; Chaccour, C.; Saad, W.; Sakaguchi, K.; Yu, T. (2022). Towards a Decentralized Metaverse: Synchronized Orchestration of Digital Twins and Sub-Metaverses.
  6. S. K. Jagatheesaperumal et al., "Semantic-Aware Digital Twin for Metaverse: A Comprehensive Review," in IEEE Wireless Communications, vol. 30, no. 4, pp. 38-46, August 2023.
  7. Aung,N;Dhelim,S;Ning, H. et al. Web3-enabled Metaverse: The Internet of Digital Twins in a Decentralised Metaverse. TechRxiv. January 02, 2024.
  8. Grieves, M., Hua, E.Y. (2024). Defining, Exploring, and Simulating the Digital Twin Metaverses. In: Grieves, M., Hua, E.Y. (eds) Digital Twins, Simulation, and the Metaverse. Simulation Foundations, Methods and Applications. Springer, Cham.
  9. Adnan, M.; Ahmed, I.; Iqbal, S.; Rayyan Fazal, M.; Siddiqi, S.J.; Tariq, M. (2024) Exploring the convergence of Metaverse, Blockchain, Artificial Intelligence, and digital twin for pioneering the digitization in the envision smart grid 3.0, Computers and Electrical Engineering, Volume 120, Part B, 109709, ISSN 0045-7906. [CrossRef]
  10. Cruz, M.; Oliveira, A. Where Are We Now?—Exploring the Metaverse Representations to Find Digital Twins. Electronics 2024, 13, 1984. [CrossRef]
  11. Xing,H; Minyu,C; Yuezhong,T; Qian,A; Dongxia,Z; System Theory Study on Situation Awareness of Energy Internet of Things Based on Digital Twins and Metaverse (I): Concept, Challenge, and Framework[J]. Proceedings of the CSEE, 2024, 44(2): 547-560.
  12. Deng, B., Wong, I.A. and Lian, Q.L. (2024), "From metaverse experience to physical travel: the role of the digital twin in metaverse design", Tourism Review, Vol. 79 No. 5, pp. 1076-1087. [CrossRef]
  13. Kruachottiku, P.; Phanomchoeng, G.; Cooharojananone, N.; Kovitanggoon, K.; Tea-makorn, P. "ChulaVerse: University Metaverse Service Application Using Open Innovation with Industry Partners," 2023 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM), Singapore, Singapore, 2023, pp. 0294-0299.
  14. Shatilov,K;Alhilal,A;Braud,T; Lik-Hang Lee, Pengyuan Zhou, and Pan Hui. 2023. Players are not Ready 101: A Tutorial on Organising Mixed-mode Events in the Metaverse. In Proceedings of the First Workshop on Metaverse Systems and Applications (MetaSys ’23). Association for Computing Machinery, New York, NY, USA, 14–20.
  15. Meliande,R.;Ribeiro,A.;Arouca,M.;Amorim,A.;Pestana,M.;Vieira, V.(2024). Meta-Education: A Case Study in Academic Events in the Metaverse. In Anais do XIX Simpósio Brasileiro de Sistemas Colaborativos, (pp. 28-41). Porto Alegre: SBC.
  16. Jeffrey,P.;Khatri,H.;Gauthier,C.;Coffey,B.;Garza,J.(2024) “Building a University Digital Twin.” 2024 IEEE International Symposium on Emerging Metaverse (ISEMV): 17-20.
  17. Sim, J.K.; Xu, K.W.; Jin, Y.; Lee, Z.Y.; Teo, Y.J.; Mohan, P.; Huang, L.; Xie, Y.; Li, S.; Liang, N.; et al. Designing an Educational Metaverse: A Case Study of NTUniverse. Appl. Sci. 2024, 14, 2559. [CrossRef]
  18. Sim, J.K.; Xu, K.W.; Jin, Y.; Lee, Z.Y.; Teo, Y.J.; Mohan, P.; Huang, L.; Xie, Y.; Li, S.; Liang, N.; et al. Designing an Educational Metaverse: A Case Study of NTUniverse. Appl. Sci. 2024, 14, 2559. [CrossRef]
  19. Shanthalakshmi Revathy, J.;J. Mangaiyarkkarasi.(2024) "Integration of Metaverse and Machine Learning in the Education Sector." Impact and Potential of Machine Learning in the Metaverse, edited by Shilpa Mehta, et al., IGI Global,pp. 74-99.
  20. Yun, S.-J.; Kwon, J.-W.; Lee, Y.-H.; Kim, J.-H.; Kim, W.-T. A Learner-Centric Explainable Educational Metaverse for Cyber–Physical Systems Engineering. Electronics 2024, 13, 3359.
  21. Sai, S.; Prasad, M.; Garg, A.; Chamola, V. (2024) "Synergizing Digital Twins and Metaverse for Consumer Health: A Case Study Approach," in IEEE Transactions on Consumer Electronics, vol. 70, no. 1, pp. 2137-2144, Feb. 2024.
  22. Tao, F.; Zhang, M.; Nee, A.Y.C. (2019) Digital Twin Driven Smart Manufacturing, Academic Press, ISBN 9780128176306.
  23. Adamenko, D.; Kunnen, S.; Pluhnau, R.; Loibl, A.; Nagarajah, A. (2020) Review and comparison of the methods of designing the Digital Twin, Procedia CIRP, Volume 91, Pages 27-32, ISSN 2212-8271. [CrossRef]
  24. Hosseini,S.;Abbasi,A.;Magalhaes,L.G.;Fonseca,J.C.;da Costa,N.M.C.;Moreira,A.H.J.;Borges,J. (2024) Immersive Interaction in Digital Factory: Metaverse in Manufacturing. Procedia Comput. Sci. 232, C (2024), 2310–2320. [CrossRef]
  25. Lee, J.; Kundu, P. (2022) Integrated cyber-physical systems and industrial metaverse for remote manufacturing. Manufacturing Letters. Vol 34. [CrossRef]
  26. Abdlkarim, D.;Di Luca, M.;Aves,P.(2024).A methodological framework to assess the accuracy of virtual reality hand-tracking systems: A case study with the Meta Quest 2.Behav Res 2024, Vol. 56 , 1052–1063.
  27. Tu, X.; Ala-Laurinaho, R.; Yang, C.; Autiosalo, J.; Tammi, K. (2024) Architecture for data-centric and semantic-enhanced industrial metaverse: Bridging physical factories and virtual landscape, Journal of Manufacturing Systems, Volume 74, 2024, Pages 965-979, ISSN 0278-6125. [CrossRef]
  28. L. Ren et al., "Industrial Metaverse for Smart Manufacturing: Model, Architecture, and Applications," in IEEE Transactions on Cybernetics, vol. 54, no. 5, pp. 2683-2695, May 2024.
  29. Patterson E.A.(2024) Engineering design and the impact of digital technology from computer-aided engineering to industrial metaverses: A perspective. The Journal of Strain Analysis for Engineering Design. 2024;59(4):303-305. [CrossRef]
  30. Kaigom, E. (2024) Metarobotics for Industry and Society: Vision, Technologies, and Opportunities, IEEE Transactions on Industrial Informatics, Vol. 20, ISSN 1941-0050.
  31. Fernández-Caramés, T.M.; Fraga-Lamas, P. "Forging the Industrial Metaverse for Industry 5.0: Where Extended Reality, IoT, Opportunistic Edge Computing, and Digital Twins Meet," in IEEE Access, vol. 12, pp. 95778-95819, 2024.
  32. Sai, S.; Prasad, M.; Upadhyay, A.; Chamola, V.; Herencsar, V. "Confluence of Digital Twins and Metaverse for Consumer Electronics: Real World Case Studies," in IEEE Transactions on Consumer Electronics, vol. 70, no. 1, pp. 3194-3203, Feb. 2024. [CrossRef]
  33. Allam, Z.; Sharifi, A.; Bibri, S.E.; Jones, D.S.; Krogstie, J. The Metaverse as a Virtual Form of Smart Cities: Opportunities and Challenges for Environmental, Economic, and Social Sustainability in Urban Futures. Smart Cities 2022, 5, 771-801. [CrossRef]
  34. Zainab, H.e.; Bawanay, N.Z. "Digital Twin, Metaverse and Smart Cities in a Race to the Future," 2023 24th International Arab Conference on Information Technology (ACIT), Ajman, United Arab Emirates, 2023, pp. 1-8.
  35. Gallist, N.; Hagler, J.Tourism in the Metaverse: Digital Twin of a City in the Alps. MUM ’23: Proceedings of the 22nd International Conference on Mobile and Ubiquitous Multimedia 2024, 568–570.
  36. Remolar, I.; Chover, M.;Quirós, R.;Gumbau, J.; Castelló, P.;Rebollo, C.;Ramos, J. F. Design of a Multiuser Virtual Trade Fair Using a Game Engine. Transactions on Computational Science 2011, Vol.12.
  37. Martín Ramallal,P.;Sabater-Wasaldúa,J.;Ruiz-Mondaza,M.(2022). Metaversos y mundos virtuales, una alternativa a la transferencia del conocimiento: El caso OFFF-2020. Fonseca, Journal of Communication, (24), 87–107.
  38. Valaskova,K.;Vochozka,M.;Lăzăroiu,G.(2022).“Immersive 3D Technologies, Spatial Computing and Visual Perception Algorithms, and Event Modeling and Forecasting Tools on Blockchain-based Metaverse Platforms,” Analysis and Metaphysics 21: 74–90.
  39. Zhu,H.(2022).A Metaverse Framework Based on Multi-scene Relations and Entity-relation-event Game. 2203.10424.
  40. Samarnggoon, K.; Grudpan, S.; Wongta, N.; Klaynak, K. Developing a Virtual World for an Open-House Event: A Metaverse Approach. Future Internet 2023, 15, 124. [CrossRef]
  41. Choi,M.;Choi,Y.;Nosrati,S.;Hailu,T. B.;Kim, S.(2023). Psychological dynamics in the metaverse: evaluating perceived values, attitude, and behavioral intention in metaverse events. Journal of Travel & Tourism Marketing, 40(7), 602–618. [CrossRef]
  42. Travassos,A.;Rosa,P.;Sales,F. Influencer Marketing Applications Within the Metaverse. Influencer Marketing in the Digital Ecosystem 2023 Vol. 82, 117–131.
  43. Bonales Daimiel,G.;Pradilla Barrero,N. Urban art in the metaverse.New creative spaces. Street Art and Urban Creativity 11, n.o 1 (2025): 137-50.
  44. Rafique,W.;Qadir,J.(2024). Internet of everything meets the metaverse: Bridging physical and virtual worlds with blockchain. Elsevier Science Publishers B. V.. Vol. 54.
  45. He, W;Li,X.;Xu,S.;Chen,Y.;Sio,C.;Kan,G.L.;Lee,L. (2024) MetaDragonBoat: Exploring Paddling Techniques of Virtual Dragon Boating in a Metaverse Campus. Association for Computing Machinery. Isbn 9798400706868.
  46. Barta,S.;Ibáñez-Sánchez,S.;Orús,C.;Flavián,C.(2024).” Avatar creation in the metaverse: A focus on event expectations”. Computers in Human Behavior. Vol 156. 108192.ISSN 0747-5632. [CrossRef]
  47. Ki,C.;Chong,S.;Aw,E.;Lam,M.; Wong, C. Metaverse consumer behavior: Investigating factors driving consumer participation in the transitory metaverse, avatar personalization, and digital fashion adoption. Journal of Retailing and Consumer Services 2024, Vol. 82. 707 .
  48. Lau, J. X.;Ch’ng, C. B.;Bi, X.;Chan, J. H.(2024). Exploring attitudes of gen z towards metaverse in events industry. International Journal of Sustainable Competitiveness on Tourism, 3(01), 49–53. [CrossRef]
  49. Nesaif,B.M.R.B.(2024). The Influence of Virtual Events on Metaverse Commercial Real Estate Values: A Review. International Journal of Business Strategies, 9(1), 31–48.
  50. Lo, F.; Su, C.; Chen, C. (2024). Identifying Factor Associations Emerging from an Academic Metaverse Event for Scholars in a Postpandemic World: Social Presence and Technology Self-Efficacy in Gather.Town. Cyberpsychology, Behavior, and Social Networking. Vol. 27. Mary Ann Liebert, Inc., publishers.
  51. Flavián,C.;Ibáñez-Sánchez,S.;Orús,C.;Barta,S. (2024). “The dark side of the metaverse: The role of gamification in event virtualization”. International Journal of Information Management. Vol. 75. 102726, ISSN 0268-4012. [CrossRef]
  52. Singh,S., et al. "Role of Metaverse in Events Industry: An Empirical Approach." In New Technologies in Virtual and Hybrid Events, edited by Sharad Kumar Kulshreshtha and Craig Webster, 165-184. Hershey, PA: IGI Global, 2024.
  53. Mencarini,E.;Rapp,A.;Colley,A;Daiber,F.;Jones,M.D.;Kosmalla,F.;Lukosch,S.;Niess,J.;Niforatos,E.; Wozniak,P.W.;Zancanaro,M.;(2022). New Trends in HCI and Sports. Association for Computing Machinery.
  54. Chen,S.S.;Zhang,J.J.(2024). Market demand for metaverse-based sporting events: a mixed-methods approach. Sport Management Review, 28(1), 121–147. [CrossRef]
  55. Dolğun, O.C.; Gökören, V.; Hakan, G.; Halime, D.; Argan, M. (2024) "Immersion In Metaverse Event Experience: A Grounded Theory". Sportive 7, no. 2, 288-307.
  56. Kumari,V.;Bala,P.K.;Chakraborty,S. (2024). A text mining approach to explore factors influencing consumer intention to use metaverse platform services: Insights from online customer reviews. Journal of Retailing and Consumer Services. Vol 81. [CrossRef]
  57. Garcia,M;Quejado,C.K;Maranan,C.R.B.; Ualat,O;Adao;R. 2024. Valentine’s Day in the Metaverse: Examining School Event Celebrations in Virtual Worlds Using an Appreciative Inquiry Approach. In Proceedings of the 2024 8th International Conference on Education and Multimedia Technology (ICEMT ’24). Association for Computing Machinery, New York, NY, USA, 22–29.
  58. Ramadan, Z. (2023) Marketing in the metaverse era: toward an integrative channel approach. Virtual Reality, Vol. 27, 1905–1918.
  59. Sahdev,S.L.et al. "An Analysis on the Future Usage of Metaverse in the Marketing Event Industry in India: An ISM Approach." In New Technologies in Virtual and Hybrid Events, edited by Sharad Kumar Kulshreshtha and Craig Webster, 147-164. Hershey, PA: IGI Global, 2024.
  60. Flores-Castañeda,R.O.;Olaya-Cotera,S.;Iparraguirre-Villanueva,O. (2024). Benefits of Metaverse Application in Education: A Systematic Review. International Journal of Engineering Pedagogy (iJEP), 14(1), pp. 61–81. [CrossRef]
  61. Nleya,S.M.;Velempini,M. (2024) Industrial Metaverse: A Comprehensive Review, Environmental Impact, and Challenges. Appl. Sci., 14, 5736.
  62. Dhanda, M.; Rogers, B.A.; Hall, S.; Dekoninck, E.; Dhokia, V. Reviewing human-robot collaboration in manufacturing: Opportunities and challenges in the context of Industry 5.0, Robotics and Computer-Integrated Manufacturing, Volume 93, 2025, 102937, ISSN 0736-5845.
  63. Li, Z.;Jia, L. (2025). Bridging the Virtual and the Real: The Impact of Metaverse Sports Event Characteristics on Event Marketing Communication Effectiveness. Communication & Sport, 0(0).
  64. Kumar, A. "The Metaverse Touchdown Revolutionizing Sporting Events." Internationalization of Sport Events Through Branding Opportunities, edited by Jaskirat Singh Rai, et al., IGI Global, 2025, pp. 171-194.
  65. Singh,J.;Singh,R.;Singh,J.(2025) "The Sport Metaverse: A Deep Dive into the Transformation of Sports Events." In Internationalization of Sport Events Through Branding Opportunities, edited by Jaskirat Singh Rai, Maher N. Itani, and Amandeep Singh, 103-118. Hershey, PA: IGI Global.
  66. Ingale,A.K., et al. "Technological Advances for Digital Twins in the Metaverse for Sustainable Healthcare: A Rapid Review." In Digital Twins for Sustainable Healthcare in the Metaverse, edited by Rajmohan R., et al., 77-106. Hershey, PA: IGI Global, 2025.
  67. Chaddad A.;Jiang Y.(2025) Integrating Technologies in the Metaverse for Enhanced Healthcare and Medical Education // IEEE Transactions on Learning Technologies.Vol. 18. pp. 216-229.
  68. A. Nechesov, I. Dorokhov and J. Ruponen, "Virtual Cities: From Digital Twins to Autonomous AI Societies," in IEEE Access, vol. 13, pp. 13866-13903, 2025.
  69. Pajani,M.;Safavi,A.;Shakibamanesh,A.;Adibhesami,M.A.;Sepehri,B. (2025). Augmented reality and digital placemaking: enhancing quality of life in urban public spaces. Journal of Urbanism: International Research on Placemaking and Urban Sustainability, 1–24.
  70. Sahraoui,Y.;Kerrache, C.A. (2025). Sensors and Metaverse-Digital Twin Integration: A Path for Sustainable Smarter Cities. In: Kerrache, C.A., Sahraoui, Y., Calafate, C.T., Vegni, A.M. (eds) Mobile Crowdsensing and Remote Sensing in Smart Cities. Internet of Things. Springer, Cham.
  71. Argota Sánchez-Vaquerizo, J. Urban Digital Twins and metaverses towards city multiplicities: uniting or dividing urban experiences?. Ethics Inf Technol 27, 4 (2025).
Figure 1. BMW engineers testing their digital twin in Regensburg (Germany).
Figure 2. Poster announcing Lil Nas X’s concert in Roblox.
Figure 3. Detail of the event celebrating the 50th Anniversary of the Industrial School of Alcoy at the Polytechnic University of Valencia.
Figure 4. Detail of the methodology used to create the digital twin of the Alcoy Campus.
Figure 5. Logo design: black and white version (negative, positive) and color version.
Figure 6. Sequence of the transformation of the logo from the UPV coat of arms.
Figure 7. Installation of structures using aerial platforms. Plaza of the Alcoy campus of the Polytechnic University of Valencia.
Figure 8. Lighting of graphic elements. Plaza of the Alcoy campus of the Universitat Politècnica de València.
Figure 9. Detail of the integration of the photomontage of the facade with the CAD floor plans.
Figure 10. Result of the 3D modeling of the Ferrándiz building.
Figure 11. Examples of photographs taken of the decorative elements of the square.
Figure 12. Result of the modeling of the Carbonell Building, along with the decorative elements.
Figure 13. Detail of the cropped images used to texture the 3D models.
Figure 14. Diagram of maps and materials used: Standard (Legacy) and Multi/Sub-Object.
Figure 15. Final view of the campus modeling. The two boundary planes, the zeppelin, and the spaces for the 50th anniversary banners can be distinguished.
Figure 16. Detail of the Quest component. It has been configured to start the missions automatically when the user enters the scene, to play them in the correct order, and to trigger a celebration upon completing the missions (Celebrate On Complete).
Figure 17. Trigger collision element located inside the illuminated cylinder. When the avatar enters the cylinder, the trigger activates the animation of the display of the commemorative banners.
Figure 18. Detail of one of the Trigger Events located at one of the doors of the Carbonell building. Once the objective of discovering the 3 doors has been completed, the final mission is unlocked.
Figure 19. Detail of the stage with the lectern and the rear screen. The user must get on stage to complete the mission.
Figure 20. Final result of the scene in Unity, with all components inserted. In the planter, the Seat Hotspot component can be seen, represented by an armchair.
Figure 21. Spatial’s Creator Toolkit menu in Unity, with the configuration sections and the publish icon.
Figure 22. View of the avatar’s starting point on the stage. The zeppelin with the 50th anniversary logo can be seen, as well as the illuminated cylinder that activates the appearance of the flags.
Figure 23. Detail of the online content creation menu in Spatial, where the Empty Frame can be selected.
Figure 24. The equation that defines the SUS score.
Figure 25. Design Degree students testing the designed application.
Figure 26. View of the digital twin from the door of the Carbonell Building.
Figure 27. Detail of how the avatar can sit and contemplate the entire square.
Figure 28. Moment when the avatar completes the first mission and the banners are deployed.
Figure 29. Detail of the celebration when the avatar completes the 3 missions.
Figure 30. Results of the Igroup Presence Questionnaire. In the left graph, the obtained data are represented with red dots, the blue bars represent the mean, and the standard deviation is represented with vertical black lines. The graph on the right shows the results of the subscales, their mean, and their standard deviation.
Figure 31. Results obtained in the SUS questionnaire: data (red dots), mean (blue bars), and standard deviation (vertical black lines).
Figure 32. Students, teachers, and attendees testing and viewing the application.
Figure 33. Detail of the graphics introduced to help the user better understand the operation of the controls and to improve navigability.
Figure 34. Appearance of Meta’s new AR glasses. The size reduction compared to the Meta Quest 3 can be seen.
Table 1. Description of the 14 IPQ items.
Number Question
IPQ1 In the computer generated world I had a sense of “being there”.
IPQ2 Somehow I felt that the virtual world surrounded me.
IPQ3 I felt like I was just perceiving pictures.
IPQ4 I did not feel present in the virtual space.
IPQ5 I had a sense of acting in the virtual space, rather than operating something from outside.
IPQ6 I felt present in the virtual space.
IPQ7 How aware were you of the real world surrounding while navigating in the virtual world? (i.e., sounds, room temperature, other people, etc.)?
IPQ8 I was not aware of my real environment.
IPQ9 I still paid attention to the real environment.
IPQ10 I was completely captivated by the virtual world.
IPQ11 How real did the virtual environment seem to you?
IPQ12 How much did your experience in the virtual environment seem consistent with your real world experience?
IPQ13 How real did the virtual world seem to you?
IPQ14 The virtual world seemed more realistic than the real world.
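For readers who wish to reproduce the subscale results shown in Figure 30, the sketch below groups the 14 items of Table 1 into the standard IPQ subscales and aggregates them across respondents. The item-to-subscale mapping follows the published IPQ rather than anything stated in the article, negatively worded items (e.g., IPQ3 and IPQ4) are assumed to have been reverse-coded beforehand according to the IPQ scoring key, and the respondent data shown are placeholders.

```python
import statistics

# Assumed mapping of the 14 items in Table 1 to the standard IPQ subscales:
# general presence, spatial presence, involvement and experienced realism.
# This grouping follows the published IPQ and is not stated in the article.
SUBSCALES = {
    "general presence":    [1],
    "spatial presence":    [2, 3, 4, 5, 6],
    "involvement":         [7, 8, 9, 10],
    "experienced realism": [11, 12, 13, 14],
}

def subscale_means(answers: dict[int, float]) -> dict[str, float]:
    """Mean rating per subscale for one respondent (item number -> recoded rating)."""
    return {name: statistics.mean(answers[i] for i in items)
            for name, items in SUBSCALES.items()}

# Placeholder data: three illustrative respondents on a 0-6 scale,
# with reverse-coded items already recoded per the IPQ key.
respondents = [
    {i: 4 for i in range(1, 15)},
    {i: 5 for i in range(1, 15)},
    {i: 3 for i in range(1, 15)},
]

# Aggregate across respondents, as in the mean/SD bars of Figure 30.
per_person = [subscale_means(r) for r in respondents]
for name in SUBSCALES:
    values = [p[name] for p in per_person]
    print(f"{name}: mean={statistics.mean(values):.2f}, sd={statistics.stdev(values):.2f}")
```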
Table 2. Description of the SUS test questions.
Number Question
SUS1 I think that I would like to use this system frequently.
SUS2 I found the system unnecessarily complex.
SUS3 I thought the system was easy to use.
SUS4 I think that I would need the support of a technical person to be able to use this system.
SUS5 I found the various functions in this system were well integrated.
SUS6 I thought there was too much inconsistency in this system.
SUS7 I would imagine that most people would learn to use this system very quickly.
SUS8 I found the system very cumbersome to use.
SUS9 I felt very confident using the system.
SUS10 I needed to learn a lot of things before I could get going with this system.
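Figure 24 presents the equation that defines the SUS score. The standard Brooke scoring procedure, assumed here to be the one shown in that figure, maps the ten 1-5 responses of Table 2 to a 0-100 score: odd-numbered (positively worded) items contribute (response − 1), even-numbered (negatively worded) items contribute (5 − response), and the sum is multiplied by 2.5. The sketch below implements this standard formula with an illustrative response set.

```python
def sus_score(responses: list[int]) -> float:
    """Standard SUS scoring (Brooke): ten 1-5 responses mapped to a 0-100 score.
    Odd-numbered items (SUS1, SUS3, ...) are positively worded; even-numbered
    items (SUS2, SUS4, ...) are negatively worded."""
    if len(responses) != 10:
        raise ValueError("SUS expects exactly 10 responses")
    total = 0
    for idx, r in enumerate(responses, start=1):
        total += (r - 1) if idx % 2 == 1 else (5 - r)
    return total * 2.5

# Illustrative (placeholder) answers to SUS1..SUS10 from Table 2.
print(sus_score([5, 2, 4, 1, 4, 2, 5, 2, 4, 2]))  # -> 82.5
```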
Table 3. Questionnaire presented to the attendees of the 50th anniversary gala.
Number Type Description of the question
1 Qualitative How has your navigation and interaction experience been?
2 Qualitative Did you find it easy to complete the missions?
3 Qualitative Has the help from the guide and the ability to connect with other users been useful to you?
4 Qualitative Were you able to complete the missions without external help?
5 Qualitative What did you think of the application of the Digital Twin?
6 Quantitative Did you find the experience useful?
7 Quantitative Would you recommend the experience?
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.