Preprint
Review

This version is not peer-reviewed.

Emerging Technologies in Augmented Reality (AR) and Virtual Reality (VR) for Manufacturing Applications: A Comprehensive Review

A peer-reviewed article of this preprint also exists.

Submitted:

04 July 2025

Posted:

07 July 2025


Abstract
As manufacturing processes evolve towards greater automation and efficiency, the integration of augmented reality (AR) and virtual reality (VR) technologies has emerged as a transformative approach that offers innovative solutions to various challenges in manufacturing applications. This comprehensive review explores the recent technological advancements and applications of AR and VR within the context of manufacturing. This review also encompasses the utilization of AR and VR technologies across different stages of the manufacturing process, including design, prototyping, assembly, training, maintenance, and quality control. Furthermore, this review highlights the recent developments in hardware and software components that have facilitated the adoption of AR and VR in manufacturing environments. This comprehensive literature review identifies the emerging technologies that are driving AR and VR technology toward technological maturity for implementing manufacturing applications. Finally, this review discusses the major difficulties in implementing AR and VR technologies in the manufacturing sectors.
Keywords:

1. Introduction

Industry 4.0 is transforming the current industrial environment by digitizing production processes. This is accomplished by implementing full network integration across all phases of product life cycle management and real-time data exchange, moving industry toward digitally enabled smart factories [1]. Industry 4.0 is driven by the nine fundamental technologies listed in Figure 1. Immersive technologies such as Augmented Reality (AR) and Virtual Reality (VR) are among those nine transformative technologies enabling the digital transformation towards the factory of the future [2]. Virtual Reality (VR) is defined as “the use of real-time digital computers and other specialized hardware and software to construct a simulation of an alternate world or environment, which is believable as real or true by the users” [3]. Augmented Reality (AR) is a technique for “augmenting” the real world with digital objects [4]. AR systems are characterized by their ability to geometrically align the virtual and real worlds and to run interactively in real time. An AR system must have three main characteristics: it combines real and virtual worlds, it is interactive in real time, and it is registered in the 3D environment [5]. The combination of a real environment scene and a virtual environment is known as Mixed Reality (MR). The term “Mixed Reality” refers to applications where “real world and virtual world items are exhibited simultaneously within a single display, that is, anywhere between the extrema of the virtuality continuum” [4]. While Augmented Reality (AR) takes place in a physical setting with virtual information added, Mixed Reality (MR) combines physical reality with virtual reality, allowing interaction between the physical and virtual worlds [4]. VR systems, in contrast, fully immerse the user in a computer-generated environment.
A typical VR system includes a fully immersive display, often in the form of a headset; sensors to track user movement and orientation; controllers or gloves for user interaction within the virtual environment; and a computational unit, often a PC or a dedicated system, which renders the virtual world in real time [6,7]. The Reality-Virtuality (RV) Continuum, depicted in Figure 2, illustrates the distinction between AR, VR, and MR [4].
In recent years, there has been a substantial and rapid evolution of Augmented Reality (AR) and Virtual Reality (VR) technologies. These advancements have been recognized for their potential to enhance learning methodologies and improve task execution, particularly within contexts characterized by dynamic and adaptable requirements [8]. AR and VR leverage the advantages of state-of-the-art technologies, such as the Industrial Internet of Things (IIoT), Cyber-Physical Systems (CPS), and cloud computing, making manufacturing processes smarter and more efficient [9]. People can now interact with the manufacturing process efficiently, reducing downtime and contributing to positive economic growth [10].
The rapid evolution of manufacturing processes towards automation and efficiency, alongside the increasing integration of AR and VR technologies, serves as the primary motivation for this comprehensive review. Despite growing recognition of the potential benefits of AR and VR in manufacturing, there exists a notable research gap in synthesizing and critically evaluating recent technological advancements and applications within this context. This review aims to address this gap by systematically examining the current applications, state-of-the-art technologies, and challenges hindering widespread adoption of AR and VR in manufacturing. By offering insights into emerging trends, this review aims to contribute to the advancement of AR and VR technologies in manufacturing applications.
This paper is organized into four sections. Section 1 introduces the topic and motivation. Section 2 explains the methodology for this literature review. Section 3 discusses the findings of the literature review by answering the Research Questions (RQ): it details the applications of AR and VR in the manufacturing industry, summarizes the key findings and emerging technologies, provides a statistical analysis of research trends, and outlines the challenges of implementing AR and VR technologies in manufacturing applications. Section 4 concludes the literature review.

2. Review Methodology

This paper utilizes a systematic approach to focus on cutting-edge research regarding AR and VR applications in manufacturing, spanning the period from 2015 to 2022. The primary goal was to collect key insights from relevant papers found in peer-reviewed journals and conferences. Through a thorough evaluation, the study aims to identify challenges in various aspects of AR/VR implementations for manufacturing and to identify research trends. To conduct this literature review, the following steps were undertaken: planning the review methodology, defining the scope, conducting article searches, evaluating the identified articles, synthesizing relevant information, and analyzing the results [11]. The steps of the review methodology are illustrated in Figure 3.

2.1. Planning

The initial step involved formulating research questions to guide the research methodology, with a focus on identifying current technological trends and challenges of AR/VR technologies for manufacturing applications. The research questions are outlined in Table 1. Subsequently, the Web of Science (www.webofscience.com) database and Google Scholar (www.scholar.google.com) search platform were used for gathering the peer-reviewed research articles as they are recognized for their reliability in providing peer-reviewed research articles.

2.2. Article Search

To collect research articles, a combination of search strings was used, listed in Table 2. For the initial search, articles published in 2015 or later were included; older studies were excluded, as they lose relevance given the rapid technological advancement of AR/VR. The statistics of the collected articles are listed in Table 3. Only articles in English were included, and commercial and duplicate studies were excluded.
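The stated inclusion criteria (published 2015 or later, written in English, non-commercial, de-duplicated) amount to a simple record filter. The sketch below illustrates this; the record fields and values are hypothetical and not taken from the actual corpus.

```python
# Hypothetical bibliographic records; field names and titles are illustrative only.
records = [
    {"title": "AR assembly guidance", "year": 2019, "language": "English"},
    {"title": "VR training study", "year": 2013, "language": "English"},     # too old
    {"title": "AR-Wartung im Werk", "year": 2021, "language": "German"},     # not English
    {"title": "AR assembly guidance", "year": 2019, "language": "English"},  # duplicate
]

seen = set()
included = []
for r in records:
    key = (r["title"], r["year"])
    if r["year"] >= 2015 and r["language"] == "English" and key not in seen:
        seen.add(key)
        included.append(r)
```

Only the first record survives all three criteria in this toy example.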

2.3. Initial Screening

In this step, titles and abstracts were screened to find the relevant papers that support the research questions. Articles related only to the AR/VR manufacturing applications were included. Relevant papers were also gathered by screening the references of the collected articles.

2.4. Quality Screening

To assess quality, the introduction, methodology, results, discussion, and conclusion of each collected paper were examined to verify the rigor of the research and determine the relevant articles. The goal was to assemble a wide range of papers covering the key topics of AR/VR, ensuring that the information extracted from these articles gives an overview of all aspects of manufacturing.

2.5. Data Extraction

In this step, key elements from the research articles were extracted and stored for further quantitative and qualitative analysis.

2.6. Reporting

In this final stage, the extracted data was reviewed and reported in Section 3. Different categories of papers and specific data were analyzed to get a detailed overview of the AR/VR-based manufacturing applications.

3. Results and Discussion

This section highlights key findings and insights derived from the reviewed literature. After a careful review of the collected articles, the answers to the research questions have been compiled.

3.1. RQ1: What Are the Current Applications of AR/VR in Manufacturing?

To begin addressing Research Question 1 (RQ1) on the current applications of AR/VR in manufacturing, the literature review identifies the common applications where AR/VR is used within the manufacturing sector.

3.1.1. Maintenance Applications

Maintenance, a critical facet of manufacturing, is essential for keeping production lines running smoothly and maintaining product quality. A large portion of research focuses on this area, as maintenance can increase costs significantly and impact safety [12]. AR and VR have emerged as potent tools in the maintenance domain, offering substantial potential for cost reduction by streamlining troubleshooting procedures, expediting access to documentation, and pinpointing malfunction locations. A study documented in [2] underscores the advantages of graphical representations over text-based information, as they facilitate quicker and more effective information processing. Using Augmented Reality (AR) instructions for maintenance has proven effective, with real-world evidence showing it can cut maintenance time for industrial machinery by up to 30% [10]. An AR-based maintenance operation in which symbols and arrows indicate screw locations during repair is depicted in Figure 4. Many of the papers on maintenance applications used HMDs, as they leave both hands free during repair. Some applications, however, use HHDs for troubleshooting and diagnostic scenarios, such as looking into machines or walls where equipment and parts are not directly visible.

3.1.2. Assembly Applications

Assembly is one of the most widely adopted application areas for AR/VR. Studies show that AR/VR solutions are superior to paper-based instructions in assembly tasks, demonstrating a decrease in both completion time and error rate [13]. Moreover, the improvement grows as the tasks become more complex. Study participants who were not experienced in analyzing complex isometric drawings were able to understand them for their assembly tasks. The learning curve was also steeper than with paper-based instructions, showing a decrease in the time users needed to become familiar with a recurring task. Studies also showed that when comparing experts and untrained workers on assembly tasks, both categories of workers increased their efficiency and made fewer errors. An example of an AR-based assembly application is shown in Figure 5.

3.1.3. Operations Applications

Shop-floor operators move frequently around a manufacturing plant, switching between tasks, gathering materials, or retrieving information. Operations applications focus on decreasing operator travel time around the plant and customizing routes to be as efficient as possible [2]. They also cover production planning as well as machine setup during production changeovers [14]. The projection-based (spatial AR) system shown in Figure 6 is implemented in a die-cutting station to digitize and display correction templates on the matrix. Originally, each product/die-cutter combination required a unique correction template, which meant a large physical store of correction templates. The proposed solution eliminates this by digitizing the correction templates and projecting them onto the matrix: the operator now only has to look up the template on the computer and display it.
All forms of visualization appear in operations scenarios, owing to the wide variety of environments and work that operators must handle; the choice of display can be specialized to the operation being performed.

3.1.4. Training Applications

Training systems play a pivotal role in modern workplaces, alleviating the time constraints faced by trainers and enhancing the overall learning experience. AR and VR have gained prominence in these contexts due to their ability to engage users effectively, thereby amplifying learning performance and motivation [15]. It is essential to note that the term ‘training’ encompasses the concepts discussed in the preceding sections, all of which contribute to the comprehensive training of new employees. Training applications not only aim to familiarize users with AR/VR interfaces but also strive to foster self-sufficiency to the point where AR/VR becomes unnecessary. Additionally, these applications facilitate the seamless integration of tasks regardless of the worker’s skill level. In manufacturing industries, where onboarding and training processes for new employees can span several weeks, such systems become indispensable. By reducing the time required by already trained employees, they allow these individuals to focus on their core responsibilities.

3.1.5. Product Design Applications

Product design and prototyping are integral parts of manufacturing processes. The success of any design project depends on effective prototyping [16,17]. Designers have predominantly used Computer-Aided Design (CAD) tools since the 1990s for the design, development, prototyping, and simulation of new products [18]. With the technological maturity of Augmented Reality (AR) and Virtual Reality (VR) hardware and software, designers are adopting AR/VR technologies as extensions of CAD tools for better visualization and realization of the product design. Using these technologies, product designs are overlaid onto the physical setup, or users interact with product designs and assemblies in an immersive environment using AR/VR devices, making the design process more realistic and collaborative. AR/VR can be incorporated into every stage of the product design lifecycle [18] (Figure 7). AR/VR can also be used effectively for remote design collaboration by bringing remote users together on one immersive platform. In one study, the authors utilized a model-based definition approach and AR to develop an industrial product inspection using 3D CAD software and product manufacturing information (PMI) data transfer methods [19]. Despite the potentially significant benefits of utilizing AR/VR in design applications, these technologies have not been adopted across a wide range of design applications. To address this, the authors of [20] proposed a Double Diamond design model utilizing systematic AR use-case identification, a criteria catalog for appropriate AR device selection, and a recommended modular software architecture for efficient customization and data integration within enterprise IT frameworks.

3.1.6. Quality Control Applications

Continuous monitoring of the production process and evaluation of product quality are essential parts of the manufacturing process. Quality control helps manufacturers identify and address quality issues in real time, enhance the consistency of the process, reduce product defects, and improve overall production efficiency. Computer vision, sensor-based defect detection, and non-destructive testing are the technologies predominantly used for defect detection, and AR and VR technologies are emerging in the area of quality control and visualization. Several research studies have incorporated AR and VR technologies into manufacturing quality control. The authors of [21] identified three main areas of AR/VR application for quality control in manufacturing, depicted in Figure 8: immersive lean tools, metrological inspection, and in-line quality assessment. The authors of [22] utilized 3DS MAX and Unreal Engine 4 to develop a virtual simulation, shown in Figure 9, for teaching machine-learning-based beer-bottle defect detection, encompassing bottom, mouth, and body defects. When the system was evaluated against traditional experiments, 40 junior students preferred the immersive and interactive virtual experiment, highlighting its effectiveness for learning. In one study, researchers employed AR technology for real-time monitoring of polished surface quality in a SYMPLEXITY robotic cell with an ABB robot, visualizing metrological data on the parts in real time to check whether the quality reached the required specifications [23]. Some AR/VR-based quality control applications in manufacturing are listed in Table 4.
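As an illustration of the in-line quality assessment idea in studies such as [23], measured values can be compared against a specification and turned into per-zone pass/rework labels for an AR overlay. The zone names, roughness values, and specification limit below are invented for the example and are not taken from the cited work.

```python
# Hypothetical in-line quality check feeding an AR overlay.
SPEC_RA_UM = 0.8  # assumed maximum allowed surface roughness Ra, in micrometres

# Measured Ra per inspection zone (made-up values).
measurements = {"zone_a": 0.45, "zone_b": 0.92, "zone_c": 0.78}

# Label each zone for the AR visualization layer.
overlay = {zone: ("PASS" if ra <= SPEC_RA_UM else "REWORK")
           for zone, ra in measurements.items()}
```

The resulting labels could then be rendered on the corresponding surface regions by the AR client.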
In summary, the major applications of AR/VR in manufacturing encompass maintenance, assembly, operations, training, product design, and quality control, each demonstrating substantial benefits in improving efficiency, reducing errors, and enhancing overall productivity. However, the potential of these technologies extends beyond these core areas to include inventory management, supply chain optimization, customer engagement, and more.

3.2. RQ2: What Are the State-of-the-Art Technologies Used in AR/VR Applications in Manufacturing?

Addressing Research Question 2 (RQ2) regarding the state-of-the-art technologies used in AR/VR applications within manufacturing requires a comprehensive exploration of cutting-edge tools and platforms utilized in the industry. This investigation delves into key components, hardware, and software that are vital for the implementation of AR/VR technologies across various manufacturing processes.
There are several aspects of AR/VR applications and devices that most papers recognize as necessary for creating a system that incorporates smart manufacturing paradigms. Several recurring ideal features, depicted in Figure 10, can be seen for AR/VR systems: interoperability, virtualization, decentralization, real-time capabilities, and modularity [24,25]. Interoperability is one of the largest contributors to smart manufacturing, allowing communication between the user and the machines in the manufacturing environment through the Internet of Things (IoT) and other industry standards. Standardization of these devices is also crucial for the system to be robust across multiple scenes [10]. Virtualization is necessary for calibrating devices and creating the digital environment and simulation models using Cyber-Physical Systems (CPS). CPS can use this information to monitor processes happening in real time and notify the user of changes in the scene. Decentralization enables easy access to diverse work instructions, providing users with specific information and procedures without manual searching, saving valuable time [24,25]. Real-time capabilities are especially important, as it is crucial for users to have up-to-date information on the changing status of machines and operations, as well as updated technical machine data to keep plant records current. This also applies to the use of the application itself, as tracking technologies need to process data in real time. Finally, the modularity of the application keeps it flexible to changes in the environment and allows the system to adapt to specific scenarios in the scene. This means that documentation on new processes and procedures needs to be kept up to date and modularized to specific use cases [24].
Certain key components of AR/VR technologies are required for implementing AR/VR-based applications in manufacturing. In this section, these key components, specifically the hardware and software utilized in AR/VR-based manufacturing applications, are reviewed and discussed at length.

3.2.1. Hardware Used in AR Applications

The components of an AR hardware system include tracking devices, haptic and force-feedback rendering devices, displays, and sensors [26]. AR display/visualization devices can be divided into three types: Head-Mounted Displays (HMD), Handheld Displays (HHD), and Spatial Displays (SD).
An HMD is a display device worn on the head or as a component of a helmet, with a display optic in front of one eye (monocular HMD) or both eyes (binocular HMD). HMDs are frequently used in AR applications because the eye-level display enables users to experience the AR scene hands-free, making them the preferred choice for AR in manufacturing. They offer hands-free operation, enabling workers to interact with manufacturing systems and access virtually overlaid information. Additionally, HMDs overlay real-time data directly onto the physical world, optimizing the efficiency and accuracy of manufacturing operations. Many commercial AR visualization devices are available on the market.
A Handheld Display (HHD) enables users to visualize the AR environment when the display is held in reference to a specific environment [27]. These include tablets and cellphones with screens and cameras available to receive and display data.
Spatial Displays are mostly projection-based displays that do not require users to wear head-mounted displays and instead project information into the actual surroundings [26]. Spatial Displays in manufacturing can project assembly instructions and quality inspection details directly onto parts. Additionally, they enhance training and safety by overlaying crucial data and guidelines directly onto the shop floor or machinery [28].

3.2.2. Tracking Systems for AR Applications

In manufacturing applications, various tracking systems are employed, including marker-based, model-based, and feature-based tracking. Among these, marker-based tracking has widespread popularity due to its robustness and ease of setup. Model and feature-based tracking methods are also employed, often in conjunction with marker-based tracking. These tracking systems enable the augmented reality (AR) headset to precisely determine the location of objects within the environment and establish the spatial positioning of the AR system itself [10]. Notably, these tracking systems can be integrated in a hybrid fashion, allowing for more versatile and accurate tracking solutions.
Marker-based tracking involves the utilization of external visual aids, such as ArUco markers or Infrared (IR) markers, to precisely determine the spatial position of objects. However, these tracking systems come with distinct advantages and disadvantages. Marker-based tracking stands out as a highly reliable solution due to its minimal external processing requirements, resulting in low-latency performance. Additionally, these markers remain stationary, mitigating the need for the AR application to rely on other reference points or external data sources. For marker-based tracking to function effectively, the markers must remain within the field of view of the camera, ensuring the viability and accuracy of the tracking system [29].
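The geometric core of marker-based tracking is estimating the transform between the planar marker and the camera image from the detected corner positions. The sketch below shows planar homography estimation with the direct linear transform (DLT); the corner coordinates are synthetic, standing in for the output of a real marker detector such as an ArUco library.

```python
import numpy as np

def estimate_homography(src, dst):
    """DLT: solve for the 3x3 homography H (up to scale) mapping src -> dst."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography vector is the null vector of the stacked constraint matrix.
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    H = vt[-1].reshape(3, 3)
    return H / H[2, 2]

# Marker corners in marker coordinates (metres) and as detected in the image (pixels).
marker = [(0.0, 0.0), (0.05, 0.0), (0.05, 0.05), (0.0, 0.05)]
pixels = [(320.0, 240.0), (372.0, 243.0), (369.0, 295.0), (317.0, 292.0)]
H = estimate_homography(marker, pixels)

# Project the marker centre into the image to anchor a virtual overlay there.
p = H @ np.array([0.025, 0.025, 1.0])
centre = (p[0] / p[2], p[1] / p[2])
```

With four exact correspondences the homography is determined exactly; real systems use the same machinery with noisy detections and more elaborate pose refinement.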
Model-based tracking utilizes CAD models to locate components in the environment, leveraging prior knowledge of object shapes and appearances for AR interpretation. By placing markers on CAD models during setup, the system can accurately track and display virtual information; when combined with digital twins, it provides a highly accurate representation of the work environment [2]. These models can also be used with deep-learning programs to evaluate different configurations of an object, which is especially useful when interacting with objects that move, such as robots.
Feature-based tracking brings objects into the digital domain by using the object's own visual features as markers, similar in spirit to marker-based tracking. It relies on a point-cloud model combined with CAD models to identify and track objects, providing a way to determine an object's identity and follow its movements [30].
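The matching stage of feature-based tracking can be sketched as brute-force nearest-neighbour descriptor matching with Lowe's ratio test, which discards ambiguous matches. The tiny 4-dimensional descriptors below are synthetic stand-ins for real ORB or SIFT descriptors.

```python
import numpy as np

def match_features(desc_a, desc_b, ratio=0.75):
    """Match each descriptor in desc_a to its nearest neighbour in desc_b,
    keeping only unambiguous matches (Lowe's ratio test)."""
    matches = []
    for i, d in enumerate(desc_a):
        dists = np.linalg.norm(desc_b - d, axis=1)
        nearest, second = np.argsort(dists)[:2]
        if dists[nearest] < ratio * dists[second]:
            matches.append((i, int(nearest)))
    return matches

# Synthetic descriptors: a0 matches b1 cleanly; a1 is ambiguous (b2 vs b3) and is rejected.
desc_a = np.array([[1.0, 0.0, 0.0, 0.0],
                   [0.5, 0.5, 0.0, 0.0]])
desc_b = np.array([[0.0, 0.0, 1.0, 0.0],
                   [0.99, 0.01, 0.0, 0.0],
                   [0.5, 0.49, 0.0, 0.0],
                   [0.49, 0.5, 0.0, 0.0]])
matches = match_features(desc_a, desc_b)
```

The ratio test is what makes the matcher robust: a descriptor whose two best candidates are nearly equidistant carries no reliable identity and is dropped rather than guessed.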

3.2.3. Hardware Used in VR Applications

Virtual Reality (VR) applications require a range of hardware to deliver immersive experiences. Various types of visualization displays are used, including HMDs such as the Meta Quest and handheld displays such as tablets; personal computers are also used for simulation and digital-twin applications. Different tracking sensors, such as camera, motion, and gaze sensors, track the headset, controllers, and human actions. Haptic systems are also an important part of VR hardware. Input devices range from motion controllers to eye-tracking modules. Positional tracking encompasses sensors and treadmills, while haptic feedback offers tactile sensations. Finally, the VR experience can be powered by VR-ready PCs or wearable backpack computers, complemented by spatial audio hardware for sound immersion [26,31].

3.2.4. Software Used in AR/VR Applications

The field of AR/VR is continuously evolving with new software tools and platforms. Game engines are often used to develop AR/VR applications, as they provide the extensive computation needed to overlay virtually generated information onto 3D spaces. Unity 3D and Unreal Engine are two of the most popular game engines for developing AR/VR applications [32]. Of the two, Unity 3D is the more widely adopted by developers due to its cross-platform capabilities. Along with these game engines, developers also use various AR/VR libraries for application development. AR libraries such as Vuforia, ARToolKit, ARCore, ARKit, the Mixed Reality Toolkit (MRTK), and Wikitude are widely implemented. Vuforia is one of the most widely used libraries for AR application development, ARKit is used for iOS-based AR development, and ARCore for AR applications on Android devices. Wikitude is a cross-platform AR library. The MRTK is required for developing AR applications for the HoloLens; it has also become widely used in the field, as it has recently been open-sourced for use with other glasses. For VR development, different Software Development Kits (SDKs) are available. OpenVR is an SDK that enables programs to use different VR devices without needing to know their specifics [33]. Other popular VR SDKs are the Oculus SDK, SteamVR SDK, and Google VR SDK [34]. Developers use VR platforms such as SteamVR, Oculus Home, and Windows Mixed Reality, which offer comprehensive ecosystems for sharing VR content. Developers leverage these technologies to build interactive, easy-to-share VR applications for manufacturing [35].

3.2.5. Overview of Hardware and Software Used in AR/ VR Applications in Manufacturing

The reviewed articles were scrutinized to identify trends in the software and hardware used in AR/VR applications, as illustrated in Table 5. The table offers a snapshot of various AR and VR applications, highlighting the diverse range of hardware and software technologies employed across different research groups.
Unity and the Vuforia SDK are the most commonly used software platforms in the reviewed AR papers. Marker-based and feature-based tracking methods are used most often, with Vuforia and ARToolKit frequently utilized for marker-based tracking. The HoloLens appears to be the most popular AR headset.
For VR development, Unity is the predominant software platform for VR application development. Hardware-wise, HTC Vive and Oculus HMDs are prominently used in VR visualization. These headsets utilize advanced tracking technologies like SteamVR and Tobii eye-tracking.
Table 6. Communication Technology used in AR/VR-based manufacturing using edge devices.
Research Group | Application | Communication Technology
[132] | Machining operation | Wireless TCP/IP
[37] | Human-robot collaboration | Modbus TCP
[105] | Digital twin | Wireless TCP/IP
[29] | Human-robot collaboration | ROSbridge (websocket)
[36] | Machine operation | TCP/IP Protocol
[63] | Machine operation | Websocket
[78] | Maintenance application | WebRTC protocol
[133] | SCADA system | Wireless TCP/IP
[70] | Maintenance application | Ultra-Wide Band (UWB)
[134] | Human-robot collaboration | ROSbridge (wifi)
[23] | Quality assessment operation | Wireless TCP/IP

3.3. RQ3: What Are the Emerging Technologies in the Field of AR/VR Applications in Manufacturing?

This research question aims to identify and explore the emerging technologies that are used in AR/VR applications for manufacturing. By investigating these advancements, the objective is to identify the technological trends of current manufacturing practices through the integration of AR/VR technologies.
During the initial phase of AR/VR technology development, manufacturing applications were primarily image or video-based, lacking interactive capabilities and closed-loop mechanisms, thus providing limited feedback from user interactions. However, with substantial research in the AR/VR domain, advancements in key technologies have propelled these applications towards increased interactivity and closed-loop features, significantly enhancing the overall user experience and system responsiveness.
After a careful review of the articles, the emerging technologies driving AR/VR toward technological maturity for implementing closed-loop manufacturing applications were identified. Artificial Intelligence (AI), Digital Twin, Teleportation, and Edge Technologies are among those emerging technologies, listed in Figure 11. Human interaction is a vital part of any closed-loop manufacturing system. Due to the advancement of industrial robotics, interaction with these systems is becoming increasingly challenging. For this reason, human-robot collaboration is becoming one of the emerging fields of research, and AR/VR is becoming an integral part of applications involving human-robot collaboration. In the subsequent sections, the key findings on these technological trends and their applications in the AR/VR domain are discussed in detail, and generalized architectures are proposed for implementing these technologies in the manufacturing industries.

3.3.1. Edge Applications

AR/VR technologies play a vital role in manufacturing process monitoring, providing real-time, interactive process information for quality control, maintenance, and training applications. One of the key technologies making this possible is the Internet of Things (IoT) and, in a broader sense, edge devices. With AR/VR in conjunction with the IoT, machine diagnostics are overlaid onto the machine and production environment, making manufacturing operations and troubleshooting more interactive and efficient. Leveraging edge devices, AR and VR have significantly enhanced manufacturing processes by acquiring data from physical systems in real time and creating a deeper, more immersive experience. Physical-system data, such as manufacturing data from PLCs, robot controllers, and wireless and wired sensors, both local and distributed, must be transferred from the equipment layer to the edge layer in real time. The edge layer processes the data using edge devices and central processing servers and transfers it to the immersive visualization layer for AR/VR-based visualization and interaction. In one study, the authors demonstrated an AR-based alert and early-warning system for real-time machine health monitoring, deployed on Android devices and the Microsoft HoloLens, with data collected from the physical system over the TCP/IP protocol [36]. The solution notifies operators whenever an alarm arises on a machine and provides the alarm information.
AR/VR with edge intelligence can also be used to monitor machining operations. The authors of [37] proposed a VR system that replaces a robot's traditional human-machine interface (HMI) console, visualizing real-time robot trajectories to predict errors and prevent accidents and replaying previously executed trajectories from a database for post-operation error analysis. Here, the data were transferred using the Modbus TCP protocol. Communication protocols are a vital part of implementing AR/VR-based manufacturing applications that use edge devices, and they vary with the use case and the hardware used. Table 6 lists some communication technologies used in AR/VR-based applications with edge devices in manufacturing. The proposed generalized architecture for implementing AR/VR applications using edge devices is shown in Figure 12; the architecture can be more complex depending on the use case.
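Since Modbus TCP is one of the protocols used in these systems, it may help to sketch how a request frame is assembled on the wire. The following builds a generic Modbus TCP "read holding registers" request (function code 3) per the standard MBAP framing; it is an illustration of the protocol, not the specific implementation of [37].

```python
import struct

def modbus_read_holding_registers(transaction_id, unit_id, start_addr, count):
    """Build a Modbus TCP request ADU for function code 3 (read holding registers).

    Layout: MBAP header (transaction id, protocol id = 0, remaining length,
    unit id) followed by the PDU (function code, start address, register count).
    All fields are big-endian per the Modbus specification.
    """
    pdu = struct.pack(">BHH", 3, start_addr, count)
    # Length field counts the unit id plus the PDU bytes that follow it.
    mbap = struct.pack(">HHHB", transaction_id, 0, len(pdu) + 1, unit_id)
    return mbap + pdu
```

For example, requesting 10 registers starting at address 0 from unit 1 yields a 12-byte frame that an edge device would send to the controller's TCP port 502.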

3.3.2. AI-Based Applications

Manufacturers are increasingly utilizing Artificial Intelligence (AI) and Machine Learning (ML) techniques to enhance their processes and improve overall operational efficiency. One study conducted comprehensive research on the applications of classical machine learning and deep learning methods, and the authors identified a growing trend of implementing machine learning and AI algorithms in manufacturing operations [38]. AI combined with AR/VR in manufacturing can unlock advanced capabilities and future innovations, and AI-based applications are among the most promising fields of future research in the AR/VR domain. AI is predominantly used in image processing and computer vision, making it a natural fit for AR/VR systems, as image recognition, object detection, and pose estimation are crucial parts of AR/VR systems. An immersive system with AI features involves several key steps to integrate real and virtual environments seamlessly. These steps start with calibrating the virtual camera against the real camera to ensure an accurate depiction of the real world. Then, dynamic objects are detected and tracked in real time, and the real camera's pose is estimated. This information is used to create and render virtual objects that are aligned with the real environment, enhancing the user's AR experience [39]. Several research works have been performed to implement and improve AI-based AR/VR applications. Deep-learning-based Convolutional Neural Networks (CNNs) are widely used for identifying objects in images; one of the prominent methods in this area is the Region-based CNN (R-CNN) [40,41,42]. The authors of [43] used multimodal AR instructions in conjunction with a tool detector, built with a Faster R-CNN model trained on a synthetic tool dataset, to increase worker performance during spindle motor assembly testing of a CNC carving machine.
The system reduced assembly completion time and mistakes by 33.2% and 32.4%, respectively, compared to the manual assembly process. Many researchers have also used YOLOv5, a state-of-the-art object detection model, for real-time and accurate object detection in AR/VR applications [44]. Pose estimation of industrial robots is vital in AR/VR manufacturing applications, as it enables the precise alignment of virtual objects with real-world objects or locations, and AI can play a vital role in implementing near real-time pose estimation. In one study, the authors proposed a markerless pose estimation method using a CNN to register AR devices and robot coordinate systems [45]. The architecture of the proposed system used both RGB-based and depth-image-based approaches, which offer automatic spatial calibration.
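As a simplified illustration of the pose estimation step, the rigid transform between two matched 2D point sets can be recovered in closed form. This is a least-squares 2D alignment (a planar analogue of the Kabsch solve), far simpler than the CNN-based markerless method of [45], but it shows the geometric problem being solved.

```python
import math

def estimate_pose_2d(src, dst):
    """Estimate the rotation angle and translation mapping 2D points
    `src` onto `dst` (least-squares rigid alignment)."""
    n = len(src)
    sx = sum(p[0] for p in src) / n; sy = sum(p[1] for p in src) / n
    dx = sum(p[0] for p in dst) / n; dy = sum(p[1] for p in dst) / n
    num = den = 0.0
    for (px, py), (qx, qy) in zip(src, dst):
        ax, ay = px - sx, py - sy          # centered source point
        bx, by = qx - dx, qy - dy          # centered target point
        num += ax * by - ay * bx           # cross term -> sin component
        den += ax * bx + ay * by           # dot term   -> cos component
    theta = math.atan2(num, den)
    c, s = math.cos(theta), math.sin(theta)
    tx = dx - (c * sx - s * sy)            # translation after rotating centroid
    ty = dy - (s * sx + c * sy)
    return theta, (tx, ty)
```

Real AR registration works in 3D with noisy, partially occluded correspondences, which is precisely why learned detectors and depth sensing are brought in.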
Generative AI, one of the newest fields in AI, has transformative potential in AR/VR applications in manufacturing by enhancing situational awareness and enabling more efficient maintenance and design processes [46,47]. By integrating large language models with augmented and virtual reality, manufacturers can provide real-time, context-aware assistance to workers, enabling hands-free operation and in situ knowledge access [48]. These technologies can support tele-maintenance and consultation, allowing experts to guide on-site technicians remotely with immersive, interactive experiences [49]. Furthermore, VR-enabled chatbots facilitate the customization of complex products by offering intelligent, natural-language interactions that streamline the design and manufacturing processes [50].
Table 7 lists some AR/VR-assisted, AI-based manufacturing applications. Algorithm selection, data quality, computational speed, and data transfer speed are the key aspects of AI-based manufacturing applications using AR/VR. The proposed generalized system architecture for implementing AI-based AR/VR-assisted manufacturing applications is shown in Figure 13.
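Detectors such as the Faster R-CNN and YOLOv5 models cited above prune overlapping candidate boxes with non-maximum suppression (NMS) before the surviving detections are registered into the AR scene. A minimal pure-Python sketch, with boxes as (x1, y1, x2, y2) tuples:

```python
def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def nms(boxes, scores, iou_threshold=0.5):
    """Greedy non-maximum suppression; returns indices of kept boxes."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        # Keep a box only if it does not heavily overlap a higher-scoring one.
        if all(iou(boxes[i], boxes[j]) <= iou_threshold for j in keep):
            keep.append(i)
    return keep
```

On resource-constrained AR headsets this post-processing is cheap; it is the detector inference itself that is typically offloaded to an edge server, as discussed in Section 3.4.1.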

3.3.3. Digital Twin Applications

Digital twin applications using AR and VR are among the transformative tools for digital transformation in manufacturing, providing smart decision support [51]. Different authors have defined the term 'digital twin' in different ways. The authors of [51] provided a generalized and consolidated definition, which is the most suitable for this literature review: "digital twin is a virtual representation of a physical system (and its associated environment and processes) that is updated through the exchange of information between the physical and virtual systems." The main features of a digital twin application are illustrated in Figure 14 [51]. AR and VR are the key technologies for implementing digital-twin-based applications in manufacturing, such as product design and development, simulation, training, machine and robot calibration, and fault diagnosis. Research has been performed on implementing digital-twin-based AR/VR applications in manufacturing. The authors of [52] proposed an AR-assisted robotic welding path programming system. The authors overlaid a virtual robot on the actual robot, and a handheld pointer was tracked using an OptiTrack Flex 3 motion capture system to interact directly with the workspace and guide the virtual robot's movements and path definitions.
Table 8. Digital twin application in manufacturing using AR/VR.
Research Group Application Type of Visualization
[52] Robot welding programming Immersive
[37] Simulation and training Immersive
[53] Human-robot collaboration Immersive
[182] Robot programming Immersive
[54] Human-robot collaboration Immersive
[113] Remote human-robot collaboration Immersive
[116] Process monitoring Non-immersive
[103] Design review application Immersive
[183] Robot programming Immersive
[135] Assembly operation Immersive
[61] Human-robot collaboration Immersive
[60] Remote human-robot collaboration Immersive
[138] Human-robot collaboration Immersive
[181] Robot programming Immersive
[112] CNC milling machine operation Immersive
Traditional HMIs sometimes have limited capabilities for interacting with the physical world. By using digital twin features, AR/VR systems are becoming a preferred choice for replacing traditional HMIs. Careful consideration is needed when choosing the level of virtualization for digital twin applications. Digital-twin-based AR/VR systems can be immersive or non-immersive. Conventional non-immersive virtual systems engage sight and hearing through visualization on a computer screen while keeping users connected to the physical world. Immersive virtual systems, on the other hand, isolate users from reality and facilitate interaction within the virtual environment using specialized haptic devices. The different levels of virtualization in virtual systems are shown in Figure 15 [53]. In one study, the authors developed a VR system to replace a robot's traditional HMI console, visualizing real-time robot trajectories to predict errors and prevent accidents; the system can replay previously executed trajectories from a database for post-operation error analysis [37].
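The state-mirroring and trajectory-replay capabilities just described can be sketched as a digital twin that records timestamped joint states from the physical robot and answers queries about any past instant. Class and field names here are illustrative, not from the cited systems.

```python
import bisect

class RobotTwin:
    """Minimal digital-twin state mirror: ingests timestamped joint states
    from the physical robot and replays the state at any past instant."""

    def __init__(self):
        self._times = []   # monotonically increasing update timestamps
        self._states = []  # joint-angle vectors, parallel to _times

    def update(self, timestamp, joint_angles):
        """Record one state update arriving from the physical system."""
        self._times.append(timestamp)
        self._states.append(list(joint_angles))

    @property
    def current(self):
        """Latest mirrored state, as rendered live in the AR/VR view."""
        return self._states[-1]

    def replay_at(self, timestamp):
        """Return the most recent recorded state at or before `timestamp`."""
        i = bisect.bisect_right(self._times, timestamp) - 1
        if i < 0:
            raise ValueError("no state recorded before this timestamp")
        return self._states[i]
```

A real twin would also interpolate between samples and push state changes out to the visualization layer, but the record/replay core is the same.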
VR systems were observed more often than AR in DT applications. They were mostly used off site, where AR would be less useful. When AR was used, it often served as a less immersive way of creating content for the application. DTs were frequently used for training scenarios in a more immersive format, giving trainees the option to simulate being on a shop floor without the dangers of being inexperienced with the equipment.
In another study, a virtual robot model was overlaid on a physical robot system, and users could interact with the system using a Microsoft HoloLens. The virtual layer converts user-defined trajectories into robot commands, and the Robot Operating System (ROS) layer processes these commands to communicate with the industrial robot controller [49].
Digital twin features can also be used for remote operations. In one such study, the authors used HTC Vive headsets and the Unity plugin PUN2 to synchronize users and handle the network for multiuser communication. Table 8 identifies some general manufacturing applications of AR/VR using digital twin technology. The proposed generalized system architecture for implementing digital-twin-based AR/VR-assisted manufacturing applications is shown in Figure 16.

3.3.4. Teleoperation and Remote Collaboration Applications

Remote collaboration enables experts to share their knowledge and guide engineers, operators, and technicians with less domain knowledge in troubleshooting manufacturing operations. In addition, it enables designers and product developers to collaborate from different locations, which can significantly reduce product or process development time. Traditionally, remote collaboration has been done using video- and audio-based communication [55], which has limited interaction capabilities. With the recent advancement of immersive technologies (AR and VR), remote collaboration and teleoperation applications are becoming more interactive and realistic by providing haptic feedback and sensing capabilities. A growing body of research addresses AR/VR-based teleoperation and remote collaboration. The authors of [56] demonstrated the process of improving the effectiveness of teleoperated robotic arms for manipulation and grasping activities using Augmented Visual Cues (AVCs). A case study with 36 participants was conducted to validate the system's usefulness, and the authors also employed the NASA Raw-TLX (RTLX) to quantify task performance [57]. In another study, the authors proposed a situation-dependent remote Augmented Reality (AR) collaboration system offering two functionalities: image-based AR collaboration for limited network or device capabilities, and live video-based AR collaboration with a synchronized VR system. For image-based collaboration, annotations are added directly onto 2D images or a 3D perspective map is generated; in the synchronized VR mode, annotations are attached to a virtual object to eliminate inconsistencies when viewpoints change during collaboration [58]. In remote collaboration, remote users need dimensional knowledge of the location they are interacting with; often, a virtual environment of the physical system is developed to provide this knowledge remotely.
The development of such virtual systems is challenging, and several research studies have addressed developing virtual systems for teleoperation and remote collaboration. The authors of [59] introduced a methodology for creating a point-cloud-based virtual environment incorporating regional haptic constraints and dynamic guidance constraints for teleoperation. The system integrates a virtual robot model with real-time point cloud data and haptic feedback, allowing operators to control the virtual robot using a haptic device. The authors of [55] proposed an AR-based multi-robot communication system using digital twins (DT) and a reinforcement learning (RL) algorithm for teleoperation. The digital twins of the industrial robots are mapped onto the physical robots and visualized using AR glasses, and the RL algorithm is used for robot motion planning, replacing traditional kinematics-based movement to target positions. Table 9 identifies some general manufacturing applications of AR/VR for teleoperation and remote collaboration. The proposed generalized system architecture for implementing AR/VR-based teleoperation applications in manufacturing is shown in Figure 17.
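In their simplest form, the regional constraints described in [59] restrict where the operator can drive the virtual robot; a minimal version is a clamp of the commanded target to an axis-aligned workspace box. The sketch below is a simplified illustration under that assumption, not the authors' implementation.

```python
def clamp_to_workspace(target, lower, upper):
    """Clamp a commanded 3D target position to an axis-aligned workspace box.

    Returns the constrained position and a flag indicating whether the
    original command violated the regional constraint (which a haptic
    device could render as resistance to the operator).
    """
    clamped = tuple(min(max(t, lo), hi)
                    for t, lo, hi in zip(target, lower, upper))
    return clamped, clamped != tuple(target)
```

Real systems derive the admissible region from the point cloud of the workspace rather than a fixed box, and render the constraint as a continuous restoring force instead of a hard clamp.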
Teleoperation applications were also found more often in VR than in AR, as the more immersive format allows the user to simulate being near the equipment. This does not mean, however, that AR cannot be used for teleoperation applications.

3.3.5. Human-Robot Collaboration Applications

In the development of any manufacturing application, human interaction is an integral component. While robotics and automation have significantly advanced manufacturing processes, human interaction remains essential to successful manufacturing operations. For this reason, human-robot collaboration has become a prominent field of research, and AR and VR are among the leading technologies used to make it more interactive. In one study, the authors proposed a multi-robot collaborative manufacturing system with AR assistance, in which humans use AR glasses to determine the position of the physical robot's end-effector and an RL-based motion planning algorithm calculates a feasible joint-value solution [60]. In another study, the authors proposed a human-robot collaboration (HRC) system using VR for human-robot simulation. The authors used Tecnomatix Process Simulate (TPS), Universal Robots, and an HTC Vive VR headset to perform an event-driven human-robot collaboration simulation [53].
The authors of [54] proposed an AR-based human-robot collaboration system with a closed compensation mechanism. The proposed system used four layers: the AR Layer, the Virtual Environment, the Robot Operating System (ROS), and the KUKA Robot Controller layer. Microsoft HoloLens was used to interact with users, allowing them to control a virtual robot model overlaid on an actual robot using gestures, and user-defined trajectories were converted into robot commands.
Authors of [61] leveraged a Mixed Reality (MR) system, deep learning, and digital twin technology to develop a Human-Robot Collaboration (HRC) system by calculating real-time minimum safety distances. The system also provides task assistance to users using Microsoft HoloLens 2. Table 10 identifies some general manufacturing applications of AR/VR for human-robot collaboration. The proposed generalized system architecture for implementing human-robot collaboration using AR/VR is shown in Figure 18.
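The real-time minimum safety distance in [61] can be approximated as the closest distance between tracked human and robot keypoints, compared against a protective threshold. The sketch below is a simplified illustration in the spirit of ISO/TS 15066 speed-and-separation monitoring; the keypoint sets and the 0.5 m default are purely illustrative, not from the cited work.

```python
import math

def min_distance(human_points, robot_points):
    """Smallest Euclidean distance between two sets of tracked 3D keypoints."""
    return min(math.dist(h, r)
               for h in human_points
               for r in robot_points)

def hrc_safe(human_points, robot_points, protective_distance=0.5):
    """True if the human-robot separation exceeds the protective distance."""
    return min_distance(human_points, robot_points) > protective_distance
```

A deployed system would compute the protective distance dynamically from robot and human speeds and sensor latency, and would trigger a slowdown or stop, with the MR headset visualizing the shrinking safety margin.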
The statistical analysis of research papers focusing on applications and emerging technologies is illustrated in Figure 19. The red scale within the figure represents applications, while the grey scale denotes the different technologies utilized. The findings reveal a notable surge in interest towards AR/VR applications for manufacturing across various domains in recent years.
Figure 20 presents detailed statistics on the utilization of AR/VR devices within the reviewed papers. It is observed that Head-Mounted Displays (HMDs) emerge as the preferred choice for immersive visualization among these devices. This preference may be attributed to the development of new Software Development Kits (SDKs), resulting in more accessible and stable application development compared to previous years. Additionally, the rise of emerging hardware has facilitated the creation of more affordable and user-friendly AR/VR applications.
Furthermore, Figure 21 illustrates the statistics on tracking technologies employed in AR/VR applications. Marker-based tracking remains popular due to its ease of implementation. However, there has been a growing trend towards feature-based tracking over the years, driven by its ability to track immersive content without the need for physical markers. The 2023 to 2024 papers, by contrast, show little to no use of model-based tracking. This could be due to the increasing ease of creating feature-based tracking models: more and more consumer devices can create these models without external tools, and many smartphones now ship with built-in LiDAR sensors. Most of the papers that use handheld devices (HHDs) for viewing rely on marker-based tracking methods; these are often preliminary applications that avoid the need to invest in expensive hardware.

3.4. RQ4: What Are the Challenges for the Adaptation of AR/VR Applications in Manufacturing?

There are many challenges in applying AR/VR applications in manufacturing environments. These challenges are categorized as technological challenges, organizational challenges, and environmental challenges.

3.4.1. Technological Challenges

In this section, different technological challenges of AR/VR technologies in manufacturing applications are discussed in detail.
  • Tracking and registration
Tracking and registration are the most vital parts of any AR/VR application, and real-time tracking is preferred for AR/VR-based manufacturing applications. Different tracking technologies are discussed in Section 3.2.2. Among them, marker-based tracking is the most widely used and the easiest choice for AR/VR-based manufacturing applications, and for this reason many researchers have used it for manufacturing applications [62,63,64,65,66]. Marker-based tracking systems, however, face challenges from marker occlusion, in which part or all of a marker becomes invisible to the tracking system [67]. As manufacturing systems are complex in nature, marker occlusion can easily affect any movable parts that need to be tracked. The alternative approach is markerless tracking, in which simultaneous localization and mapping (SLAM) is commonly used [26]. In one study, the authors developed a rapid development toolkit using SLAM and QR codes for augmented reality visualization of a factory [68]. AI-based deep learning techniques can also be used for markerless tracking. However, there are many challenges in applying these technologies with immersive technologies for markerless tracking in the manufacturing environment. Manufacturing environments contain products with limited variation, such as nuts and bolts of different sizes but identical shapes; these products are harder to differentiate, making it challenging for object detection algorithms to detect the desired objects. Moreover, training these algorithms is often resource-intensive and time-consuming, which makes them harder to implement in AR/VR systems [39]. Additionally, collecting data from the manufacturing process is challenging, and there are data privacy issues in manufacturing industries. Another challenging part of implementing these algorithms in AR/VR systems is real-time data transmission.
Modular AR/VR devices are resource-constrained and have limited computation power; in most cases they cannot run AI algorithms. In this scenario, object detection algorithms need to run on a remote server, and decisions have to be transferred to the AR/VR systems over the network in real time, which is very difficult to implement [39]. The challenges of implementing markerless tracking with AR/VR systems in manufacturing are listed in Figure 22. Registration of virtual components in the immersive environment depends on successful tracking of the object [67]. The accuracy of the tracking and the correct alignment of virtual components in the real environment are challenging to achieve in complex manufacturing settings.
  • Evaluation of AR/VR devices
There are many AR/VR devices available, but selecting the proper device for a specific manufacturing application can be challenging. Selection depends on various factors such as price, battery life, desired field of view, optics technology, camera type, open API support, audio technology, sensor technologies, haptic controls, processor, memory and storage, data connectivity options, operating system, ingress protection (IP rating), and appearance, among others. There is no standardized benchmarking tool for the evaluation of AR/VR devices for manufacturing applications [69]. In one study, the authors developed a VR benchmark for assembly assessment, but generalizing the benchmark to any manufacturing environment is complex [70].
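Absent a standardized benchmark, a common stopgap for device selection is a weighted decision matrix over the factors listed above. The sketch below shows the idea; the device names, factor scores, and weights are purely illustrative.

```python
def rank_devices(devices, weights):
    """Rank candidate AR/VR devices by a weighted sum of per-factor scores.

    `devices` maps device name -> {factor: score}; `weights` maps
    factor -> importance. Factors missing from a device score 0.
    """
    totals = {
        name: sum(weights[f] * scores.get(f, 0) for f in weights)
        for name, scores in devices.items()
    }
    # Highest weighted total first.
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)
```

The weakness of this approach, and the reason a standardized benchmark is still needed, is that both the scores and the weights are subjective and rarely transferable between manufacturing environments.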
  • Development Challenges
The advancement of AR/VR in the industrial sector faces a multitude of challenges that hinder its progression. Primarily, the absence of universally applicable methods for designing applications is a notable impediment. AR development typically operates on a bespoke, case-by-case basis, tailored to the unique requirements of individual systems. This approach can prove cost-prohibitive, particularly when considering the need to design and implement distinct applications for various sectors within a manufacturing plant [10]. The absence of a 'one-size-fits-all' solution also extends to the compatibility of AR/VR software with different hardware platforms. When developing an application, the setup procedures are closely tied to the specific software and hardware combination for which the application is intended. For instance, an application designed for the HoloLens platform cannot be seamlessly transferred to another Head-Mounted Display (HMD) device [2]. These developmental challenges are further exacerbated by the demand for job-specific solutions. Diverse job roles necessitate distinct hardware and software requirements, influenced by factors such as environmental conditions, machinery configurations, and the availability of specific models. Achieving a harmonious approach to application development across different areas within a single environment remains a formidable task. Currently, the industry lacks convenient 'drag-and-drop' solutions, which could potentially reduce deployment costs and standardize application development practices.
  • Cybersecurity threat
Cybersecurity threats are among the top concerns when implementing AR/VR-based manufacturing applications. Confidentiality, integrity, and availability (CIA); data theft; observation attacks on graphical PINs in 2D; security, privacy, and safety (SPS); granular authentication and authorization; and latency problems are the most frequent cybersecurity concerns in AR/VR systems [71]. Therefore, AR/VR-based manufacturing applications must be developed carefully. If the system is exposed, attackers can easily obtain confidential information, which could cause a manufacturing company substantial financial loss.

3.4.2. Organizational Challenges

  • User acceptance
In the field of manufacturing, Augmented Reality (AR) and Virtual Reality (VR) are relatively new technologies with huge transformative potential. Given their novelty, acceptance and adoption by end-users are imperative for successful integration. Consequently, many scholarly articles have focused on understanding user acceptance and the usability of AR/VR applications within manufacturing contexts. Various researchers have taken systematic approaches to introducing AR/VR systems tailored to manufacturing environments, emphasizing the critical need for professional and strategic implementation of these technologies. The authors of [21,72] identified an architecture framework for the systematic development of AR applications, which can also be adopted for developing VR applications. The framework consists of five stages: a concept and theory stage, an implementation stage, an evaluation stage, an industry adaptation stage, and an intelligent application solution stage [21,72]. In each phase, careful consideration and user evaluation are needed so that the applications are well accepted in the manufacturing setting, and these are very challenging tasks. For this reason, AR and VR technologies face challenges in being adopted in the manufacturing industries.
Many of the papers conducted field studies to examine how their application improved different metrics and how users reacted to it. These studies ranged from 5 to 50 users at a time. While some of these studies were conducted with large groups, they did not last long enough to yield significant findings. Large-scale field research with long usage periods is needed to establish how these applications would perform in industry, which could improve adoption not only by users but by companies as well.
  • Return on investment
AR/VR can deliver a large return on investment, especially where workforce turnover rates are high. However, this depends on the accuracy, ease of use, and speed with which the application can be developed. The accuracy and ease of use of an application affect the knowledge retention of the user, but this cannot yet be evaluated at a higher level, as no studies with sufficiently large field groups have determined whether the applications are effective. Currently, most studies are based on the time to complete tasks. While this is a good metric, more can be done when determining the effectiveness of AR/VR. There is also a lack of common knowledge about the use of AR/VR, as it is still an emerging technology for the public: people may know what it is, but common practices and interactions when using applications are unfamiliar, which creates a learning curve when implementing one of these systems.

3.4.3. Environmental Challenges

The environment has many different factors that can prove difficult for AR/VR systems to handle, including lighting, atmosphere, and moving bodies. Of these, occlusion receives much of the attention. Occlusion can involve having to exclude objects in the user's field of view, such as other people, the user's hands, and other structures in the environment. As the AR system is the last layer of vision for the user, without occlusion handling the virtual content can be displayed on top of objects it shouldn't be, disrupting the flow of the application and the user experience [73]. Occlusion handling also places larger processing strains on the system, which reduces the realism of the simulation and in some cases causes discomfort to the user. Many solutions have been proposed for occluding specific types of objects in the field of view, but there is not yet an all-inclusive solution [26]. CAD-based tracking has also been seen to improve handling of partial occlusions and rapid motion, but it relies on the availability of CAD models [10].
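At its core, most occlusion handling reduces to a per-pixel depth test: a virtual fragment is drawn only where it is closer to the camera than the sensed real-world surface. A minimal sketch over depth maps as nested lists (a real system would run this comparison on the GPU against a depth-sensor feed):

```python
def occlusion_mask(real_depth, virtual_depth, far=float("inf")):
    """Per-pixel visibility of virtual content against sensed real depth.

    `real_depth` comes from a depth sensor; `virtual_depth` from rendering
    the virtual object (`far` where no virtual content exists). A virtual
    pixel is visible only if it is nearer than the real surface.
    """
    return [
        [v < far and v < r for r, v in zip(rrow, vrow)]
        for rrow, vrow in zip(real_depth, virtual_depth)
    ]
```

The practical difficulty noted above comes from noisy, incomplete real-depth maps (hands, moving people, reflective machinery), which is why robust general-purpose occlusion remains unsolved.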
The identified emerging technologies and the problems faced by AR and VR correlate directly with each other. While many of the technologies are being researched individually, they should be examined alongside each other, allowing different scenarios to be studied with multiple problems in play. For example, many of the papers choose a single tracking type when developing an application, even when that may not be the most optimal solution. Evaluating different challenges during development and testing, and reporting on them, would allow newer papers to overcome these problems. The challenges faced by AR and VR are often discussed separately; this paper shows that they often share the same problems. When conducting literature reviews in the future, researchers can look into both realms to find challenges and solutions.

4. Conclusion

In this paper, a detailed examination of the applications of Augmented Reality (AR) and Virtual Reality (VR) within the context of the manufacturing industry was conducted. Both software and hardware components utilized in these applications were meticulously categorized and identified.
Research Question 1 (RQ1) systematically identified the current key applications of AR/VR in manufacturing by reviewing and analyzing various studies and examples. AR/VR applications enhance manufacturing processes, including maintenance, assembly, operations, training, product design, and quality control. The potential of AR/VR technologies extends beyond these core areas to include inventory management, supply chain optimization, and customer engagement; these extended applications are not extensively covered in the reviewed literature and will be explored in future research.
Research Question 2 (RQ2) identifies the state-of-the-art technologies in AR/VR applications for manufacturing, offering a comprehensive overview of key components, hardware, and software utilized in this domain. Cutting-edge technologies were examined which include tracking systems, display devices, game engines, AR libraries, and VR development kits.
Research Question 3 (RQ3) highlights emerging technologies in AR/VR applications for manufacturing, providing a detailed exploration of cutting-edge advancements such as edge applications, AI-based applications, digital twin applications, teleoperation and remote collaboration applications, and human-robot collaboration applications. By synthesizing insights from various studies, the review identifies key technological trends of AR/VR technologies in modern manufacturing practices. However, challenges exist in implementing these technologies. There is a lack of standardized architectures for their implementation, and challenges related to interoperability and scalability persist. More industry-specific case studies and applications are needed for the large-scale adoption of these technologies.
Research Question 4 (RQ4) identifies challenges across technological, organizational, and environmental domains for the implementation of AR/VR-based applications in manufacturing. The section highlights the key technological hurdles like tracking and registration complexities, device selection and evaluation, development challenges, and cybersecurity threats that impact AR/VR implementation in manufacturing. Additionally, it highlights organizational obstacles such as user acceptance and return on investment, underscoring the importance of strategic frameworks for technology adoption. Research gaps remain in standardizing architectures and benchmarking tools for AR/VR devices in manufacturing, as well as in developing industry-specific case studies to facilitate broader technology adoption. Further research is needed to address environmental challenges like occlusion handling and other factors influencing AR/VR usability in manufacturing settings.
In conclusion, this paper provides a comprehensive overview of the applications, technologies, and challenges associated with implementing AR/VR in the manufacturing sector. Key findings highlight the diverse range of applications within manufacturing, from enhancing operations to exploring emerging technologies. Despite technological advancements, challenges such as tracking complexities and user acceptance barriers underscore the need for further research and standardized frameworks to maximize the potential of AR/VR in manufacturing.

Acknowledgements

This work is funded in part by NSF Award 2119654 “RII Track 2 FEC: Enabling Factory to Factory (F2F) Networking for Future Manufacturing,” and “Enabling Factory to Factory (F2F) Networking for Future Manufacturing across South Carolina,” funded by South Carolina Research Authority. Any opinions, findings, conclusions, or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the sponsors.

Declaration of Competing Interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

Disclosure of AI Usage

During the preparation of this work, the author(s) used ChatGPT to improve the readability, grammar, and language of the paper. After using this tool/service, the author(s) reviewed and edited the content as needed and take(s) full responsibility for the content of the publication.

References

  1. V. Reljić, I. Milenković, S. Dudić, J. Šulc, and B. Bajči, “Augmented Reality Applications in Industry 4.0 Environment,” Applied Sciences, vol. 11, no. 12, p. 5592, Jun. 2021. [CrossRef]
  2. J. Egger and T. Masood, “Augmented reality in support of intelligent manufacturing – A systematic literature review,” Comput Ind Eng, vol. 140, p. 106195, Feb. 2020. [CrossRef]
  3. S. C. Y. Lu, M. Shpitalni, and R. Gadh, “Virtual and Augmented Reality Technologies for Product Realization,” CIRP Annals, vol. 48, no. 2, pp. 471–495, Jan. 1999. [CrossRef]
  4. P. Milgram and F. Kishino, “A Taxonomy of Mixed Reality Visual Displays,” IEICE Trans Inf Syst, vol. E77-D, no. 12, pp. 1321–1329, Dec. 1994, Accessed: Nov. 26, 2022. [Online]. Available: https://search.ieice.org/bin/summary.php?id=e77-d_12_1321&category=D&year=1994&lang=E&abst=.
  5. R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier, and B. MacIntyre, “Recent advances in augmented reality,” IEEE Comput Graph Appl, vol. 21, no. 6, pp. 34–47, Nov. 2001. [CrossRef]
  6. “Virtual reality - Wikipedia.” Accessed: Sep. 12, 2023. [Online]. Available: https://en.wikipedia.org/wiki/Virtual_reality.
  7. “Virtual Reality: The Promising Future of Immersive Technology.” Accessed: Sep. 12, 2023. [Online]. Available: https://www.g2.com/articles/virtual-reality.
  8. M. Neges and C. Koch, “Augmented reality supported work instructions for onsite facility maintenance,” Jun. 30, 2016. Accessed: Oct. 10, 2022. [Online]. Available: https://nottingham-repository.worktribe.com/index.php/output/792905/augmented-reality-supported-work-instructions-for-onsite-facility-maintenance.
  9. Md. F. Rahman, R. Pan, J. Ho, and T.-L. (Bill) Tseng, “A Review of Augmented Reality Technology and its Applications in Digital Manufacturing,” SSRN Electronic Journal, Mar. 2022. [CrossRef]
  10. R. Palmarini, J. A. Erkoyuncu, R. Roy, and H. Torabmostaedi, “A systematic review of augmented reality applications in maintenance,” Robot Comput Integr Manuf, vol. 49, pp. 215–228, Feb. 2018. [CrossRef]
  11. A. Booth, A. Sutton, M. Clowes, and M. Martyn-St James, “Systematic approaches to a successful literature review,” 2021, Accessed: Nov. 26, 2022. [Online]. Available: https://books.google.com/books?hl=en&lr=&id=SiExEAAAQBAJ&oi=fnd&pg=PT25&ots=vrXzyg4FZH&sig=ij_jOB6bZgPdfGUfgUdfionJuQo.
  12. D. Mourtzis, J. Angelopoulos, and N. Panopoulos, “A Framework for Automatic Generation of Augmented Reality Maintenance & Repair Instructions based on Convolutional Neural Networks,” Procedia CIRP, vol. 93, pp. 977–982, Jan. 2020. [CrossRef]
  13. X. Wang, S. K. Ong, and A. Y. C. Nee, “Multi-modal augmented-reality assembly guidance based on bare-hand interface,” Advanced Engineering Informatics, vol. 30, no. 3, pp. 406–421, Aug. 2016. [CrossRef]
  14. M. Fiaz and S. K. Jung, “Handcrafted and Deep Trackers: Recent Visual Object Tracking Approaches and Trends,” ACM Comput Surv, vol. 52, no. 2, p. 43, 2019. [CrossRef]
  15. R. De Amicis, A. Ceruti, D. Francia, L. Frizziero, and B. Simões, “Augmented Reality for virtual user manual,” International Journal on Interactive Design and Manufacturing, vol. 12, no. 2, pp. 689–697, May 2018. [CrossRef]
  16. B. Camburn et al., “Design prototyping methods: state of the art in strategies, techniques, and guidelines,” Design Science, vol. 3, p. e13, 2017. [CrossRef]
  17. L. Kent, C. Snider, J. Gopsill, and B. Hicks, “Mixed reality in design prototyping: A systematic review,” Des Stud, vol. 77, p. 101046, Nov. 2021. [CrossRef]
  18. A. Berni and Y. Borgianni, “Applications of Virtual Reality in Engineering and Product Design: Why, What, How, When and Where,” Electronics, vol. 9, no. 7, p. 1064, Jun. 2020. [CrossRef]
  19. U. Urbas, R. Vrabič, and N. Vukašinović, “Displaying Product Manufacturing Information in Augmented Reality for Inspection,” Procedia CIRP, vol. 81, pp. 832–837, Jan. 2019. [CrossRef]
  20. M. Schumann, C. Fuchs, C. Kollatsch, and P. Klimant, “Evaluation of augmented reality supported approaches for product design and production processes,” Procedia CIRP, vol. 97, pp. 160–165, Jan. 2021. [CrossRef]
  21. P. T. Ho, J. A. Albajez, J. Santolaria, and J. A. Yagüe-Fabra, “Study of Augmented Reality Based Manufacturing for Further Integration of Quality Control 4.0: A Systematic Literature Review,” Applied Sciences, vol. 12, no. 4, p. 1961, Feb. 2022. [CrossRef]
  22. Y. Zhao, X. An, and N. Sun, “Virtual simulation experiment of the design and manufacture of a beer bottle-defect detection system.,” Virtual Reality & Intelligent Hardware, vol. 2, no. 4, pp. 354–367, Aug. 2020. [CrossRef]
  23. F. Ferraguti et al., “Augmented reality based approach for on-line quality assessment of polished surfaces,” Robot Comput Integr Manuf, vol. 59, pp. 158–167, Oct. 2019. [CrossRef]
  24. A. E. Uva, M. Gattullo, V. M. Manghisi, D. Spagnulo, G. L. Cascella, and M. Fiorentino, “Evaluating the effectiveness of spatial augmented reality in smart manufacturing: a solution for manual working stations,” The International Journal of Advanced Manufacturing Technology, vol. 94, no. 1, pp. 509–521, Feb. 2017. [CrossRef]
  25. M. Lorenz, S. Knopp, and P. Klimant, “Industrial Augmented Reality: Requirements for an Augmented Reality Maintenance Worker Support System,” IEEE Access, 2018.
  26. M. Eswaran and M. V. A. R. Bahubalendruni, “Challenges and opportunities on AR/VR technologies for manufacturing systems in the context of industry 4.0: A state of the art review,” J Manuf Syst, vol. 65, pp. 260–278, Oct. 2022. [CrossRef]
  27. J. K. Ford and T. Höllerer, “Augmented Reality and the Future of Virtual Workspaces,” IGI Global, pp. 486–502. [CrossRef]
  28. P. Rupprecht, H. Kueffner-Mccauley, M. Trimmel, and S. Schlund, “Adaptive Spatial Augmented Reality for Industrial Site Assembly,” Procedia CIRP, vol. 104, pp. 405–410, Jan. 2021. [CrossRef]
  29. A. Blaga, C. Militaru, A. D. Mezei, and L. Tamas, “Augmented reality integration into MES for connected workers,” Robot Comput Integr Manuf, vol. 68, p. 102057, Feb. 2021. [CrossRef]
  30. E. Tzimas, G. C. Vosniakos, and E. Matsas, “Machine tool setup instructions in the smart factory using augmented reality: a system construction perspective,” International Journal on Interactive Design and Manufacturing, vol. 13, no. 1, pp. 121–136, Feb. 2019. [CrossRef]
  31. “VR Hardware.” Accessed: Sep. 12, 2023. [Online]. Available: https://www.hitl.washington.edu/projects/learning_center/pf/whatvr1.htm#id.
  32. “Top 10 Virtual Reality Software Development Tools.” Accessed: Dec. 01, 2022. [Online]. Available: https://beam.eyeware.tech/top-10-virtual-reality-software-development-tools-gamers/.
  33. “GitHub - ValveSoftware/openvr: OpenVR SDK.” Accessed: Sep. 13, 2023. [Online]. Available: https://github.com/ValveSoftware/openvr.
  34. “The best 5 VR SDKs for Interactions for Unity & Unreal.” Accessed: Sep. 13, 2023. [Online]. Available: https://xrbootcamp.com/the-best-5-vr-sdk-for-interactions/.
  35. K. Stecuła, “Virtual Reality Applications Market Analysis—On the Example of Steam Digital Platform,” Informatics, vol. 9, no. 4, p. 100, Dec. 2022. [CrossRef]
  36. E. Bottani et al., “Wearable and interactive mixed reality solutions for fault diagnosis and assistance in manufacturing systems: Implementation and testing in an aseptic bottling line,” Comput Ind, vol. 128, p. 103429, Jun. 2021. [CrossRef]
  37. L. Pérez, E. Diez, R. Usamentiaga, and D. F. García, “Industrial robot control and operator training using virtual reality interfaces,” Comput Ind, vol. 109, pp. 114–120, Aug. 2019. [CrossRef]
  38. J. Wang, Y. Ma, L. Zhang, R. X. Gao, and D. Wu, “Deep learning for smart manufacturing: Methods and applications,” J Manuf Syst, vol. 48, pp. 144–156, Jul. 2018. [CrossRef]
  39. C. K. Sahu, C. Young, and R. Rai, “Artificial intelligence (AI) in augmented reality (AR)-assisted manufacturing applications: a review,” International Journal of Production Research, vol. 59, no. 16, pp. 4903–4959, 2020. [CrossRef]
  40. P. Dollar, R. Appel, S. Belongie, and P. Perona, “Fast feature pyramids for object detection,” IEEE Trans Pattern Anal Mach Intell, vol. 36, no. 8, pp. 1532–1545, 2014. [CrossRef]
  41. K. B. Park, M. Kim, S. H. Choi, and J. Y. Lee, “Deep learning-based smart task assistance in wearable augmented reality,” Robot Comput Integr Manuf, vol. 63, Jun. 2020. [CrossRef]
  42. R. Girshick, J. Donahue, T. Darrell, and J. Malik, “Rich Feature Hierarchies for Accurate Object Detection and Semantic Segmentation,” Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 1, p. 5000, Sep. 2014. [CrossRef]
  43. Z. H. Lai, W. Tao, M. C. Leu, and Z. Yin, “Smart augmented reality instructional system for mechanical assembly towards worker-centered intelligent manufacturing,” J Manuf Syst, vol. 55, pp. 69–81, Apr. 2020. [CrossRef]
  44. “YOLOv5 | PyTorch.” Accessed: Sep. 17, 2023. [Online]. Available: https://pytorch.org/hub/ultralytics_yolov5/.
  45. J. Lambrecht, L. Kästner, J. Guhl, and J. Krüger, “Towards commissioning, resilience and added value of Augmented Reality in robotics: Overcoming technical obstacles to industrial applicability,” Robot Comput Integr Manuf, vol. 71, p. 102178, Oct. 2021. [CrossRef]
  46. F. De Felice, A. R. Cannito, D. Monte, and F. Vitulano, “S.A.M.I.R.: Supporting Tele-Maintenance with Integrated Interaction Using Natural Language and Augmented Reality,” Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics), vol. 12936 LNCS, pp. 280–284, 2021. [CrossRef]
  47. J. Izquierdo-Domenech, J. Linares-Pellicer, and J. Orta-Lopez, “Towards achieving a high degree of situational awareness and multimodal interaction with AR and semantic AI in industrial applications,” Multimed Tools Appl, vol. 82, no. 10, pp. 15875–15901, Apr. 2023. [CrossRef]
  48. S. Akbarinasaji and E. Homayounvala, “A novel context-aware augmented reality framework for maintenance systems,” J Ambient Intell Smart Environ, vol. 9, no. 3, pp. 315–327, 2017. [CrossRef]
  49. A. J. C. Trappey, C. V. Trappey, M. H. Chao, and C. T. Wu, “VR-enabled engineering consultation chatbot for integrated and intelligent manufacturing services,” J Ind Inf Integr, vol. 26, p. 100331, Mar. 2022. [CrossRef]
  50. G. Boboc et al., “A VR-Enabled Chatbot Supporting Design and Manufacturing of Large and Complex Power Transformers,” Electronics, vol. 11, no. 1, p. 87, Dec. 2021. [CrossRef]
  51. E. VanDerHorn and S. Mahadevan, “Digital Twin: Generalization, characterization and implementation,” Decis Support Syst, vol. 145, p. 113524, Jun. 2021. [CrossRef]
  52. S. K. Ong, A. Y. C. Nee, A. W. W. Yew, and N. K. Thanigaivel, “AR-assisted robot welding programming,” Advances in Manufacturing, vol. 8, no. 1, pp. 40–48, Nov. 2019. [CrossRef]
  53. A. A. Malik, T. Masood, and A. Bilberg, “Virtual reality in manufacturing: immersive and collaborative artificial-reality in design of human-robot workspace,” International Journal of Computer Integrated Manufacturing, vol. 33, no. 1, pp. 22–37, Jan. 2019. [CrossRef]
  54. X. V. Wang, L. Wang, M. Lei, and Y. Zhao, “Closed-loop augmented reality towards accurate human-robot collaboration,” CIRP Annals, vol. 69, no. 1, pp. 425–428, Jan. 2020. [CrossRef]
  55. P. Wang et al., “AR/MR Remote Collaboration on Physical Tasks: A Review,” Robot Comput Integr Manuf, vol. 72, p. 102071, Dec. 2021. [CrossRef]
  56. S. Arevalo and F. Rucker, “Assisting manipulation and grasping in robot teleoperation with augmented reality visual cues,” Conference on Human Factors in Computing Systems - Proceedings, May 2021. [CrossRef]
  57. S. G. Hart and L. E. Staveland, “Development of NASA-TLX (Task Load Index): Results of Empirical and Theoretical Research,” Advances in Psychology, vol. 52, no. C, pp. 139–183, Jan. 1988. [CrossRef]
  58. S. H. Choi, M. Kim, and J. Y. Lee, “Situation-dependent remote AR collaborations: Image-based collaboration using a 3D perspective map and live video-based collaboration with a synchronized VR mode,” Comput Ind, vol. 101, pp. 51–66, Oct. 2018. [CrossRef]
  59. D. Ni, A. Y. C. Nee, S. K. Ong, H. Li, C. Zhu, and A. Song, “Point cloud augmented virtual reality environment with haptic constraints for teleoperation,” Transactions of the Institute of Measurement and Control, vol. 40, no. 15, pp. 4091–4104, Nov. 2018. [CrossRef]
  60. C. Li, P. Zheng, S. Li, Y. Pang, and C. K. M. Lee, “AR-assisted digital twin-enabled robot collaborative manufacturing system with human-in-the-loop,” Robot Comput Integr Manuf, vol. 76, p. 102321, Aug. 2022. [CrossRef]
  61. S. H. Choi et al., “An integrated mixed reality system for safety-aware human-robot collaboration using deep learning and digital twin generation,” Robot Comput Integr Manuf, vol. 73, Feb. 2022. [CrossRef]
  62. E. Marino, L. Barbieri, B. Colacino, A. K. Fleri, and F. Bruno, “An Augmented Reality inspection tool to support workers in Industry 4.0 environments,” Comput Ind, vol. 127, p. 103412, Feb. 2021. [CrossRef]
  63. D. Tatić and B. Tešić, “The application of augmented reality technologies for the improvement of occupational safety in an industrial environment,” Comput Ind, vol. 85, pp. 1–10, Feb. 2017. [CrossRef]
  64. H. Durchon, M. Preda, T. Zaharia, and Y. Grall, “Challenges in Applying Deep Learning to Augmented Reality for Manufacturing,” Proceedings - Web3D 2022: 27th ACM Conference on 3D Web Technology, Nov. 2022. [CrossRef]
  65. M. Holm, O. Danielsson, A. Syberfeldt, P. Moore, and L. Wang, “Adaptive instructions to novice shop-floor operators using Augmented Reality,” vol. 34, no. 5, pp. 362–374, Feb. 2017. [CrossRef]
  66. C. Y. Siew, S. K. Ong, and A. Y. C. Nee, “A practical augmented reality-assisted maintenance system framework for adaptive user support,” Robot Comput Integr Manuf, vol. 59, pp. 115–129, Feb. 2019. [CrossRef]
  67. X. Wang, S. K. Ong, and A. Y. C. Nee, “A comprehensive survey of augmented reality assembly research,” Adv Manuf, vol. 4, no. 1, pp. 1–22, Mar. 2016. [CrossRef]
  68. C. Chen et al., “A Quick Development Toolkit for Augmented Reality Visualization (QDARV) of a Factory,” Applied Sciences, vol. 12, no. 16, p. 8338, Aug. 2022. [CrossRef]
  69. A. Syberfeldt, O. Danielsson, and P. Gustavsson, “Augmented Reality Smart Glasses in the Smart Factory: Product Evaluation Guidelines and Review of Available Products,” IEEE Access, vol. 5, pp. 9118–9130, 2017. [CrossRef]
  70. M. Otto, E. Lampen, P. Agethen, M. Langohr, G. Zachmann, and E. Rukzio, “A Virtual Reality Assembly Assessment Benchmark for Measuring VR Performance & Limitations,” Procedia CIRP, vol. 81, pp. 785–790, Jan. 2019. [CrossRef]
  71. A. Alismail, E. Altulaihan, M. M. H. Rahman, and A. Sufian, “A Systematic Literature Review on Cybersecurity Threats of Virtual Reality (VR) and Augmented Reality (AR),” pp. 761–774, 2023. [CrossRef]
  72. X. Wang, M. J. Kim, P. E. D. Love, and S. C. Kang, “Augmented Reality in built environment: Classification and implications for future research,” Autom Constr, vol. 32, pp. 1–13, Jul. 2013. [CrossRef]
  73. C. K. Yang, Y. H. Chen, T. J. Chuang, K. Shankhwar, and S. Smith, “An augmented reality-based training system with a natural user interface for manual milling operations,” Virtual Real, vol. 24, no. 3, pp. 527–539, Sep. 2020. [CrossRef]
  74. J. M. Runji and C. Y. Lin, “Markerless cooperative augmented reality-based smart manufacturing double-check system: Case of safe PCBA inspection following automatic optical inspection,” Robot Comput Integr Manuf, vol. 64, p. 101957, Aug. 2020. [CrossRef]
  75. C. Liu et al., “Probing an intelligent predictive maintenance approach with deep learning and augmented reality for machine tools in IoT-enabled manufacturing,” Robot Comput Integr Manuf, vol. 77, p. 102357, Oct. 2022. [CrossRef]
  76. M. Dalle Mura and G. Dini, “An augmented reality approach for supporting panel alignment in car body assembly,” J Manuf Syst, vol. 59, pp. 251–260, Apr. 2021. [CrossRef]
  77. S. Li, P. Zheng, and L. Zheng, “An AR-Assisted Deep Learning-Based Approach for Automatic Inspection of Aviation Connectors,” IEEE Trans Industr Inform, vol. 17, no. 3, pp. 1721–1731, Mar. 2021. [CrossRef]
  78. A. Muñoz, A. Martí, X. Mahiques, L. Gracia, J. E. Solanes, and J. Tornero, “Camera 3D positioning mixed reality-based interface to improve worker safety, ergonomics and productivity,” CIRP J Manuf Sci Technol, vol. 28, pp. 24–37, Jan. 2020. [CrossRef]
  79. A. Muñoz, X. Mahiques, J. E. Solanes, A. Martí, L. Gracia, and J. Tornero, “Mixed reality-based user interface for quality control inspection of car body surfaces,” J Manuf Syst, vol. 53, pp. 75–92, Oct. 2019. [CrossRef]
  80. K. Li et al., “AR-Aided Smart Sensing for In-Line Condition Monitoring of IGBT Wafer,” IEEE Transactions on Industrial Electronics, vol. 66, no. 10, pp. 8197–8204, Oct. 2019. [CrossRef]
  81. C. Hofmann, T. Staehr, S. Cohen, N. Stricker, B. Haefner, and G. Lanza, “Augmented Go & See: An approach for improved bottleneck identification in production lines,” Procedia Manuf, vol. 31, pp. 148–154, Jan. 2019. [CrossRef]
  82. A. Burova et al., “Asynchronous industrial collaboration: How virtual reality and virtual tools aid the process of maintenance method development and documentation creation,” Comput Ind, vol. 140, p. 103663, Sep. 2022. [CrossRef]
  83. P. Zheng et al., “An Augmented Reality-Assisted Prognostics and Health Management System Based on Deep Learning for IoT-Enabled Manufacturing,” Sensors, vol. 22, no. 17, p. 6472, Aug. 2022. [CrossRef]
  84. D. Mourtzis, J. Angelopoulos, and V. Zogopoulos, “Integrated and adaptive AR maintenance and shop-floor rescheduling,” Comput Ind, vol. 125, p. 103383, Feb. 2021. [CrossRef]
  85. D. Mourtzis, A. Vlachou, and V. Zogopoulos, “Cloud-based augmented reality remote maintenance through shop-floor monitoring: A product-service system approach,” Journal of Manufacturing Science and Engineering, Transactions of the ASME, vol. 139, no. 6, Feb. 2017. [CrossRef]
  86. J. A. Erkoyuncu, I. F. del Amo, M. D. Mura, R. Roy, and G. Dini, “Improving efficiency of industrial maintenance with context aware adaptive authoring in augmented reality,” CIRP Annals, vol. 66, no. 1, pp. 465–468, Feb. 2017. [CrossRef]
  87. G. W. Scurati, M. Gattullo, M. Fiorentino, F. Ferrise, M. Bordegoni, and A. E. Uva, “Converting maintenance actions into standard symbols for Augmented Reality applications in Industry 4.0,” Comput Ind, vol. 98, pp. 68–79, Jun. 2018. [CrossRef]
  88. C. Y. Siew, A. Y. C. Nee, and S. K. Ong, “Improving Maintenance Efficiency with an Adaptive AR-assisted Maintenance System,” Proceedings of the 2019 4th International Conference on Robotics, Control and Automation, 2019. [CrossRef]
  89. M. Gattullo, G. W. Scurati, M. Fiorentino, A. E. Uva, F. Ferrise, and M. Bordegoni, “Towards augmented reality manuals for industry 4.0: A methodology,” Robot Comput Integr Manuf, vol. 56, pp. 276–286, Apr. 2019. [CrossRef]
  90. L. Gong, Å. Fast-Berglund, and B. Johansson, “A Framework for Extended Reality System Development in Manufacturing,” IEEE Access, vol. 9, pp. 24796–24813, 2021. [CrossRef]
  91. A. Malta, M. Mendes, and T. Farinha, “Augmented Reality Maintenance Assistant Using YOLOv5,” Applied Sciences, vol. 11, no. 11, p. 4758, May 2021. [CrossRef]
  92. A. Ceruti, P. Marzocca, A. Liverani, and C. Bil, “Maintenance in aeronautics in an Industry 4.0 context: The role of Augmented Reality and Additive Manufacturing,” 2019. [CrossRef]
  93. W. Vorraber, J. Gasser, H. Webb, D. Neubacher, and P. Url, “Assessing augmented reality in production: remote-assisted maintenance with HoloLens,” Procedia CIRP, vol. 88, pp. 139–144, Jan. 2020. [CrossRef]
  94. N. Koteleva, G. Buslaev, V. Valnev, and A. Kunshin, “Augmented Reality System and Maintenance of Oil Pumps,” International Journal of Engineering, vol. 33, no. 8, pp. 1620–1628, Feb. 2020. [CrossRef]
  95. M. Eswaran, A. K. Gulivindala, A. K. Inkulu, and M. V. A. Raju Bahubalendruni, “Augmented reality-based guidance in product assembly and maintenance/repair perspective: A state of the art review on challenges and opportunities,” Expert Syst Appl, vol. 213, p. 118983, Mar. 2023. [CrossRef]
  96. S. E. Scheffer, A. Martinetti, R. G. J. Damgrave, and L. A. M. van Dongen, “Supporting maintenance operators using augmented reality decision-making: visualize, guide, decide & track,” Procedia CIRP, vol. 119, pp. 782–787, Jan. 2023. [CrossRef]
  97. J. Simon, L. Gogolák, J. Sárosi, and I. Fürstner, “Augmented Reality Based Distant Maintenance Approach,” Actuators, vol. 12, no. 7, p. 302, Jul. 2023. [CrossRef]
  98. T. Siriborvornratanakul, “Enhancing user experiences of mobile-based augmented reality via spatial augmented reality: Designs and architectures of projector-camera devices,” Advances in Multimedia, vol. 2018, 2018. [CrossRef]
  99. D. Mourtzis, V. Zogopoulos, and E. Vlachou, “Augmented Reality supported Product Design towards Industry 4.0: a Teaching Factory paradigm,” Procedia Manuf, vol. 23, pp. 207–212, Jan. 2018. [CrossRef]
  100. L. P. Berg and J. M. Vance, “An Industry Case Study: Investigating Early Design Decision Making in Virtual Reality,” J Comput Inf Sci Eng, vol. 17, no. 1, Mar. 2017. [CrossRef]
  101. D. Mourtzis, V. Siatras, J. Angelopoulos, and N. Panopoulos, “An Augmented Reality Collaborative Product Design Cloud-Based Platform in the Context of Learning Factory,” Procedia Manuf, vol. 45, pp. 546–551, Jan. 2020. [CrossRef]
  102. V. Ivanov, I. Pavlenko, O. Liaposhchenko, O. Gusak, and V. Pavlenko, “Determination of contact points between workpiece and fixture elements as a tool for augmented reality in fixture design,” Wireless Networks, vol. 27, no. 3, pp. 1657–1664, Feb. 2021. [CrossRef]
  103. X. Chen, L. Gong, A. Berce, B. Johansson, and M. Despeisse, “Implications of Virtual Reality on Environmental Sustainability in Manufacturing Industry: A Case Study,” Procedia CIRP, vol. 104, pp. 464–469, Jan. 2021. [CrossRef]
  104. L. Dammacco, R. Carli, V. Lazazzera, M. Fiorentino, and M. Dotoli, “Designing complex manufacturing systems by virtual reality: A novel approach and its application to the virtual commissioning of a production line,” Comput Ind, vol. 143, p. 103761, Dec. 2022. [CrossRef]
  105. V. Alejandro Huerta-Torruco, Ó. Hernández-Uribe, L. Adriana Cárdenas-Robledo, and N. Amir Rodríguez-Olivares, “Effectiveness of virtual reality in discrete event simulation models for manufacturing systems,” Comput Ind Eng, vol. 168, p. 108079, Jun. 2022. [CrossRef]
  106. C. Anagnostopoulos, G. Mylonas, A. P. Fournaris, and C. Koulamas, “A Design Approach and Prototype Implementation for Factory Monitoring Based on Virtual and Augmented Reality at the Edge of Industry 4.0,” IEEE International Conference on Industrial Informatics (INDIN), vol. 2023-July, 2023. [CrossRef]
  107. M. Hovanec, P. Korba, M. Vencel, and S. Al-Rabeei, “Simulating a Digital Factory and Improving Production Efficiency by Using Virtual Reality Technology,” Applied Sciences, vol. 13, no. 8, p. 5118, Apr. 2023. [CrossRef]
  108. U. Auyeskhan, C. A. Steed, S. Park, D. H. Kim, I. D. Jung, and N. Kim, “Virtual reality-based assembly-level design for additive manufacturing decision framework involving human aspects of design,” J Comput Des Eng, vol. 10, no. 3, pp. 1126–1142, Apr. 2023. [CrossRef]
  109. G. Westerfield, A. Mitrovic, and M. Billinghurst, “Intelligent augmented reality training for motherboard assembly,” Int J Artif Intell Educ, vol. 25, no. 1, pp. 157–172, Feb. 2015. [CrossRef]
  110. X. Qian, J. Tu, and P. Lou, “A general architecture of a 3D visualization system for shop floor management,” J Intell Manuf, vol. 30, no. 4, pp. 1531–1545, Feb. 2019. [CrossRef]
  111. F. Bruno, L. Barbieri, E. Marino, M. Muzzupappa, L. D’Oriano, and B. Colacino, “An augmented reality tool to detect and annotate design variations in an Industry 4.0 approach,” International Journal of Advanced Manufacturing Technology, vol. 105, no. 1–4, pp. 875–887, Feb. 2019. [CrossRef]
  112. Z. Zhu, C. Liu, and X. Xu, “Visualisation of the Digital Twin data in manufacturing by using Augmented Reality,” Procedia CIRP, vol. 81, pp. 898–903, Feb. 2019. [CrossRef]
  113. C. González, J. E. Solanes, A. Muñoz, L. Gracia, V. Girbés-Juan, and J. Tornero, “Advanced teleoperation and control system for industrial robots based on augmented virtuality and haptic feedback,” J Manuf Syst, vol. 59, pp. 283–298, Apr. 2021. [CrossRef]
  114. K. Lotsaris et al., “Augmented Reality (AR) based framework for supporting human workers in flexible manufacturing,” Procedia CIRP, vol. 96, pp. 301–306, Jan. 2021. [CrossRef]
  115. T. Perdpunya, S. Nuchitprasitchai, and P. Boonrawd, “Augmented Reality with Mask R-CNN (ARR-CNN) inspection for Intelligent Manufacturing,” ACM International Conference Proceeding Series, Jun. 2021. [CrossRef]
  116. F. He, S. K. Ong, and A. Y. C. Nee, “An Integrated Mobile Augmented Reality Digital Twin Monitoring System,” Computers, vol. 10, no. 8, p. 99, Aug. 2021. [CrossRef]
  117. T. Treinen and S. S. V. K. Kolla, “Augmented Reality for Quality Inspection, Assembly and Remote Assistance in Manufacturing,” Procedia Comput Sci, vol. 232, pp. 533–543, Jan. 2024. [CrossRef]
  118. X. Yang, J. Yang, H. He, and H. Chen, “A Hybrid 3D Registration Method of Augmented Reality for Intelligent Manufacturing,” IEEE Access, vol. 7, pp. 181867–181883, 2019. [CrossRef]
  119. W. Tao, Z. H. Lai, M. C. Leu, Z. Yin, and R. Qin, “A self-aware and active-guiding training & assistant system for worker-centered intelligent manufacturing,” Manuf Lett, vol. 21, pp. 45–49, Aug. 2019. [CrossRef]
  120. F. Longo, L. Nicoletti, and A. Padovano, “Smart operators in industry 4.0: A human-centered approach to enhance operators’ capabilities and competencies within the new smart factory context,” Comput Ind Eng, vol. 113, pp. 144–159, Feb. 2017. [CrossRef]
  121. K. Helin, T. Kuula, C. Vizzi, J. Karjalainen, and A. Vovk, “User experience of augmented reality system for astronaut’s manual work support,” Frontiers Robotics AI, vol. 5, no. SEP, p. 106, 2018. [CrossRef]
  122. J. Zubizarreta, I. Aguinaga, and A. Amundarain, “A framework for augmented reality guidance in industry,” The International Journal of Advanced Manufacturing Technology, vol. 102, no. 9, pp. 4095–4108, Feb. 2019. [CrossRef]
  123. P. Wang et al., “A gesture- and head-based multimodal interaction platform for MR remote collaboration,” International Journal of Advanced Manufacturing Technology, vol. 105, no. 7–8, pp. 3031–3043, Feb. 2019. [CrossRef]
  124. J. J. Roldán, E. Crespo, A. Martín-Barrio, E. Peña-Tapia, and A. Barrientos, “A training system for Industry 4.0 operators in complex assemblies based on virtual reality and process mining,” Robot Comput Integr Manuf, vol. 59, pp. 305–316, Oct. 2019. [CrossRef]
  125. K. van Lopik, M. Sinclair, R. Sharpe, P. Conway, and A. West, “Developing augmented reality capabilities for industry 4.0 small enterprises: Lessons learnt from a content authoring case study,” Comput Ind, vol. 117, p. 103208, May 2020. [CrossRef]
  126. Z. Wang et al., “Information-level AR instruction: a novel assembly guidance information representation assisting user cognition,” International Journal of Advanced Manufacturing Technology, vol. 106, no. 1–2, pp. 603–626, Feb. 2020. [CrossRef]
  127. F. Pilati, M. Faccio, M. Gamberi, and A. Regattieri, “Learning manual assembly through real-time motion capture for operator training with augmented reality,” Procedia Manuf, vol. 45, pp. 189–195, Feb. 2020. [CrossRef]
  128. M. Eder, M. Hulla, F. Mast, and C. Ramsauer, “On the application of Augmented Reality in a learning factory working environment,” Procedia Manuf, vol. 45, pp. 7–12, Feb. 2020. [CrossRef]
  129. K. B. Borgen, T. D. Ropp, and W. T. Weldon, “Assessment of Augmented Reality Technology’s Impact on Speed of Learning and Task Performance in Aeronautical Engineering Technology Education,” vol. 31, no. 3, pp. 219–229, 2021. [CrossRef]
  130. S. S. V. K. Kolla, A. Sanchez, and P. Plapper, “Comparing software frameworks of Augmented Reality solutions for manufacturing,” Procedia Manuf, vol. 55, no. C, pp. 312–318, Feb. 2021. [CrossRef]
  131. M. Moghaddam, N. C. Wilson, A. S. Modestino, K. Jona, and S. C. Marsella, “Exploring augmented reality for worker assistance versus training,” Advanced Engineering Informatics, vol. 50, p. 101410, Feb. 2021. [CrossRef]
  132. D. Siyapong, P. Rodchom, and A. Eksiri, “The Virtual Reality Technology for Maintenance of Complex Machine in Manufacturing Training,” Srinakharinwirot University Engineering Journal, vol. 16, no. 3, pp. 37–52, Feb. 2021, [Online]. Available: https://ph02.tci-thaijo.org/index.php/sej/article/view/243447.
  133. F. G. Pratticò and F. Lamberti, “Towards the adoption of virtual reality training systems for the self-tuition of industrial robot operators: A case study at KUKA,” Comput Ind, vol. 129, p. 103446, Aug. 2021. [CrossRef]
  134. D. Ariansyah et al., “A head mounted augmented reality design practice for maintenance assembly: Toward meeting perceptual and cognitive needs of AR users,” Appl Ergon, vol. 98, p. 103597, Feb. 2022. [CrossRef]
  135. S. Aivaliotis et al., “An augmented reality software suite enabling seamless human robot interaction,” 2022. [CrossRef]
  136. F. M. Monetti, A. de Giorgio, H. Yu, A. Maffei, and M. Romero, “An experimental study of the impact of virtual reality training on manufacturing operators on industrial robotic tasks,” Procedia CIRP, vol. 106, pp. 33–38, Jan. 2022. [CrossRef]
  137. W. Yan, “Augmented reality instructions for construction toys enabled by accurate model registration and realistic object/hand occlusions,” Virtual Real, vol. 26, no. 2, pp. 465–478, Feb. 2022. [CrossRef]
  138. H. Yun and M. B. G. Jun, “Immersive and interactive cyber-physical system (I2CPS) and virtual reality interface for human involved robotic manufacturing,” J Manuf Syst, vol. 62, pp. 234–248, Jan. 2022. [CrossRef]
  139. R. Zhu, F. Aqlan, R. Zhao, and H. Yang, “Sensor-based modeling of problem-solving in virtual reality manufacturing systems,” Expert Syst Appl, vol. 201, p. 117220, Sep. 2022. [CrossRef]
  140. J. Geng et al., “A systematic design method of adaptive augmented reality work instruction for complex industrial operations,” Comput Ind, vol. 119, p. 103229, Feb. 2020. [CrossRef]
  141. M. Eswaran and M. V. A. Raju Bahubalendruni, “Augmented reality aided object mapping for worker assistance/training in an industrial assembly context: Exploration of affordance with existing guidance techniques,” Comput Ind Eng, vol. 185, p. 109663, Nov. 2023. [CrossRef]
  142. B. S. Tan, T. J. Chong, and Y. Y. Chew, “Usability Study of Augmented Reality Training Application for Can Manufacturing Company,” ICDXA 2024 - Conference Proceedings: 2024 3rd International Conference on Digital Transformation and Applications, pp. 27–32, 2024. [CrossRef]
  143. V. Holuša, M. Vaněk, F. Beneš, J. Švub, and P. Staša, “Virtual Reality as a Tool for Sustainable Training and Education of Employees in Industrial Enterprises,” Sustainability, vol. 15, no. 17, p. 12886, Aug. 2023. [CrossRef]
  144. N. N. Liyanawaduge, E. M. H. K. Kumarasinghe, S. S. Iyer, A. K. Kulatunga, and G. Lakmal, “Digital Twin & Virtual Reality Enabled Conveyor System to Promote Learning Factory Concept,” 2023 IEEE 17th International Conference on Industrial and Information Systems, ICIIS 2023 - Proceedings, pp. 85–90, 2023. [CrossRef]
  145. D. Scorgie, Z. Feng, D. Paes, F. Parisi, T. W. Yiu, and R. Lovreglio, “Virtual reality for safety training: A systematic literature review and meta-analysis,” Saf Sci, vol. 171, p. 106372, Mar. 2024. [CrossRef]
  146. S. Makris, P. Karagiannis, S. Koukas, and A. S. Matthaiakis, “Augmented reality system for operator support in human–robot collaborative assembly,” CIRP Annals, vol. 65, no. 1, pp. 61–64, Feb. 2016. [CrossRef]
  147. X. Wang, S. K. Ong, and A. Y. C. Nee, “Multi-modal augmented-reality assembly guidance based on bare-hand interface,” Advanced Engineering Informatics, vol. 30, no. 3, pp. 406–421, Feb. 2016. [CrossRef]
  148. R. Hanson, W. Falkenström, and M. Miettinen, “Augmented reality as a means of conveying picking information in kit preparation for mixed-model assembly,” Comput Ind Eng, vol. 113, pp. 570–575, Feb. 2017. [CrossRef]
  149. M. Hoover, S. Gilbert, and J. Oliver, “An evaluation of the Microsoft HoloLens for a manufacturing-guided assembly task.”
  150. Y. Wang, S. Zhang, B. Wan, W. He, and X. Bai, “Point cloud and visual feature-based tracking method for an augmented reality-aided mechanical assembly system,” The International Journal of Advanced Manufacturing Technology, vol. 99, no. 9, pp. 2341–2352, 2018. [CrossRef]
  151. D. Mourtzis, V. Zogopoulos, and F. Xanthi, “Augmented reality application to support the assembly of highly customized products and to adapt to production re-scheduling,” International Journal of Advanced Manufacturing Technology, vol. 105, no. 9, pp. 3899–3910, Feb. 2019. [CrossRef]
  152. J. C. Arbeláez, R. Viganò, and G. Osorio-Gómez, “Haptic Augmented Reality (HapticAR) for assembly guidance,” International Journal on Interactive Design and Manufacturing, vol. 13, no. 2, pp. 673–687, Feb. 2019. [CrossRef]
  153. W. Tao, Z.-H. Lai, and M. C. Leu, “Manufacturing Assembly Simulations in Virtual and Augmented Reality.”
  154. C. Y. Tsai, T. Y. Liu, Y. H. Lu, and H. Nisar, “A novel interactive assembly teaching aid using multi-template augmented reality,” Multimed Tools Appl, vol. 79, no. 43–44, pp. 31981–32009, Feb. 2020. [CrossRef]
  155. P. Horejsi, K. Novikov, and M. Simon, “A smart factory in a smart city: Virtual and augmented reality in a smart assembly line,” IEEE Access, vol. 8, pp. 94330–94340, 2020. [CrossRef]
  156. T. Masood and J. Egger, “Adopting augmented reality in the age of industrial digitalisation,” Comput Ind, vol. 115, p. 103112, Feb. 2020. [CrossRef]
  157. M. M. L. Chang, A. Y. C. Nee, and S. K. Ong, “Interactive AR-assisted product disassembly sequence planning (ARDIS),” vol. 58, no. 16, pp. 4916–4931, Aug. 2020. [CrossRef]
  158. C. H. Chu and C. H. Ko, “An experimental study on augmented reality assisted manual assembly with occluded components,” J Manuf Syst, vol. 61, pp. 685–695, Oct. 2021. [CrossRef]
  159. F. Schuster, B. Engelmann, U. Sponholz, and J. Schmitt, “Human acceptance evaluation of AR-assisted assembly scenarios,” J Manuf Syst, vol. 61, pp. 660–672, Oct. 2021. [CrossRef]
  160. Z. Wang et al., “M-AR: A Visual Representation of Manual Operation Precision in AR Assembly,” vol. 37, no. 19, pp. 1799–1814, 2021. [CrossRef]
  161. Z. Wang et al., “User-oriented AR assembly guideline: a new classification method of assembly instruction for user cognition,” International Journal of Advanced Manufacturing Technology, vol. 112, no. 1–2, pp. 41–59, Feb. 2021. [CrossRef]
  162. H. Atici-Ulusu, Y. D. Ikiz, O. Taskapilioglu, and T. Gunduz, “Effects of augmented reality glasses on the cognitive load of assembly operators in the automotive industry,” Int J Comput Integr Manuf, vol. 34, no. 5, pp. 487–499, 2021. [CrossRef]
  163. D. Gerhard, M. Neges, J. L. Siewert, and M. Wolf, “Towards Universal Industrial Augmented Reality: Implementing a Modular IAR System to Support Assembly Processes,” Multimodal Technologies and Interaction, vol. 7, no. 7, p. 65, Jun. 2023. [CrossRef]
  164. S. Raj, L. R. D. Murthy, T. A. Shanmugam, G. Kumar, A. Chakrabarti, and P. Biswas, “Augmented reality and deep learning based system for assisting assembly process,” Journal on Multimodal User Interfaces, vol. 18, no. 1, pp. 119–133, Mar. 2024. [CrossRef]
  165. P. Gustavsson, A. Syberfeldt, and M. Holm, “Virtual reality platform for design and evaluation of human-robot collaboration in assembly manufacturing,” International Journal of Manufacturing Research, vol. 18, no. 1, pp. 28–49, 2023. [CrossRef]
  166. P. Trebuna, M. Pekarcikova, R. Duda, and T. Svantner, “Virtual Reality in Discrete Event Simulation for Production–Assembly Processes,” Applied Sciences, vol. 13, no. 9, p. 5469, Apr. 2023. [CrossRef]
  167. J. Wolfartsberger, R. Zimmermann, G. Obermeier, and D. Niedermayr, “Analyzing the potential of virtual reality-supported training for industrial assembly tasks,” Comput Ind, vol. 147, p. 103838, May 2023. [CrossRef]
  168. A. Kokkas and G. C. Vosniakos, “An Augmented Reality approach to factory layout design embedding operation simulation,” International Journal on Interactive Design and Manufacturing, vol. 13, no. 3, pp. 1061–1071, Feb. 2019. [CrossRef]
  169. H. Álvarez, I. Lajas, A. Larrañaga, L. Amozarrain, and I. Barandiaran, “Augmented reality system to guide operators in the setup of die cutters,” The International Journal of Advanced Manufacturing Technology, vol. 103, no. 1, pp. 1543–1553, 2019. [CrossRef]
  170. D. D. L. Mascareñas et al., “Augmented reality for next generation infrastructure inspections,” Struct Health Monit, vol. 20, no. 4, pp. 1957–1979, Feb. 2021. [CrossRef]
  171. S. Meyer, “Augmented Reality in the Pharmaceutical Industry-A Case Study on HoloLens for Fully Automated Dissolution Guidance,” 2021.
  172. D. A. Zakoldaev, A. V. Gurjanov, A. V. Shukalov, and I. O. Zharinov, “Implementation of H2M technology and augmented reality for operation of cyber-physical production of the Industry 4.0,” J Phys Conf Ser, vol. 1353, no. 1, p. 12142, 2019. [CrossRef]
  173. S. Deshpande, S. Padalkar, and S. Anand, “IIoT based framework for data communication and prediction using augmented reality for legacy machine artifacts,” Manuf Lett, vol. 35, pp. 1043–1051, Aug. 2023. [CrossRef]
  174. C. Li, P. Zheng, Y. Yin, Y. M. Pang, and S. Huo, “An AR-assisted Deep Reinforcement Learning-based approach towards mutual-cognitive safe human-robot interaction,” Robot Comput Integr Manuf, vol. 80, p. 102471, Apr. 2023. [CrossRef]
  175. T. Schmitt, P. Viklund, M. Sjölander, L. Hanson, K. Amouzgar, and M. U. Moris, “Augmented reality for machine monitoring in industrial manufacturing: framework and application development,” Procedia CIRP, vol. 120, pp. 1327–1332, Jan. 2023. [CrossRef]
  176. P. C. Sorathiya, S. Anand Singh, and K. A. Desai, “Mobile-Based augmented reality (AR) module for guided operations of CNC surface roughness machine,” Manuf Lett, vol. 35, pp. 1255–1263, Aug. 2023. [CrossRef]
  177. L. Xia, J. Lu, Y. Lu, H. Zhang, Y. Fan, and Z. Zhang, “Augmented reality and indoor positioning based mobile production monitoring system to support workers with human-in-the-loop,” Robot Comput Integr Manuf, vol. 86, p. 102664, Apr. 2024. [CrossRef]
  178. R. Maio et al., “Pervasive Augmented Reality to support real-time data monitoring in industrial scenarios: Shop floor visualization evaluation and user study,” Comput Graph, vol. 118, pp. 11–22, Feb. 2024. [CrossRef]
  179. C. Liu, S. Cao, W. Tse, and X. Xu, “Augmented Reality-assisted Intelligent Window for Cyber-Physical Machine Tools,” J Manuf Syst, vol. 44, pp. 280–286, Jul. 2017. [CrossRef]
  180. T. G. Kukuni et al., “Augmented Reality In Smart Manufacturing: A User Experience Evaluation,” Webology, vol. 19, no. 3, pp. 2405–2423, 2022, Accessed: Oct. 11, 2022. [Online]. Available: http://www.webology.org/abstract.php?id=3024.
  181. C. X. E. Shamaine, Y. Qiao, V. Kuts, J. Henry, K. McNevin, and N. Murray, “Teleoperation of the Industrial Robot: Augmented reality application,” MMSys 2022 - Proceedings of the 13th ACM Multimedia Systems Conference, vol. 22, pp. 299–303, Jun. 2022. [CrossRef]
  182. S. K. Ong, A. W. W. Yew, N. K. Thanigaivel, and A. Y. C. Nee, “Augmented reality-assisted robot programming system for industrial applications,” Robot Comput Integr Manuf, vol. 61, p. 101820, Feb. 2020. [CrossRef]
  183. H. Arnarson, B. Solvang, and B. Shu, “The application of virtual reality in programming of a manufacturing cell,” 2021 IEEE/SICE International Symposium on System Integration, SII 2021, pp. 213–218, Jan. 2021. [CrossRef]
  184. M. Khatib, K. Al Khudir, and A. De Luca, “Human-robot contactless collaboration with mixed reality interface,” Robot Comput Integr Manuf, vol. 67, p. 102030, Feb. 2021. [CrossRef]
Figure 1. Nine Pillars of Industry 4.0.
Figure 2. Reality-Virtuality (RV) Continuum.
Figure 3. Research Methodology for the Literature Review Based on [2].
Figure 4. An example of a Maintenance Application using AR.
Figure 5. Example of AR use for Assembly Instructions.
Figure 6. AR use for Operations.
Figure 7. Applications of AR/VR in Manufacturing Product Design.
Figure 8. Categories of Quality Control in Manufacturing using AR/VR.
Figure 9. Example of Defect Detection using a Virtual Environment System.
Figure 10. Recurring Important AR/VR Features.
Figure 11. Emerging technologies behind closed-loop AR/VR systems for manufacturing applications.
Figure 12. The proposed generalized architecture of implementing AR/VR applications using edge devices.
Figure 13. Proposed architecture for AI-enabled AR/VR-based manufacturing applications.
Figure 14. Features of digital twin applications.
Figure 15. Levels of virtualization used in manufacturing for digital twin applications.
Figure 16. Proposed architecture for AR/VR-based digital twin applications for manufacturing.
Figure 17. Proposed generalized architecture for teleoperation applications in manufacturing using AR/VR.
Figure 18. Proposed generalized system architecture for human-robot collaboration using AR/VR.
Figure 19. Analysis of research papers on applications and emerging technologies by year.
Figure 20. Statistics of the usage of AR/VR devices by year.
Figure 21. Statistics of the usage of tracking technologies by year.
Figure 22. Challenges of applying detection algorithms with immersive technologies for manufacturing applications.
Table 1. Research questions and the objectives of the research questions.
RQ1: What are the current applications of AR/VR in manufacturing?
Objective: Identify and understand the current trends in the applications and implementations of Augmented Reality (AR) and Virtual Reality (VR) technologies across manufacturing sectors.
RQ2: What are the state-of-the-art technologies used in AR/VR applications in manufacturing?
Objective: Catalogue the latest hardware and software used in AR/VR technologies for manufacturing applications.
RQ3: What are the emerging technologies in the field of AR/VR applications in manufacturing?
Objective: Identify the emerging technologies driving innovation in AR/VR applications within manufacturing and understand the technological trends shaping modern manufacturing practices.
RQ4: What are the challenges for the adoption of AR/VR applications in manufacturing?
Objective: Address the challenges and difficulties of implementing AR/VR-based applications in the manufacturing industry.
Table 2. List of search strings.
Search Strings (all searched on Web of Science and Google Scholar)
1. ‘AR’ AND ‘Manufacturing’
2. ‘Augmented Reality’ AND ‘Manufacturing’
3. ‘Virtual Reality’ AND ‘Manufacturing’
4. ‘VR’ AND ‘Manufacturing’
5. ‘Augmented Reality’ AND ‘Industry 4.0’
6. ‘AR’ AND ‘Industry 4.0’
7. ‘Virtual Reality’ AND ‘Industry 4.0’
8. ‘VR’ AND ‘Industry 4.0’
9. ‘Augmented Reality’ AND ‘Factory’
10. ‘AR’ AND ‘Factory’
11. ‘Virtual Reality’ AND ‘Factory’
12. ‘VR’ AND ‘Factory’
Table 3. Statistics of the collected Articles.
Search platforms: Web of Science, Google Scholar
Papers retrieved: 297
Irrelevant papers: 50
Duplicate entries: 36
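The screening counts in Table 3 determine how many papers remained for full review. A minimal sketch of that arithmetic, under the assumption (not stated in the review) that the irrelevant papers and the duplicate entries are disjoint sets:

```python
# Screening arithmetic for the literature search summarized in Table 3.
# Assumption (not stated in the review): the irrelevant papers and the
# duplicate entries do not overlap, so both can be subtracted directly.
retrieved = 297   # papers returned by the search strings
irrelevant = 50   # papers excluded as out of scope
duplicates = 36   # entries appearing on both search platforms

remaining = retrieved - irrelevant - duplicates
print(remaining)  # papers left for full-text review under this assumption
```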
Table 4. Quality control applications in manufacturing using AR/VR.
Research Group Application Type of Quality Control
[23] Quality assessment of polished surface Metrological inspection
[74] PCBA Inspection In-line process monitoring
[75] Predictive Maintenance Lean operation
[62] Quality inspection In-line process monitoring
[22] Virtual defect detection In-line process monitoring
[76] Car panel alignment Metrological inspection
[77] Automatic aviation connectors inspection In-line process monitoring
[78], [79] Quality control of car body In-line process monitoring
[80] Non-destructive condition monitoring of IGBT wafer In-line process monitoring
[81] Identification of bottlenecks of a production line Lean operation
Table 5. Overview of AR/VR hardware and software used for manufacturing applications.
Application Type Research Group Mounting Type AR/VR Glass Type Tracking Type Software AR/VR
Maintenance [82] HMD HTC Vive Pro N/A Unity, Virtual Reality Toolkit, Azure Speech SDK VR
[75] HMD HoloLens Marker-based and feature-based Unity, Vuforia, Python 3.7 for fault prediction AR
[78] HMD HoloLens Feature-based Unity AR
[79] HHD Mobile Marker-based Unity, Vuforia AR
[8] HHD iPad Air Feature-based iOS, Metaio SDK 5.5 AR
[60] HHD Tablet Not Mentioned Unity, Vuforia AR
[80] Desktop/Laptop, HMD, HHD Goggles, laptop PC, mobile device, Vuzix star 1200xl Marker-based Unity, Vuforia AR
[24] Projector DLP projector Benq W1080ST+ Marker-based Unity, ARToolkit AR
[81] HHD Android Device Model Target Unity, Vuforia AR
[82] HHD Not Mentioned Not Mentioned Unity, Vuforia AR
[83] HMD Not Mentioned Head-gaze Unity, Vuforia AR
[84] HHD Samsung Tablet Model Target Unity, Vuforia AR
[85] Desktop/Laptop Not Mentioned N/A Unity, AvatarSDK, Virtual Reality Toolkit, Photon Unity Networking VR
[86] HHD Samsung Galaxy S10, iPhone 11 Feature-based YOLOv5, Roboflow, VoTT, PyTorch, FFmpeg AR
[87] Not mentioned Not Mentioned Not Mentioned FreeCAD AR
[88] HMD HoloLens Not Mentioned Not Mentioned AR
[61] HMD, Desktop/Laptop HMD, Desktop/Laptop Marker-based Unity, Vuforia AR
[88] HHD Tablet Model Target Tracking Unity AR
[89] HMD Headset Not Mentioned Unity, Vuforia AR
[90] HMD Headset N/A N/A AR
[91] HMD Headset Not Mentioned Unity AR
[92] Desktop/Laptop Desktop Not Mentioned Unity AR
[93] HMD Headset Feature Based Unity AR
Design [90] Projector IR-RGB Dual-Input Projector Not Mentioned Not Mentioned AR
[91] HMD HoloLens Not Mentioned Unity AR
[92] Projector three-walled immersive projection environment called METaL N/A Siemens PLM Software VR
[20] N/A N/A N/A Double Diamond design process model AR
[93] HMD, Desktop/Laptop, HHD HoloLens, PC, Android Tablet Not Mentioned Unity, Mixed Reality Toolkit, Vuforia AR
[19] HMD HoloLens Model Target Unity, Vuforia, PiXYZ Unity Plugin AR
[12] HMD HoloLens Feature-based Unity, Vuforia, Mixed Reality Toolkit AR
[94] HHD Android Mobile Marker-based ANSYS, Vuforia Android Application AR
[95] HMD HTC Vive VR Headset N/A Unity, PUN2 Library VR
[96] HMD Oculus Quest VR Headset N/A Unreal Engine VR
[97] HMD Oculus Rift S VR Headset N/A Flexsim VR
[101] HMD Not Mentioned Feature Based Vuforia AR/VR
[102] HMD HTC Vive N/A Not Mentioned VR
[103] HMD Not Mentioned N/A Unity VR
Quality Control [98] HMD Vuzix Wrap 920AR Marker-based OpenSceneGraph, ARToolkit AR
[58] HHD Tablet PC Marker-based Unity, Vuforia AR
[99] Not Mentioned Not Mentioned N/A Unity VR
[100] HHD Lenovo Phab 2 Pro Smartphone Marker-based Unity, Google Project Tango Development Kit AR
[23] HMD HoloLens Feature-based Unity, Mixed Reality Toolkit AR
[101] HMD HoloLens Marker-based Unity AR
[69] HMD HoloLens Feature-based Unity, Mixed Reality Toolkit AR
[102] HMD HP Reverb VR Pro N/A Unity VR
[57] HHD Samsung Galaxy Tab S4 Marker-based Unity, Google ARCore SDK AR
[103] HMD HoloLens Marker-based Unity, Mixed Reality Toolkit AR
[29] HMD, HHD HoloLens, Android Mobile Device, Moverio BT300 Marker-based Unity, ARCore, Google Tango, ARToolKit AR
[104] HHD Phone/Tablet Feature-based Mask R-CNN algorithm AR
[36] HMD, HHD HoloLens, Samsung S7 and Samsung Galaxy Tab Location-based Unity AR
[59] HMD HoloLens 2 Marker-based Unity, MobileNet-v2 AR
[73] HMD HoloLens 2 Feature-based Blender, Siemens NX AR
[71] HMD, HHD Not mentioned Marker-based Unity, Vuforia AR
[74] HMD HoloLens Marker-based Unity, Mixed Reality Toolkit AR
[72] Desktop/Laptop Not Mentioned Feature-based YoloV3 AR
Quality Control [75] Desktop/Laptop Desktop/Laptop Feature-based Not Mentioned AR
[76] HHD Apple Smartphones and Tablets Feature-based iOS, Apple ARKit AR
[57] HHD Samsung Galaxy Tab S4 Marker-based Unity, ARCore AR
[22] Desktop/Laptop PC N/A Unreal Engine 4 VR
[105] HHD Mobile Feature-based Unity, Microsoft Azure, ARKit AR
[112] HHD/HMD Not Mentioned Feature-based Unity, AR Raycast AR
Training [106] HMD HTC Vive N/A Unity, Steam VR AR
[15] Desktop/Laptop Personal Computer Marker-based ARToolkit, Solid edge for CAD Modeling, Optical Flow AR/VR
[68] HMD HoloLens Feature-based Windows 10 system, Visual Studios 2017 Community AR
[107] HHD Leap Motion Controller Marker-based Unity, Vuforia, NX AR/VR
[30] HHD Mobile Marker-based Unity, Vuforia, SolidWorks, 3DSMax AR
[98] HMD Vuzix Wrap 920AR Marker-based OpenSceneGraph, ARToolkit AR
[8] HHD iPad Air Feature-based iOS, Metaio SDK 5.5 AR
[60] HHD Tablet Not Mentioned Unity, Vuforia AR
[81] HHD Android Device Model Target Unity, Vuforia AR
[108] HMD Headset, Smart Glass Marker-based Unity, Vuforia AR/VR
[58] HHD Tablet PC Marker-based Unity, Vuforia AR
[82] HHD Not Mentioned Not Mentioned Unity, Vuforia AR
[109] HMD HoloLens Not Mentioned Not Mentioned AR
[110] HHD Mobile Model Target Tracking OpenCV, C++ AR
[111] HMD HTC Vive Not Mentioned Unity, OpenCV AR
[112] HMD HTC Vive N/A Unity, SteamVR VR
[30] HHD Mobile Marker-based Unity, Vuforia AR
[68] HMD HTC Vive N/A Unity, SteamVR VR
[113] HMD HoloLens Gaze marker-based Unity, Mixed Reality Toolkit AR
[114] Projector VPL-DX271 Projector Feature-based Not Mentioned AR
[115] Desktop/Laptop Monitor Feature-based Not Mentioned AR
[116] HHD Mobile Marker-based Vuforia AR
[117] HMD HoloLens Not Mentioned Unity AR
[118] HMD, HHD Android device, HoloLens Marker-based, Feature-based Unity, Vuforia, Mixed Reality ToolKit AR
[119] HMD HoloLens Not Mentioned Unity, Mixed Reality Toolkit AR
[120] HMD ACER Windows Mixed Reality HMD N/A Unity VR
[121] HMD HTC Vive Pro N/A Unity, SteamVR VR
[122] HMD HoloLens 2 Model Target Tracking Unity, Vuforia version 8.3.8 object tracking AR
[123] HMD HoloLens 2 Marker-based Unity, Mixed Reality Toolkit AR
[124] HMD HTC Vive VR headset N/A Unity VR
[56] HMD HoloLens 2 Marker-based Unity AR/VR
[125] HHD iPhone XS Marker-based ARKit AR
[126] HMD, Desktop/Laptop Samsung Odyssey+ VR Headset N/A Unity VR
[127] HMD HTC Vive VR Headset N/A Unity, Tobii eye-tracking technology VR
[128] HMD Techlens T2 Feature-based Unity, EasyAR, OpenCV AR
[41] HMD HoloLens Feature-based Unity, Vuforia AR
[141] HMD HoloLens 2 Feature-Based Unity AR
[137] HHD Samsung Galaxy Tab 3 Marker-based Unity, Vuforia AR
[138] HMD HTC Vive N/A Unity VR
[139] HMD HTC Vive Cosmos N/A Unity VR
[140] HMD Not Mentioned N/A Not Mentioned VR
Assembly [129] HHD Android Smartwatch Marker-based Unity, Vuforia AR
[130] HHD LCD Screen Marker-based ARToolkitplus AR
[131] HMD HoloLens Not Mentioned Not Mentioned AR
[24] Projector DLP projector Benq W1080ST+ Marker-based Unity, ARToolkit AR
Assembly [135] HMD, HHD HoloLens, Tablet Marker-based Unity, Mixed Reality Toolkit AR
[15] Desktop/Laptop Personal Computer Marker-based ARToolkit AR
[136] Desktop/Laptop Desktop PC Feature-based Unity AR
[151] HMD, HHD Mobile, Head Mounted Display Marker-based Unity, Vuforia AR
[138] Not Mentioned Not Mentioned Feature-based Not Mentioned AR
[139] Desktop/Laptop Personal Computer Marker-based Unity, Vuforia AR/VR
[140] HHD Mobile Marker-based Unity, ARToolKit AR
[141] HMD Not Mentioned Marker-based Unity AR/VR
[142] HMD HoloLens Not Mentioned Unity, Holotoolkit AR
[143] Desktop/Laptop Laptop Marker-based Unity, Vuforia AR
[43] Desktop/Laptop Monitor Marker-based Unity AR
[144] HHD Mobile Device Model Target Tracking Unity, Vuforia AR
[145] HMD HoloLens Marker-based Unity, Vuforia, Mixed Reality Toolkit AR
[146] Projector VPL-DX271 projector Feature-based Not Mentioned AR
[147] Projector VPL-DX271 projector Feature-based Not Mentioned AR
[148] HMD Sony Smart Eyeglass Sed-E1 Marker-based Not Mentioned AR
[158] Projector Not Mentioned N/A Unity AR
[159] HMD HoloLens 2 Feature-based Unity, Vuforia AR
[90] HHD Not Mentioned Marker-based Unity, Vuforia AR
[160] Desktop/Laptop Emulator N/A Unity, ViCor VR
[161] HMD Oculus Rift N/A Not Mentioned VR
[162] HMD HTC Vive N/A Unity VR
Operation [100] HHD Lenovo Phab 2 Pro smartphone Marker-based Google Project Tango Development Kit AR
[149] HHD iPhone 7 Marker-based Unity, Vuforia, ARKit AR
[150] Projector Not Mentioned Feature-based Not Mentioned AR
[151] HMD HoloLens Marker-based Unity AR
[152] HMD HoloLens Marker-based Unity, Vuforia AR
[153] HHD Tablet Not Mentioned Not Mentioned AR
[168] HHD iPhone Marker-based Not Mentioned AR
[169] HMD HoloLens 2 Feature-based Unity, Vuforia AR
[170] HMD HoloLens 2 Not Mentioned Unity AR
[171] HHD Not Mentioned Marker-based Unity, Vuforia AR
[172] Not Mentioned Not Mentioned Feature-based Not Mentioned AR
[173] HMD HoloLens 2 Not Mentioned Not Mentioned AR
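Most marker-based systems in Table 5 share the same registration step: estimate the marker pose, express the virtual content in camera coordinates, and project it onto the image. A minimal sketch of that projection using the standard pinhole camera model (the intrinsics `fx`, `fy`, `cx`, `cy` and the example point are illustrative values, not taken from any cited system):

```python
def project_point(point_cam, fx, fy, cx, cy):
    """Project a 3D point given in camera coordinates onto the image
    plane with the pinhole model: u = fx*X/Z + cx, v = fy*Y/Z + cy."""
    x, y, z = point_cam
    if z <= 0:
        raise ValueError("point is behind the camera")
    return (fx * x / z + cx, fy * y / z + cy)

# Hypothetical example: a virtual label anchored 10 cm above a detected
# marker, already transformed into camera coordinates (0.5 m in front).
u, v = project_point((0.0, -0.10, 0.50), fx=800, fy=800, cx=320, cy=240)
```

Toolkits such as Vuforia and ARToolKit perform this projection internally once a marker or model target is tracked; the sketch only illustrates the geometry behind the “Tracking Type” column.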
Table 7. AR/VR-assisted AI-based manufacturing applications.
Research Group Application AI method AI Results
[41] Task assistance Mask R-CNN Significantly outperforms the AR marker-based approach
[43] Work instruction Tuned R-CNN Achieved a mean Average Precision of 84.7%
[86] Task assistance YOLOv5 High precision and real-time performance, achieving prediction times of approximately 0.007 seconds
[104] Machine inspection Mask R-CNN Achieved 70% marker detection accuracy, with 100% accuracy for untrained machines
[45] Markerless pose estimation CNN (VoteNet) Not mentioned (accuracy depends on sensor systems and data quality)
[63] Quick development toolkit for AR YOLOv5 Retains only objects with more than 70% detection confidence
[78] Maintenance application PSO-CNN Validation accuracy reaches 97.63%
[56] Human-robot collaboration Mask R-CNN Not mentioned
[55] Human-robot collaboration RL Success rate of reaching tasks is approximately 98.7%
[75] Predictive maintenance CNN-LSTM Mean Absolute Error (MAE) of 0.4257%, Root Mean Square Error (RMSE) of 0.4505%, and a running time of 293.3658 seconds
[64] Pose estimation MobileNetv2 Not mentioned
[116] Process monitoring One-Class SVM Confidence score between 0 and 1 for each data point
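Two of the quantitative behaviors in Table 7 are easy to make concrete: the toolkit of [63] keeps only detections above a 70% confidence threshold, and [75] reports its CNN-LSTM model with MAE and RMSE. A sketch of both in plain Python (the detection dictionaries and sample values are illustrative, not data from the cited papers):

```python
import math

def filter_detections(detections, threshold=0.70):
    """Keep only detections whose confidence exceeds the threshold,
    mirroring the >70% detection-confidence rule described for [63]."""
    return [d for d in detections if d["conf"] > threshold]

def mae_rmse(y_true, y_pred):
    """Mean Absolute Error and Root Mean Square Error, the two metrics
    reported for the CNN-LSTM predictive-maintenance model in [75]."""
    errors = [t - p for t, p in zip(y_true, y_pred)]
    mae = sum(abs(e) for e in errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return mae, rmse

# Illustrative inputs only.
detections = [{"label": "bolt", "conf": 0.91}, {"label": "nut", "conf": 0.55}]
kept = filter_detections(detections)  # only the 0.91 detection survives
mae, rmse = mae_rmse([1.0, 2.0, 3.0], [1.1, 1.9, 3.2])
```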
Table 9. AR/VR-assisted teleoperation and remote collaboration for manufacturing applications.
Research Group Application Research Focus
[59] Human-robot collaboration Haptic constraints in a point cloud augmented virtual reality environment
[58] Work instruction Visual annotation for image and video-based remote collaboration
[113] Quality control Remote metrological inspection of the manufacturing process using haptic feedback
[56] Human-robot collaboration Improvement of the performance in manipulation and grasping tasks
[103] Product design Feasibility study of developing VR system to reduce environmental impact
[60] Human-robot collaboration Collaborative manufacturing using digital twin, RL algorithm, and AR
[82] Maintenance application Asynchronous communication for maintenance method development and documentation creation
[104] Human-robot collaboration Virtual commissioning of production line
[181] Human-robot collaboration Development of remote collaboration platform
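A common safeguard in teleoperation loops like those above is to bound how far the remote robot may move per control tick, so that network latency spikes in the operator's input cannot produce unsafe jumps. A hypothetical sketch of such a clamp (not taken from any cited system):

```python
def clamp_step(current, target, max_step):
    """Move from `current` toward `target`, but never farther than
    `max_step` (Euclidean distance) in a single control tick."""
    delta = [t - c for c, t in zip(current, target)]
    dist = sum(d * d for d in delta) ** 0.5
    if dist <= max_step:
        return list(target)
    scale = max_step / dist
    return [c + d * scale for c, d in zip(current, delta)]

# Operator commands a 1 m jump; the robot advances only 0.1 m this tick.
next_pose = clamp_step([0.0, 0.0, 0.0], [1.0, 0.0, 0.0], max_step=0.1)
```

Running the clamp once per tick turns an arbitrary operator trajectory into a rate-limited one on the robot side, independent of how bursty the incoming command stream is.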
Table 10. AR/VR-assisted human-robot collaboration for manufacturing applications.
Research Group Technology Research Focus
[53] VR Human-robot simulation with virtual chatbot
[54] AR Closed loop HRC with a compensation mechanism
[41] AR Deep learning-based task assistance
[113] AR/VR Teleoperation application using AR, VR, and haptic feedback
[184] VR Multisensory constrained and contactless coordinated motion task
[45] AR Distributed automation architecture and markerless pose estimation for robot
[135] AR Robot programming, safety, and task assistance
[61] AR Deep learning and digital twin-based safety aware system for human-robot collaboration
[60] AR RL and digital twin-based collaborative manufacturing application
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.
Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.