Preprint
Article

This version is not peer-reviewed.

A Serious Game for Upper Limb Rehabilitation Implementing a Custom Vibrotactile Wireless Wearable Device and Leap Motion

Submitted: 18 February 2026
Posted: 26 February 2026


Abstract
Over the past decade, Serious Games (SG) and Immersive Virtual Reality (IVR) have gained increasing interest in rehabilitation. However, the Head-Mounted Displays (HMDs) used in IVR introduce limitations such as nausea, eye fatigue, and accessibility constraints. As an alternative, low-immersion games using standard monitors can be employed, though they sacrifice spatial correspondence during user interaction with virtual objects. To address this limitation, this paper presents the development of a SG for upper limb (UL) rehabilitation that incorporates a custom wireless wearable device with vibrotactile haptic feedback to restore spatial correspondence. Combined with hand tracking based on the Leap Motion Controller (LMC), the system enables natural movement interaction in a closed kinematic chain, offering a viable compromise between immersion and usability. Additionally, three virtual scenarios were developed to train pronation/supination, pinch grip, ulnar/radial deviation, and wrist, elbow, and phalange flexion/extension. User experience (short AttrakDiff), workload (NASA-RTLX), usability (SUS scale), and functionality were evaluated in healthy participants divided into two groups: Group 1 (n=13) used only the LMC, while Group 2 (n=9) used the LMC and the wearable device. The results showed that the system was perceived as more functional in Group 2; in addition, an increase in usability (from 74.71 to 80.83) and improvements in feedback, movement precision, and response speed were observed in this group. These findings indicate that the wearable device significantly improves spatial correspondence during interaction, making the system a promising option for motor rehabilitation in desktop VR environments.

1. Introduction

The UL plays a fundamental role in human autonomy, as its joints enable the execution of essential movements required for daily activities. Due to its continuous use, the UL is highly susceptible to injuries and pathologies that may compromise motor function, significantly impacting an individual’s independence. Motor disability of the UL primarily arises from musculoskeletal disorders (such as fractures, carpal tunnel syndrome, arthritis, and epicondylitis) and neurological conditions (including stroke, cerebral palsy, spinal cord injury, multiple sclerosis, and Parkinson’s disease) [1,2].
According to the World Health Organization’s Rehabilitation Need Estimator, as of 2021, approximately 440 million prevalent cases of fractures were reported globally, with nearly 38 million occurring in the Americas. In the case of cerebral palsy, an estimated 63 million individuals were affected worldwide, including 3.7 million in the Americas. Stroke exhibited a global prevalence of 51 million cases, with 1.9 million in the same region. Regarding osteoarthritis, 370 million cases were documented globally, of which 31 million were located in the Americas [3].
Neurodegenerative diseases also represent a significant healthcare burden. Parkinson’s disease affects approximately 5.4 million individuals globally, including 300,000 in the Americas. Similarly, multiple sclerosis accounts for 1.5 million cases worldwide, with 82,000 cases reported in the Americas [3]. Although the WHO Estimator does not provide specific details regarding impairments of the upper limb, several studies have reported that fractures in this region are common across different demographic groups, with the radius and ulna being the most frequently affected bones [4,5,6,7]. Stroke, in turn, is one of the leading causes of functional impairment of the upper limb, with approximately 35% of patients experiencing weakness during the acute phase and 22% presenting with mild to moderate weakness [8]. In the case of pediatric cerebral palsy, 83% of cases involve upper limb impairment, and 69% of those present limitations in manual control [9].
Given the impact of these conditions on the upper limb, rehabilitation plays a crucial role in restoring patients’ independence and quality of life [10]. Therapeutic interventions focus on recovering movement, coordination, and strength through guided repetitive exercises during physical therapy sessions [11]. However, conventional rehabilitation faces challenges such as the lack of patient engagement and motivation due to the monotonous nature of therapy sessions [12], highlighting the need for appropriate strategies that promote active participation and foster patient interest in the rehabilitation process.
In the medical field, Virtual Reality (VR) is used to reduce costs and enhance quality across various applications, including diagnosis [13], education [14,15,16,17,18], rehabilitation [19,20], and telemedicine [21]. Among these, rehabilitation has emerged as one of the most prominent areas of application, second only to education. Within the context of rehabilitation, VR has proven to be a valuable tool in motor rehabilitation, offering interactive environments that promote the repetitive execution of therapeutic exercises in a more engaging and motivating manner. In particular, the integration of SG into VR-based rehabilitation has proven especially effective. By incorporating gamification elements that stimulate interest and commitment, SG not only enhances the overall user experience but also improves treatment adherence. For instance, their implementation has been associated with improved UL function and increased patient participation [22].
On the other hand, the design of SG aimed at UL rehabilitation must consider the specific movements and motor skills targeted during therapy. In this regard, Kai-Lun Liao et al. [23] developed three mini-games, each focused on a particular type of UL movement. The purpose of this system is to encourage patient engagement through interactive tasks that promote treatment adherence. Moreover, the authors implemented a user-centered design approach, which is essential to identify and address the specific requirements of the target population. This strategy enables the customization of virtual environment (VE) content, the adaptation of game mechanics, and the delivery of meaningful feedback, thereby meeting therapeutic needs without compromising the quality of the user experience [24,25].
In addition, natural interaction devices serve as key tools for integrating VR and SG into upper limb rehabilitation. Natural User Interfaces enable intuitive interpretation of human body movements through depth sensors and specialized cameras, as found in devices such as the Kinect and LMC [26]. These technologies capture the user’s natural motion, facilitating smoother interaction with VE specifically designed for therapeutic rehabilitation.
The integration of the LMC into rehabilitation systems has shown multiple benefits, particularly in improving the accuracy of upper limb gesture acquisition and enhancing user immersion during motor tasks [27,28]. For example, Ángela Aguilera-Rubio et al. [27] developed a virtual reality intervention protocol based on activities of daily living, supplemented with conventional rehabilitation exercises. In this context, the LMC was employed as the primary interaction device to promote motor function recovery in patients with post-stroke sequelae. The results of these studies underscore the feasibility and therapeutic potential of the LMC as a tool for VR-based rehabilitation.
In recent years, multiple SG have been specifically designed for upper limb rehabilitation using the LMC. These developments aim to improve motor skills through the controlled execution of targeted movements, offering an engaging and complementary alternative to traditional physical therapy. An example is the study by Cuesta-Gómez et al. [29], in which a game composed of six interactive scenarios was developed to enhance coordination and dexterity in patients with multiple sclerosis. These scenarios replicate conventional exercises, such as wrist flexion and extension, within an immersive VE that fosters patient engagement.
Similarly, Cabrera Hidalgo et al. [30] developed a video game using the Unity graphics engine, focused on enhancing hand–eye coordination and fine motor skills in children. The high precision of the LMC sensor enabled detailed tracking of phalangeal movements, as well as palm and wrist positioning, thus providing enriched motor feedback. Another work is the RehabHand system, proposed by A. de los Reyes-Guzmán et al. [28], which comprises seven VR applications designed to stimulate manipulative skills in patients with spinal cord injuries. These applications enable the execution of motor tasks through precise, real-time tracking of hand movements captured by the LMC.
On the other hand, adopting a playful-musical approach, Shah et al. [31] developed a VE inspired by the video game Guitar Hero, aimed at improving fine motor skills of the upper limb. The game engages patients through music playback, during which musical notes appear and must be reached using specific gestures. The system recognizes movements such as wrist extension and hand opening, using the initial position of the fist as a reference point. Additionally, it incorporates rhythmic auditory feedback to reinforce the association between stimuli and motor responses. Prior to each session, an individualized calibration process is conducted to adapt the system to the patient’s range of motion, thereby ensuring a personalized and therapeutically effective experience.
For their part, Mukhopadhyay et al. [32] proposed a series of interactive games combining the use of Kinect and Leap Motion Controller to enhance upper limb movement coordination and precision. These games aim to transform repetitive exercises into engaging activities, reducing monotony and promoting treatment adherence. Finally, Maldonado et al. [33] developed a serious game using Oculus Rift and Leap Motion Controller, designed for the rehabilitation of children with cerebral palsy. The game consists of a VE in which patients must identify and hit objects by launching virtual balls from their hands. During each session, the system records detailed performance metrics, including the number of throws, movement accuracy, and total activity time. These data allow therapists to individually monitor patient progress and dynamically adjust therapeutic strategies.
These studies highlight the effectiveness of the LMC as a natural interaction device in the design of serious games for upper limb rehabilitation. Its ability to accurately capture hand movements, combined with its integration into virtual environments, enables the development of innovative, personalized, and engaging therapeutic experiences.
Although some of the aforementioned developments incorporate immersive VR, most opt for desktop VR systems, where interaction primarily occurs through a monitor. This choice aims to mitigate side effects commonly associated with immersive VR, such as fatigue, nausea, and dizziness, which may arise during or after use and, in some cases, persist for days or even weeks, potentially hindering the adoption of this technology [34,35].
However, desktop VR lacks the ability to provide peripheral imagery, as visualization is restricted to the screen width. As a result, this limitation can affect spatial correspondence, i.e., the relationship between the user’s physical movements and their representation within the VE. This correspondence is essential to ensure that real-world actions are accurately reflected in the virtual world, enabling more natural and effective interaction.
It has been demonstrated that haptic feedback enhances the sense of immersion in virtual environments. This type of feedback is generally classified into two main categories: kinesthetic and tactile. Kinesthetic feedback provides information regarding the user’s body position and movement, with force feedback being one of its most common applications [36].
Tactile feedback, on the other hand, can be divided into three subcategories: electrotactile, thermal, and vibrotactile. Vibrotactile feedback employs vibratory actuators or motors in contact with the user’s skin to deliver sensory information. Since electrotactile and thermal feedback can potentially cause irritation or discomfort, vibrotactile feedback is the most widely used in VR applications [37].
This work presents the development of an upper limb rehabilitation system that integrates a virtual reality tool based on SG, an optical hand-tracking device LMC, and a custom wearable device with vibrotactile feedback. This integration enables users to perform closed kinematic chain movements, which contributes to improving spatial correspondence within the VE. Moreover, the environment was designed following User-Centered Design principles, allowing for the creation of a first functional prototype of the system, including both the VE and the vibrotactile device. Subsequently, a preliminary evaluation was conducted with healthy users, focusing on functionality, usability, and mental workload.
It is worth mentioning that, unlike previous developments lacking haptic feedback, such as those proposed by Aguilera-Rubio et al. [27], De los Reyes-Guzmán et al. [28], Shah et al. [31], and Mukhopadhyay et al. [32], this work implements vibrotactile haptic feedback with the aim of enhancing user interaction with the VE by enabling the execution of closed kinematic chain movements and improving the spatial correspondence between real and virtual actions.
This article is organized as follows: Section 2 presents the materials and methods used for the development of the VR system, including the collection of user requirements, the design of virtual scenarios, and the development and integration of the custom wearable device. Section 3 describes the system evaluation protocol with preliminary experiments. Section 4 reports the results related to user experience, system usability, and mental workload. Section 5 discusses the findings and conclusions of the study.

2. Materials and Methods

The system integrates a VE designed with gamification principles to promote UL rehabilitation through engaging tasks. It incorporates a wearable device that delivers vibrotactile feedback and uses the LMC for optical hand tracking, enhancing user interaction and spatial correspondence within the VE. The work was carried out in four stages: i) identification of UL rehabilitation movements, for which key motions were gathered through questionnaires administered to physical therapists at the State Center for Rehabilitation and Special Education (CEREE) in Toluca, Mexico; ii) design and development of the virtual scenarios and the wearable vibrotactile feedback device; iii) integration of the custom wearable device into the virtual environment; and iv) evaluation of the user experience with the system.

2.1. User Requirements

To define the exercises implemented in the VE, a survey was conducted with 20 physiotherapists from the CEREE. The survey was divided into two sections: the first gathered information on key aspects of conventional rehabilitation, including the most common conditions treated, session frequency and duration, and the rehabilitation techniques used. The second section focused on the use and perception of VR tools in clinical practice. The results indicated that the most frequently treated patients had fractures or sequelae of cerebrovascular disease affecting the UL. Other reported diagnoses included burns, multiple sclerosis, and carpal tunnel syndrome. Less frequently, cases of rheumatoid arthritis, brachial plexus injury, radiculopathies, rotator cuff tendinopathy, Guillain-Barré syndrome, and painful shoulder were also noted. Based on this information, the exercises for the virtual scenarios were designed.
In addition, the specialists mentioned that the duration and frequency of the sessions depend on the patient and their diagnosis. Generally, a conventional session lasts between 45 and 60 minutes, with a frequency of 2 to 3 sessions per week. In virtual reality therapy, sessions are reduced to 30 minutes. Initially, 10 sessions are scheduled, and the number is adjusted according to the patient’s progress. Breaks, lasting between 5 and 15 seconds, are set based on the patient’s pain tolerance and fatigue.

2.2. Virtual Environment Design

During the development of the rehabilitation system, various tools and devices were used. The Unreal Engine (UE) 4.27 game engine was employed for designing the virtual environments and programming the task mechanics. This platform allows block-based programming through the use of Blueprints and, since its source code is publicly available, it has extensive documentation and an active community that facilitate problem-solving and access to shared tools such as plugins. Moreover, the “Ultraleap Hand Tracking Plugin” was used to link the LMC, the system’s main input device for user interaction, with the virtual environment.
A first-generation LMC was used as the interaction interface with UE. According to the manufacturer’s specifications, when placed face-up on a desk, its tracking area has an inverted-pyramid shape with a range of up to 60 cm [38]. However, this position limits tracking effectiveness, as it forces the user to keep the arm raised, causing fatigue when seated at the desk. For this reason, the device orientation was inverted, mounting it face-down on an elevated base that allows a greater tracking range and expands the limits of the interaction area. Empirically, it was determined that the actual range, without significant loss in the tracking of relevant hand points, is approximately 80 cm. It is worth noting that its precision is 0.2 mm during static tracking and decreases to 1 mm during dynamic tracking [39].
Figure 1 shows the general diagram of the proposed system’s operation. The user must place their hand below the LMC device to allow tracking of the performed movements, which are displayed on the computer screen running the VE that provides visual and auditory feedback, and based on the user’s actions (such as collisions with objects or error detection), it also delivers vibrotactile feedback.
It should be mentioned that the design of the VE included a user interface and three virtual scenarios, each with three levels of difficulty and a specific task mechanic.

2.2.1. Graphical User Interface (GUI)

According to the user-centered design approach, the VE is intended for two types of users: physical therapists and patients requiring upper limb rehabilitation.
The GUI follows the workflow described below:
1. Login: The physical therapist enters their username and password.
2. Patient registration: A new patient is registered (name, age, sex, diagnosis), or an existing patient is selected. Clinical observations can also be added.
3. Settings: Parameters such as game mode, sound, and the selection of scenarios and difficulty levels are configured.
4. Game mode selection:
   a) Both hands: The game starts with the right hand and switches to the left hand halfway through the level.
   b) Single hand: The UL to be used for the exercises is selected.
5. Scenario and level selection: The user can choose from four gameplay options:
   a) By game: A single scenario with its three difficulty levels.
   b) By level: All three scenarios at a single difficulty level.
   c) Game and level: A specific scenario and desired difficulty level.
   d) Play all: All scenarios and all levels.
6. Tutorial and virtual scenario execution: Before starting the game, users watch five short video tutorials. The first is a general overview shown before any scenario begins. The remaining four explain how to perform the exercises for each scenario. Bubbles includes two tutorials: one for the easy level (12 seconds) and another for the intermediate and advanced levels (13 seconds). The Maze tutorial lasts 24 seconds, and the Fish tutorial lasts 41 seconds. After each tutorial, the corresponding scenario starts based on the selected settings.
7. End of session: A summary is displayed showing the patient’s name, ID, number of correct actions, errors, and session duration.

2.2.2. Virtual Scenarios

Based on the survey, three virtual scenarios were defined, each with three levels of difficulty, designed to fulfill the movement requirements of conventional rehabilitation for UL:
1. Bubbles: Set in a flower field, bubbles of varying sizes and colors appear. In the easy level, bubbles float horizontally; in the intermediate and advanced levels, they fall vertically like raindrops. Each level involves different tasks:
   a) Easy: Users must pop bubbles (large or small) as instructed, using a thumb-to-index pinch gesture. Successfully popping bubbles increases the score. Movements include metacarpophalangeal flexion and slight interphalangeal flexion, which are commonly employed in UL rehabilitation exercises.
   b) Intermediate and Advanced: Users collect bubbles and sort them by color into containers. The forearm must be supinated to catch a bubble and pronated to release it. Movements include forearm pronation/supination. Difficulty increases with the number of bubbles (two in intermediate, four in advanced) and the speed of appearance.
2. Maze: The scene features a 3D maze on the far wall of a room. Users guide a virtual hand through a sequence of spheres representing the solution path to reach a diamond. Once the diamond is obtained, a new level is automatically generated with a different trajectory, i.e., longer and more complex maze paths with varying shapes and sizes. Movements involve elbow and shoulder flexion, and metacarpophalangeal flexion-extension.
3. Fish: Set in a marine environment, users control an orange fish using wrist and elbow movements. The goal is to collide with point-gaining objects (bubbles, starfish) while avoiding harmful ones (rocks, sharks). Sensitivity settings adjust the fish’s response speed, allowing customization to the user’s range of motion. As the levels progress, the number, variety, and approach speed of incoming objects increase. Table 1 shows the objects present in each level of the Fish scenario.
Table 2 illustrates the movement of the pawn represented by the fish (a manipulation metaphor) based on the user’s executed motions. For ulnar and radial deviation of the wrist, the direction of the pawn’s movement changes depending on whether the left or right UL is used. For the remaining movements (wrist and elbow flexion/extension), the resulting displacement is the same with either hand. Initially, the user must place the wrist in a neutral position, with the forearm in pronation, the elbow semiflexed, and the arm positioned close to the side of the body.
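The hand-dependent mapping described above can be sketched as a small lookup: lateral (deviation) movements mirror between limbs, while flexion/extension movements do not. The concrete direction assignments below are illustrative assumptions only, since the actual mapping used by the system is the one defined in Table 2.

```python
# Illustrative sketch of the wrist/elbow-to-pawn mapping described above.
# The specific direction names and left/right assignments are assumptions;
# the authoritative mapping is given in Table 2.

def pawn_direction(movement: str, hand: str) -> str:
    """Map a user movement and the hand in use to a pawn displacement."""
    # Flexion/extension displacements are hand-independent.
    symmetric = {
        "wrist_flexion": "down",
        "wrist_extension": "up",
        "elbow_flexion": "backward",
        "elbow_extension": "forward",
    }
    if movement in symmetric:
        return symmetric[movement]
    # Lateral displacements mirror between the left and right upper limb
    # (assumed here: radial deviation steers toward the thumb side).
    lateral = {
        ("radial_deviation", "right"): "left",
        ("ulnar_deviation", "right"): "right",
        ("radial_deviation", "left"): "right",
        ("ulnar_deviation", "left"): "left",
    }
    return lateral[(movement, hand)]
```

This structure makes the mirroring rule explicit: only the deviation entries consult the hand in use.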
Table 3 shows the final visualization of the three designed scenarios. In the screenshot of each scenario, the top section displays the current level, accumulated score, and elapsed time. Additionally, an instruction box appears in the bottom-left corner to guide users during the interaction. In the Bubbles and Maze scenarios, a virtual hand is used as the manipulation metaphor, while in Fish, the metaphor is represented by an orange fish.

2.3. Wearable Device Design

A custom wearable device was designed in the form of a bracelet, integrated with a 3D printed enclosure housing the electronic components, including an embedded system based on Arduino NANO, a Bluetooth module, a charging module, and a battery. The bracelet (secured using two adjustable straps) features two vibration motors that deliver vibrotactile feedback upon receiving a PWM signal from the Arduino NANO. This feedback enables the user to perform closed kinematic chain movements, enhancing proprioception by aiding in the localization of virtual objects. Figure 2 shows the wearable device worn on a participant’s arm.
The Arduino NANO board is powered by 5 V supplied from an MT3608 step-up regulator, that is connected to a 3.7 V lithium battery. In addition, a TP4056 charging module is included to recharge the battery.
As shown in Figure 3, communication between the wearable device and the VE in UE is achieved through the Universal Asynchronous Receiver-Transmitter (UART) serial communication protocol. Two HM-10 Bluetooth modules configured in a master-slave setup are used, operating at a message rate of 125 Hz (9600 baud) with an 8-byte transmission message. The master Bluetooth module is connected to the Arduino NANO, while the slave module interfaces with the PC via a serial-to-USB adapter.
It is important to highlight that, to enable interaction between the wearable bracelet and the VE, the following components were implemented in UE: (1) a “Serial COM” plugin specifically designed for Arduino and (2) a Blueprint for collision detection. These two components allow UE to transmit data to the Arduino NANO upon detecting collisions with objects in each scenario (bubbles, spheres, rocks, starfish, sharks, diamonds, walls, or workspace boundaries). Upon receiving this input, the embedded system sends an 8-bit PWM signal to one or both vibrotactile motors. In this configuration, UE sends a digital message (e.g., 140 activates one motor; more than 150 activates both motors; in all cases, the duty cycle can be adjusted within UE) to trigger a specific PWM signal on the Arduino NANO. As a result, the user perceives different vibration intensities and patterns depending on the object involved in the collision. Table 4 correlates scenarios and actions that trigger vibrotactile feedback.
Figure 4 shows the final integration of the components comprising the developed virtual reality rehabilitation system. The system was implemented on a Dell computer equipped with an Intel Core i7 processor, 16 GB of RAM, and an NVIDIA GeForce GTX 650 Ti graphics card.

3. Preliminary Experiments

To evaluate the system’s performance, 26 healthy participants aged between 18 and 45 years were recruited and divided into two groups. In Group 1, 15 participants used the system without the wearable device; only 13 assessments were considered valid after two were discarded due to factors that affected performance during testing, such as technical errors. Similarly, in Group 2, where the complete system (Leap Motion Controller with the wearable device) was used, two of the 11 assessments were likewise excluded, leaving nine valid evaluations.

3.1. Evaluation Protocol Description

Since this project required evaluating the user experience of healthy participants, and this stage of the project was classified as minimal-risk research, the principles established in the Declaration of Helsinki and the Regulations of the General Health Law on Health Research (Title Two: Ethical Aspects of Research in Human Subjects, Chapter I, Article 17) were followed. Moreover, the authors submitted the protocol, which was approved by the Research Ethics Committee of the Faculty of Medicine, Autonomous University of the State of Mexico, under registration number 008.2025 (see Appendix A).
It is worth mentioning that participants with a history of upper limb injuries were excluded. The entire session, from the initial explanation to the completion of the final part of the questionnaire, lasted approximately 40 minutes. The session was divided into two parts: one using the dominant hand and the other using the non-dominant hand. In the first part, which lasted about 19 minutes, the participant played all levels of each scenario. In the second part, which lasted approximately 2 minutes, the participant played only the final level of the Fish scenario.
Prior to participation, all individuals were first informed about the duration of the test, the study’s objectives, and the content of each session. They received a comprehensive informed consent form detailing the objectives of the research project, the interaction process with the VE, the testing procedure, and the confidentiality of data, which would be used exclusively for research purposes. If they agreed to participate, they signed the form.
To ensure clear understanding, an explanatory video was provided illustrating the system’s functionality and demonstrating user interaction with the VE, including a general overview of the scenarios, levels, exercises, and mechanics involved.
Afterwards, a questionnaire consisting of three sections was designed to collect participant feedback. Prior to starting the first experimental session, participants were asked to complete the first section of the questionnaire. The remaining sections were administered during the experiment, as described below. The three sections of the questionnaire are described as follows:
  • The first section (given before starting the test) collects demographic information from the participant, including their level of computer proficiency, experience with video games and VR, as well as prior use of optical motion trackers. This data enables an analysis of whether these variables are associated with user performance in the VE.
  • The second section (given after each scenario during the test) consists of a table with six questions to assess the mental workload of each scenario using the NASA Task Load Index (NASA TLX). The dimensions evaluated are: mental demand, physical demand, temporal demand, performance, effort, and frustration level. The first three relate to the task’s imposed workload, while the latter three pertain to the user’s interaction with the task. Each dimension is rated on a scale from 1 to 100, where 1 indicates a very low level and 100 a very high level. The Raw NASA TLX version [40] was used, as it allows for comparison of individual dimensions without assigning weights, thus avoiding the loss of relevant information. Additionally, as supported in [41], this version facilitates simpler application and contributes to more accurate responses.
  • The final section (administered at the end of the test) explores the user’s experience with the system. It comprises four subsections: the first assesses the user’s interest in the system as well as its aesthetic and pragmatic qualities through the short version of the AttrakDiff questionnaire, consisting of 10 pairs of opposing adjectives selected according to the user’s perception.
    The second subsection includes 10 questions based on the SUS scale [42] to assess system usability. Additionally, the authors included questions and statements (Table 5) to evaluate aspects such as physical discomfort, enjoyment of the scenarios, and the information provided by the VE.
    The remaining two subsections aim to identify specific errors encountered during the tests, whether within the VE or while using the wearable device, and to recognize opportunities for improvement.
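Scoring for the two standardized instruments above is straightforward and can be sketched as follows. The Raw TLX is simply the unweighted mean of the six dimension ratings, and the SUS follows its standard scoring rule (odd items contribute score − 1, even items contribute 5 − score, summed and scaled by 2.5 to a 0–100 range); the function names are ours.

```python
def raw_tlx(ratings: dict) -> float:
    """Raw (unweighted) NASA TLX: mean of the six dimension ratings (1-100)."""
    dims = ("mental", "physical", "temporal", "performance", "effort", "frustration")
    return sum(ratings[d] for d in dims) / len(dims)


def sus_score(items: list) -> float:
    """Standard SUS scoring for ten 1-5 Likert items.

    Odd-numbered items contribute (score - 1); even-numbered items
    contribute (5 - score). The sum is scaled by 2.5 to yield 0-100.
    """
    if len(items) != 10 or not all(1 <= s <= 5 for s in items):
        raise ValueError("SUS expects ten items rated 1-5")
    odd = sum(items[i] - 1 for i in range(0, 10, 2))   # items 1,3,5,7,9
    even = sum(5 - items[i] for i in range(1, 10, 2))  # items 2,4,6,8,10
    return (odd + even) * 2.5
```

For example, a participant rating every SUS item at the neutral midpoint of 3 obtains a score of 50, which is why usability results such as the 74.71 and 80.83 reported in this study sit well above the neutral point.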
Before starting the session, participants were given time to familiarize themselves with the system using the Bubbles scenario. The duration of this familiarization phase depended on each participant. The goal was to ensure proper hand placement under the LMC device, help them identify the interaction area, and observe how their movements were represented in the virtual environment. This phase was not timed to create a relaxed, pressure-free atmosphere.
Next, the first part of the session began, using the participant’s dominant hand with the Bubbles, Maze, and Fish scenarios, in that order. After completing the third level of each scenario, a brief pause was taken to complete the second section of the questionnaire. Then, the tutorial for the next scenario was shown before continuing.
For the second part, the researcher reconfigured the system to enable control with the non-dominant hand and selected the “By game and level” mode, choosing the third level of the Fish scenario. Upon completion, participants were asked to complete the third and final section of the questionnaire.

3.2. Experimental Testing

Functionality tests were conducted using two groups. The first group used the system without the integration of the wearable device, while the second group tested the complete system. Furthermore, computer proficiency was assessed using a self-reported scale ranging from 0 (low experience) to 10 (high experience). This information was used to characterize participants’ technical background in both experimental groups.
1. Group 1 (System without Wearable Device): This group consisted of six men and seven women. Regarding educational background, eight participants were undergraduate students, four were master’s students, and one was a PhD student.
Based on the data collected in the first section of the demographic questionnaire, participants rated their computer proficiency as follows: two rated themselves as 9, one as 8.5, six as 8, three as 7, and one as 6. Weekly computer usage was reported as follows: four users between 1 and 10 hours, two between 11 and 20 hours, four between 21 and 30 hours, and three for more than 40 hours per week.
Regarding video game usage, six participants reported playing seldom, three occasionally, and four frequently. Finally, eight participants were familiar with VR and had previously used a VR system. One participant was familiar with the LMC; in contrast, five participants reported knowledge of the Kinect sensor.
2. Group 2 (Complete System): This group included four men and five women. Regarding educational background, six participants had completed high school, two held a bachelor’s degree, and one held a master’s degree.
Based on the data collected in the first section of the demographic questionnaire, participants rated their computer proficiency as follows: two rated themselves as 9, two as 8, one as 7, and four as 6. Weekly computer usage was distributed as follows: two participants reported between 10 and 20 hours, three between 21 and 30 hours, three between 31 and 40 hours, one between 41 and 50 hours, and one for 60 hours per week.
Regarding video game usage, one participant did not play video games, three played seldom, two played occasionally, and three reported always playing. Lastly, eight participants were familiar with VR, and five of them had previously used a VR system. No participant was aware of the LMC or had previously used it; in contrast, six had interacted with the Kinect sensor.

4. Analysis of Results

To assess the subjective perception of the VR system, participants completed the short AttrakDiff questionnaire, which evaluates both the pragmatic and hedonic qualities of interactive products. In addition, the NASA-TLX was employed to evaluate the cognitive and physical demands associated with each VR scenario, capturing perceived workload across six dimensions. Finally, system usability was evaluated with the System Usability Scale, providing a global score of ease of use, learnability, and user satisfaction associated with the system.

4.1. User Experience with the AttrakDiff-Short Questionnaire

Figure 5 shows the results of the AttrakDiff questionnaire, comparing user perceptions of the VR system with and without the wearable device. Overall, both system configurations obtained positive values across most word pairs, indicating that the VR system was generally perceived as usable and acceptable, even without the wearable device. This suggests that the Leap Motion–based interaction alone provides a functional and understandable interface for upper limb rehabilitation tasks.
However, the system with the wearable device (Group 2) consistently achieved higher positive scores across all evaluated dimensions. In particular, it was perceived as simpler, more practical, and more clearly structured, reflecting strong pragmatic quality. It also stood out in hedonic aspects, being described as more premium. In terms of overall appeal, users rated it closer to the good pole than the version without the device.
In contrast, the system without the wearable device (Group 1), although positively evaluated, received comparatively lower ratings and occasionally approached negative perceptions such as cheap, uncreative, or complicated. These findings indicate that, while both configurations are functional, the integration of the wearable device not only enhances system functionality but also significantly enriches the user’s subjective experience by improving engagement, perceived quality, spatial correspondence, and interaction realism, making the system more intuitive and attractive.
Based on the results shown in Figure 5, the system type representation presented in Figure 6 was obtained. The VR system with the wearable device (represented in blue) was positioned within the desired region, indicating that users perceived it as both highly functional and attractive. This positioning reflects an optimal balance between pragmatic quality (task effectiveness) and hedonic quality (user engagement), which characterizes an ideal user experience.
In contrast, the VR system without the wearable device (represented in orange) was located in the self-oriented region. Although this system was generally perceived as functional, its position suggests a less engaging experience, mainly limited by reduced pragmatic quality. Consequently, further improvements in task support and interaction effectiveness are required for the system to reach the desired region, which, according to the results, can be achieved through the integration of the wearable device.
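The portfolio positioning described above follows from averaging the word-pair ratings separately for each quality scale, yielding one pragmatic and one hedonic coordinate per system. A minimal sketch of that aggregation, assuming the short AttrakDiff’s 7-point scale coded from −3 (negative pole) to +3 (positive pole); the ratings and the item-to-scale grouping shown here are illustrative, not the study’s data:

```python
from statistics import mean

# Hypothetical ratings (-3..+3) for word pairs, grouped by quality scale.
# The actual item-to-scale assignment follows the short AttrakDiff questionnaire.
pragmatic = {"complicated-simple": 2, "impractical-practical": 2,
             "unpredictable-predictable": 1, "confusing-clearly structured": 2}
hedonic = {"dull-captivating": 1, "tacky-stylish": 2,
           "cheap-premium": 2, "unimaginative-creative": 1}

pq = mean(pragmatic.values())  # pragmatic-quality coordinate in the portfolio
hq = mean(hedonic.values())    # hedonic-quality coordinate in the portfolio
print(pq, hq)  # 1.75 1.5
```

A system whose (pq, hq) point has both coordinates clearly positive lands in the desired region of the portfolio chart; a point with lower pragmatic quality drifts toward the self-oriented region.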

4.2. NASA-TLX Workload

Figure 7 presents the average scores obtained for each dimension across the three virtual environment (VE) scenarios for both Group 1 and Group 2. These results allow for the evaluation of the cognitive and physical demands associated with each VR scenario.
Regarding the Mental demand dimension, Group 1 reported average scores of 43.46, 37.69, and 50.76 in the Bubbles, Maze, and Fish scenarios, respectively, whereas Group 2 obtained values of 30.55 in Bubbles, 36.66 in Maze, and 40.55 in Fish. Thus, Group 1 reported higher mental workload in the first and third scenarios, while only a slight decrease is observed in the Maze scenario when the wearable device is used. According to previous studies, scores between 30 and 50 are considered suitable for medium-complexity tasks that require sustained attention while avoiding mental fatigue [43,44,45]. It is worth mentioning that keeping mental demand low is particularly important in rehabilitation tasks, especially during the early learning phase, as high mental demand in VR applications can compromise motor learning [46].
In the Physical demand dimension, Group 1 reported average scores of 60.38 in Bubbles, 64.61 in Maze, and 74.07 in Fish, whereas Group 2 showed lower scores across all scenarios, with values of 47.77, 56.11, and 55.55, respectively. These results indicate a higher perceived level of physical exertion in Group 1 than in Group 2. The values obtained in the Maze scenario indicate higher physical demand, associated with the involvement of multiple joints (shoulder, elbow, and phalanges), as well as with the wide range of the interaction area, which allows the execution of movements with greater amplitude. Similarly, the Fish scenario, which demands precise and controlled movements, was associated with higher perceived physical demand in Group 1 when controlling the pawn (represented by the fish). In contrast, Group 2 reported lower physical demand in this scenario when using the wearable device with vibrotactile feedback, which was associated with fewer compensatory movement patterns related to physical fatigue.
Regarding Temporal demand, Group 1 showed average scores of 63.53 in Bubbles, 45.38 in Maze, and 57.15 in Fish, whereas Group 2 obtained values of 48.88, 26.66, and 41.66 in the Bubbles, Maze, and Fish scenarios, respectively. Lower temporal demand values were consistently observed in Group 2 across all three scenarios. It is worth noting that the average scores obtained in the Maze scenario were lower for both groups compared with the other two scenarios, as level progression time depends on the speed at which the user performs the task. In contrast, in the Bubbles and Fish scenarios, level progression time is predetermined. According to [47], Temporal demand typically ranges from 14 to 61.5 and increases with task speed and resistance. Lower levels of urgency or time pressure have been shown to facilitate sustained engagement with the activity, which is particularly beneficial in rehabilitation contexts, as it reduces the likelihood of disengagement and supports task completion.
The results for the Effort dimension indicate that Group 1 consistently perceived a higher level of effort than Group 2 across all scenarios. In the Fish and Bubbles scenarios, Group 1 reported high effort levels (70 and 78.07, respectively), whereas Group 2 reported moderate values (45 and 52.77). A similar trend was observed in the Maze scenario, where Group 1 reported higher effort (57.46) than Group 2 (45.55), although both groups remained within a moderate range. Overall, these findings suggest that the use of the wearable device was associated with a reduction in perceived effort, potentially indicating a more efficient interaction with the virtual environment.
For the Performance dimension, values closer to zero indicate a higher perception of performance, whereas values approaching 100 represent poorer perceived performance. The results indicate comparable perceived performance between both groups in the Bubbles and Fish scenarios: in Bubbles, Group 1 obtained an average score of 58.07 and Group 2 scored 58.88, showing nearly identical perceptions of task success, while in Fish the values were also close, with Group 1 scoring 49.61 and Group 2 scoring 56.11. In contrast, a more pronounced difference was observed in the Maze scenario, where Group 1 reported an average score of 54.61 and Group 2 obtained a lower score of 38.88, suggesting that participants in Group 2 perceived their task execution in the Maze as more efficient. According to [41], intermediate scores such as those observed in this study indicate that participants did not perceive complete success, but also did not experience a strong sense of failure. In this context, the slightly higher Performance scores observed in Group 2 in the Bubbles and Fish scenarios indicate a somewhat less favorable perception of task performance, despite successful task completion. This outcome may be related to the subjective nature of the Performance dimension, as well as to the time-constrained execution characterizing these scenarios, which may increase participants’ sensitivity to minor errors or deviations and influence their self-assessment of performance.
Finally, for the Frustration dimension, the results reveal a differentiated pattern across scenarios. Lower levels of frustration were observed in Group 2 for the Bubbles (26.66 vs. 36.92 in Group 1) and Fish (20.55 vs. 32.84 in Group 1) scenarios. These results suggest that the integration of the wearable device contributed to reducing negative emotional responses during interaction, likely by improving movement guidance, spatial correspondence, and feedback clarity. In contrast, in the Maze scenario, Group 1 reported lower frustration levels (21.92) than Group 2 (30.9). This difference may be associated with the higher cognitive and motor coordination demands imposed by the Maze task, where the additional vibrotactile feedback could have increased attentional load or task awareness in Group 2, leading to a higher perception of frustration despite improved performance.
Overall, the results indicate that participants in Group 2 experienced lower frustration in two out of the three evaluated scenarios, even in cases where perceived performance was not consistently higher. This pattern suggests greater tolerance to task difficulty and reduced self-imposed pressure when interacting with the complete system. Conversely, Group 1, although reporting comparable or better perceived performance in some scenarios, exhibited higher frustration levels, which may be related to increased expectations, reduced sensory feedback, or greater sensitivity to errors during task execution.

4.3. System Usability Scale (SUS)

Table 6 and Table 7 report the individual SUS scores obtained by participants in Group 1 and Group 2, respectively, and Figure 8 summarizes the corresponding group averages within the standard SUS acceptability and adjective rating scales. Group 1 (system without the wearable device) achieved an average SUS score of 74.42, which is above the widely used benchmark value of 68 and is therefore classified as good usability [48]. When the wearable device was integrated, Group 2 reached a higher average SUS score of 80.83. According to the interpretation shown in Figure 8, both averages fall within the acceptable range; however, Group 2 shifts toward a higher adjective rating (from good toward excellent), indicating a clear improvement in perceived usability when using the complete system.
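For reference, the group averages above follow from the standard SUS scoring procedure: odd-numbered (positively worded) items contribute score − 1, even-numbered (negatively worded) items contribute 5 − score, and the summed contributions are scaled by 2.5 to a 0–100 range. A minimal sketch, using hypothetical responses rather than the study’s data:

```python
def sus_score(responses):
    """Compute a SUS score (0-100) from ten 1-5 Likert responses.

    Odd-numbered items are positively worded (contribution = score - 1);
    even-numbered items are negatively worded (contribution = 5 - score).
    The summed contributions (0-40) are scaled by 2.5.
    """
    assert len(responses) == 10, "SUS requires exactly ten responses"
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical participant with mostly favorable responses
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 2]))  # 80.0
```

Group-level scores such as 74.42 and 80.83 are then obtained by averaging these per-participant values, which is what Figure 8 places on the acceptability and adjective scales.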
Regarding the statements and additional questions included in the SUS questionnaire, the results are presented in Figure 9. Group 2 exhibits a higher concentration of positive responses, particularly in the categories Clear information, Consistency, and Pleasant VE, where ratings of Strongly agree predominate. These results indicate that participants perceived the information presented in the virtual environment as clear and coherent, and that the movements performed were consistent with those displayed on screen. Overall, Group 2 shows fewer negative responses compared to Group 1.
In contrast, Group 1 displays a more dispersed response pattern, with a greater tendency toward negative ratings such as Disagree and Strongly disagree, especially in aspects related to Satisfaction, Motivation, and the perception of discomfort. It is worth noting that although some participants in Group 2 reported experiencing physical discomfort, this factor did not significantly affect the overall evaluation of the system. Taken together, these findings suggest that the integration of the wearable device contributes to a more positive and consistent user experience.
Figure 10 presents the results of the general system evaluation based on participant perception. When using the complete system (Group 2), none of the evaluated features received the lowest rating (1, poor), whereas Group 1 recorded poor ratings in 9 out of the 13 assessed features. This contrast indicates a more consistently positive perception of the system when the wearable device was integrated. The most relevant characteristics are discussed below.
In Group 2, the feature Experimental learning received the highest ratings, followed by General feeling, with most scores ranging between 4 and 5. Similarly, participants in Group 1 also assigned high scores to Experimental learning; however, ratings for General feeling were more variable, ranging from 2 to 5. Experimental learning reflects the system’s ability to support learning through practice, whereas General feeling describes participants’ emotional state after completing the tasks, with lower ratings indicating stress or frustration and higher ratings reflecting curiosity or calmness.
For the Feedback feature, which evaluates the quality of information provided to the user, Group 2 obtained more positive ratings. This improvement can be attributed to the inclusion of vibrotactile feedback, which complemented the visual and auditory feedback available in Group 1.
Regarding Movement precision, defined as the perceived similarity between users’ physical gestures and their visual representation on screen, Group 2 received higher ratings than Group 1, indicating a clearer correspondence between action and visual feedback.
In terms of Spatial correspondence, which refers to how effectively feedback supports the identification of the relative position of virtual objects within the manipulation metaphor, Group 2 demonstrated a notable improvement. The vibrotactile feedback enhanced spatial perception within the virtual environment, allowing participants to identify object locations with greater accuracy.
Finally, for the Quick response feature, understood as the ability of both the virtual environment and the wearable device to respond promptly to user actions, participants in Group 2 reported a clearer sense of immediacy, which contributed to a smoother and more fluid interaction.
Overall, the results obtained from the SUS, AttrakDiff, and NASA-TLX evaluations provide a consistent and complementary assessment of the proposed system. The SUS results indicate that both configurations achieved acceptable usability levels, with a clear improvement when the wearable device was integrated. This improvement is further supported by the AttrakDiff analysis, where the complete system was positioned in the desired region, reflecting a balanced combination of pragmatic quality and hedonic appeal, while the system without the wearable device remained in a more self-oriented region. Finally, the NASA-TLX results reveal that the inclusion of the wearable device generally reduced perceived workload, frustration, and temporal pressure, while improving perceived performance in key scenarios. Together, these findings suggest that the wearable device not only enhances usability but also positively impacts user experience and interaction quality, resulting in a more efficient, engaging, and user-centered virtual rehabilitation system.

5. Discussion and Conclusions

The results obtained in this study allow a comprehensive discussion of the proposed system from the perspectives of usability, user experience, and perceived workload. The evaluation outcomes directly address the objectives established in Section 1, particularly the assessment of whether the integration of a wearable vibrotactile device improves interaction quality and user experience in a virtual environment for upper-limb rehabilitation.
From a usability standpoint, the SUS results indicate that both system configurations achieved acceptable usability levels. However, the system incorporating the wearable device consistently outperformed the baseline configuration, reaching higher usability ratings and a more favorable adjective classification. This suggests that while Leap Motion–based interaction alone is sufficient to support task execution, the addition of vibrotactile feedback provides a clearer and more efficient interaction paradigm.
The AttrakDiff evaluation further supports this observation. Participants interacting with the complete system positioned it within the desired region, reflecting a balanced combination of pragmatic quality and hedonic quality. In contrast, the system without the wearable device was perceived as more self-oriented, indicating that although it was functional, it lacked elements that promote engagement and emotional involvement. These findings highlight the relevance of hedonic factors in rehabilitation-oriented virtual environments, where sustained motivation and engagement are essential.
Workload-related results obtained through the NASA-TLX questionnaire provide additional insight into user interaction. In general, participants using the wearable-assisted system reported lower levels of frustration, temporal demand, and perceived effort across most scenarios. Even in tasks where performance differences were not pronounced, reduced frustration suggests greater tolerance to task difficulty and a more comfortable interaction experience. This is particularly relevant in rehabilitation contexts, where excessive cognitive or temporal pressure may negatively affect adherence.
A key factor underlying these improvements is spatial correspondence. The results show that vibrotactile feedback significantly enhances the perception of alignment between real and virtual actions, allowing users to identify the relative position of virtual objects with greater accuracy. Improved spatial correspondence not only contributes to better task execution but also strengthens immersion and confidence during interaction. These findings reinforce the role of multimodal feedback, especially haptic cues, in improving interaction quality within desktop-based virtual rehabilitation systems.
Despite these advantages, some limitations were identified. In certain scenarios, such as the Maze task, the addition of vibrotactile feedback may increase attentional demands, leading to slightly higher perceived frustration. This suggests that feedback intensity and task complexity must be carefully balanced to avoid cognitive overload, particularly in more constrained or precision-oriented tasks.
This work presented the design and evaluation of a wearable-assisted virtual system for upper limb rehabilitation, integrating Leap Motion–based interaction with vibrotactile feedback. The experimental results demonstrate that the inclusion of the wearable device leads to consistent improvements in usability, user experience, and perceived workload, fulfilling the primary objectives of the study.
The combined analysis of SUS, AttrakDiff, and NASA-TLX evaluations confirms that the wearable device enhances interaction quality by enriching feedback, improving spatial correspondence, and reducing frustration and temporal pressure during task execution. While the baseline system already provides acceptable usability, the complete system achieves higher usability ratings and a more favorable balance between pragmatic and hedonic qualities, resulting in a more engaging and intuitive user experience.
From a broader perspective, these findings underline the importance of incorporating haptic feedback into virtual rehabilitation environments, particularly in desktop-based systems where immersion is otherwise limited. The proposed system demonstrates the potential of wearable vibrotactile devices to strengthen the perception of real–virtual alignment and support more natural and effective interaction.
Nevertheless, this study represents an initial validation of the proposed approach. Future work will focus on further miniaturization of the wearable device to reduce weight and visual impact, as well as on refining feedback strategies to better adapt to task complexity. In addition, closer involvement of clinical specialists and future clinical validation studies will be essential to assess the system’s therapeutic effectiveness and its impact on motor rehabilitation outcomes.
Finally, it is important to emphasize that the proposed system is intended as a complementary tool to conventional physical therapy rather than a replacement for healthcare professionals. It supports rehabilitation processes by improving engagement, interaction quality, and user experience in virtual environments designed for motor rehabilitation.

Author Contributions

Conceptualization, E.R.S.-N., J.M.J-V. and M.R.-H.; methodology, E.R.S.-N., J.M.J-V. and M.R.-H.; software, E.R.S.-N. and M.R.-H.; validation, J.M.J-V., M.R.-H. and A.H.V-G.; formal analysis, E.R.S.-N. and J.M.J-V.; investigation, E.R.S.-N., J.M.J-V. and M.R.-H.; data curation, E.R.S.-N.; writing—original draft preparation, E.R.S.-N. and J.M.J-V.; writing—review and editing, E.R.S.-N., J.M.J-V., M.R.-H. and A.H.V-G.; visualization, E.R.S.-N. and J.M.J-V.; supervision, J.M.J-V., M.R.-H. and A.H.V-G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This study was conducted in accordance with the Declaration of Helsinki, and the protocol was approved by the Research Ethics Committee of the Faculty of Medicine of the Autonomous University of the State of Mexico (CONBIOETICA-15-CEI-002-20210531, approval number: 008.2025) on October 03, 2025.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author (the data are not publicly available due to privacy and ethical restrictions).

Acknowledgments

The authors would like to thank the volunteers and the physical therapists from the care organization CEREE for their time and assistance in the user-centered design.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CEREE State Center for Rehabilitation and Special Education
GUI Graphical User Interface
HMDs Head-Mounted Displays
IVR Immersive Virtual Reality
LMC Leap Motion Controller
NASA TLX NASA Task Load Index
NUI Natural User Interface
SG Serious Games
SUS System Usability Scale
UE Unreal Engine
UL Upper Limb
VE Virtual Environment
VR Virtual Reality

Appendix A. Research Ethics Committee Approval (Spanish Version)


Appendix B. Research Ethics Committee Approval (English Version)


Appendix C. Informed Consent (English version)


References

  1. Mahfouz, D.M.; Shehata, O.M.; Morgan, E.I.; Arrichiello, F. A Comprehensive Review of Control Challenges and Methods in End-Effector Upper-Limb Rehabilitation Robots. Robotics 2024, 13.
  2. Belluzzi, E.; Pozzuoli, A.; Ruggieri, P. Musculoskeletal Diseases: From Molecular Basis to Therapy. Biomedicines 2024, 12.
  3. Institute for Health Metrics and Evaluation (IHME). WHO Rehabilitation Need Estimator, 2024. Available from: https://vizhub.healthdata.org/rehabilitation/.
  4. Zhang, J.; Bradshaw, F.; Duchniewicz, M.; Karamatzanis, I.; Fernandes, F.W.; Krkovic, M. Epidemiology and Incidence of Upper Limb Fractures: A UK Level 1 Trauma Center Perspective. Cureus 2024, 16.
  5. Champagne, N.; Eadie, L.; Regan, L.; Wilson, P. The effectiveness of ultrasound in the detection of fractures in adults with suspected upper or lower limb injury: a systematic review and subgroup meta-analysis. BMC Emergency Medicine 2019, 19.
  6. Karzon, A.L.; Nazzal, E.M.; Cooke, H.L.; Heo, K.; Okonma, O.; Worden, J.A.; Hussain, Z.B.; Chung, K.C.; Gottschalk, M.B.; Wagner, E.R. Upper Extremity Fractures in the Emergency Department: A Database Analysis of National Trends in the United States. Hand 2024, 15589447231219286.
  7. Karl, J.; Olson, P.R.; Rosenwasser, M. The Epidemiology of Upper Extremity Fractures in the United States, 2009. Journal of Orthopaedic Trauma 2015, 29, e242–e244.
  8. Dalton, E.J.; Jamwal, R.; Augoustakis, L.; Hill, E.; Johns, H.; Thijs, V.; Hayward, K. Prevalence of Arm Weakness, Pre-Stroke Outcomes and Other Post-Stroke Impairments Using Routinely Collected Clinical Data on an Acute Stroke Unit. Neurorehabilitation and Neural Repair 2024, 38, 148–160.
  9. Makki, D.; Duodu, J.; Nixon, M. Prevalence and pattern of upper limb involvement in cerebral palsy. Journal of Children’s Orthopaedics 2014, 8, 215–219.
  10. Gimigliano, F.; Negrini, S.; et al. The World Health Organization “Rehabilitation 2030: A Call for Action”. European Journal of Physical and Rehabilitation Medicine 2017, 53, 155–168.
  11. Riener, R.; Harders, M. Chapter 7: Virtual Reality for Rehabilitation. In Virtual Reality in Medicine; Springer London: London, 2012; pp. 160–180.
  12. Rego, P.A.; Moreira, P.M.; Reis, L.P. Architecture for serious games in health rehabilitation. In Proceedings of the New Perspectives in Information Systems and Technologies; Springer, 2014; Volume 2, pp. 307–317.
  13. Sabatino, E.; Moschetta, M.; Lucaroni, A.; Barresi, G.; Ferraresi, C.; Podda, J.; Grange, E.; Brichetto, G.; Bucchieri, A. A Pilot Study on Mixed-Reality Approaches for Detecting Upper-Limb Dysfunction in Multiple Sclerosis: Insights on Cerebellar Tremor. Virtual Worlds 2025, 4.
  14. Ríos-Hernández, M.; Jacinto-Villegas, J.M.; Zemiti, N.; Vilchis-González, A.H.; Padilla-Castañeda, M.A.; Debien, B. Development of a lumbar puncture virtual simulator for medical students training: A preliminary evaluation. The International Journal of Medical Robotics and Computer Assisted Surgery 2023, 19, e2572.
  15. Laspro, M.; Groysman, L.; Verzella, A.N.; Kimberly, L.L.; Flores, R.L. The Use of Virtual Reality in Surgical Training: Implications for Education, Patient Safety, and Global Health Equity. Surgeries 2023, 4, 635–646.
  16. Li, X.; Elnagar, D.; Song, G.; Ghannam, R. Advancing Medical Education Using Virtual and Augmented Reality in Low- and Middle-Income Countries: A Systematic and Critical Review. Virtual Worlds 2024, 3, 384–403.
  17. Suh, I.; McKinney, T.; Siu, K.C. Current perspective of metaverse application in medical education, research and patient care. In Proceedings of the Virtual Worlds; MDPI, 2023; pp. 115–128.
  18. Fealy, S.; Irwin, P.; Tacgin, Z.; See, Z.S.; Jones, D. Enhancing nursing simulation education: a case for extended reality innovation. In Proceedings of the Virtual Worlds; MDPI, 2023; pp. 218–230.
  19. Ríos-Hernández, M.; Jacinto-Villegas, J.M.; Portillo-Rodríguez, O.; Vilchis-González, A.H. User-centered design and evaluation of an upper limb rehabilitation system with a virtual environment. Applied Sciences 2021, 11, 9500.
  20. Mubin, O.; Alnajjar, F.; Jishtu, N.; Alsinglawi, B.; Al Mahmud, A. Exoskeletons with virtual reality, augmented reality, and gamification for stroke patients’ rehabilitation: systematic review. JMIR Rehabilitation and Assistive Technologies 2019, 6, e12010.
  21. Cerda, I.H.; Therond, A.; Moreau, S.; Studer, K.; Donjow, A.R.; Crowther, J.E.; Mazzolenis, M.E.; Lang, M.; Tolba, R.; Gilligan, C.; et al. Telehealth and virtual reality technologies in chronic pain management: a narrative review. Current Pain and Headache Reports 2024, 28, 83–94.
  22. Doumas, I.; Everard, G.; Dehem, S.; Lejeune, T. Serious games for upper limb rehabilitation after stroke: a meta-analysis. Journal of NeuroEngineering and Rehabilitation 2021.
  23. Liao, K.L.; Huang, M.; Wang, Y.; Wu, Z.; Yang, R.; Zhang, J.; Wang, L. A Virtual Reality Serious Game Design for Upper Limb Rehabilitation. In Proceedings of the International Conference on Serious Games and Applications for Health (SeGAH), 2021.
  24. Paraense, H.; Marques, B.; Amorim, P.; Dias, P.; Santos, B.S. Whac-A-Mole: Exploring Virtual Reality (VR) for Upper-Limb Post-Stroke Physical Rehabilitation based on Participatory Design and Serious Games. In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), 2022; pp. 716–717.
  25. Villada Castillo, J.F.; Montoya Vega, M.F.; Muñoz Cardona, J.E.; Lopez, D.; Quiñones, L.; Henao Gallo, O.; López, J.F. Design of Virtual Reality Exergames for Upper Limb Stroke Rehabilitation Following Iterative Design Methods: Usability Study. JMIR Serious Games 2024.
  26. Bachmann, D.; Weichert, F.; Rinkenauer, G. Review of three-dimensional human-computer interaction with focus on the leap motion controller. Sensors 2018, 18, 2194.
  27. Aguilera-Rubio, Á.; Cuesta-Gómez, A.; Mallo-López, A.; Jardón-Huete, A.; Oña-Simbaña, E.D.; Alguacil-Diego, I.M. Feasibility and efficacy of a virtual reality game-based upper extremity motor function rehabilitation therapy in patients with chronic stroke: A pilot study. International Journal of Environmental Research and Public Health 2022, 19, 3381.
  28. De Los Reyes-Guzmán, A.; Lozano-Berrio, V.; Álvarez Rodríguez, M.; López-Dolado, E.; Ceruelo-Abajo, S.; Talavera-Díaz, F.; Gil-Agudo, J. RehabHand: Oriented-tasks serious games for upper limb rehabilitation by using leap motion controller and target population in spinal cord injury. NeuroRehabilitation 2021.
  29. Cuesta-Gómez, A.; Sánchez-Herrera-Baeza, P.; Oña-Simbaña, E.D.; Martínez-Medina, A.; Ortiz-Comino, C.; Balaguer-Bernaldo-de Quirós, C.; Jardón-Huete, A.; Cano-de-la Cuerda, R. Effects of virtual reality associated with serious games for upper limb rehabilitation in patients with multiple sclerosis: Randomized controlled trial. Journal of NeuroEngineering and Rehabilitation 2020, 17, 1–10.
  30. Hidalgo, J.C.C.; Delgado, J.D.A.; Bykbaev, V.R.; Bykbaev, Y.R.; Coyago, T.P. Serious game to improve fine motor skills using Leap Motion. In Proceedings of the 2018 Congreso Argentino de Ciencias de la Informática y Desarrollos de Investigación (CACIDI); IEEE, 2018; pp. 1–5.
  31. Shah, V.; Cuen, M.; McDaniel, T.; Tadayon, R. A rhythm-based serious game for fine motor rehabilitation using leap motion. In Proceedings of the 2019 58th Annual Conference of the Society of Instrument and Control Engineers of Japan (SICE); IEEE, 2019; pp. 737–742.
  32. Postolache, G.; Carry, F.; Lourenço, F.; Ferreira, D.; Oliveira, R.; Girão, P.S.; Postolache, O. Serious Games Based on Kinect and Leap Motion Controller for Upper Limbs Physical Rehabilitation. In Modern Sensing Technologies; Mukhopadhyay, S.C., Jayasundera, K.P., Postolache, O.A., Eds.; Springer International Publishing: Cham, 2019; Vol. 29, pp. 147–177.
  32. Postolache, G.; Carry, F.; Lourenço, F.; Ferreira, D.; Oliveira, R.; Girão, P.S.; Postolache, O. Serious Games Based on Kinect and Leap Motion Controller for Upper Limbs Physical Rehabilitation. In Modern Sensing Technologies; Mukhopadhyay, S.C.; Jayasundera, K.P.; Postolache, O.A., Eds.; Springer International Publishing: Cham, 2019; Vol. 29, pp. 147–177. Series Title: Smart Sensors, Measurement and Instrumentation. [CrossRef]
  33. Maldonado, P.; Muñoz, D.; Barría, P.; Alberti, P.; Ojeda, C. Proyecto de realidad virtual inclusivo para pacientes con discapacidad motriz [Inclusive virtual reality project for patients with motor disabilities]. In Proceedings of the Memorias Congreso Iberoamericano de Tecnologías de Apoyo a la Discapacidad, Bogotá, Colombia, 2017; pp. 339–345.
  34. Somrak, A.; Pogačnik, M.; Guna, J. Suitability and Comparison of Questionnaires Assessing Virtual Reality-Induced Symptoms and Effects and User Experience in Virtual Environments. Sensors 2021, 21. [Google Scholar] [CrossRef]
  35. Kartick, P.; Uribe-Quevedo, A.; Rojas, D. Piecewise: A Non-Isomorphic 3D Manipulation Technique That Factors Upper-Limb Ergonomics. Virtual Worlds 2023, 2, 144–161. [Google Scholar] [CrossRef]
  36. Slater, M.; Sanchez-Vives, M.V. Enhancing our lives with immersive virtual reality. Frontiers in Robotics and AI 2016, 3, 74. [Google Scholar] [CrossRef]
  37. Pacchierotti, C.; Sinclair, S.; Solazzi, M.; Frisoli, A.; Hayward, V.; Prattichizzo, D. Wearable haptic systems for the fingertip and the hand: taxonomy, review, and perspectives. IEEE Transactions on Haptics 2017, 10, 580–600. [Google Scholar] [CrossRef] [PubMed]
  38. Ultraleap, 2022.
  39. Weichert, F.; Bachmann, D.; Rudak, B.; Fisseler, D. Analysis of the Accuracy and Robustness of the Leap Motion Controller. Sensors (Basel, Switzerland) 2013, 13, 6380–6393. [Google Scholar] [CrossRef]
  40. Nygren, T.E. Psychometric Properties of Subjective Workload Measurement Techniques: Implications for Their Use in the Assessment of Perceived Mental Workload. Human Factors 1991, 33, 17–33. [Google Scholar] [CrossRef]
  41. Hart, S.G. NASA-task load index (NASA-TLX); 20 years later. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting; Sage Publications: Los Angeles, CA, 2006; pp. 904–908. [Google Scholar]
  42. Grier, R.A.; Bangor, A.; Kortum, P.; Peres, S.C. The system usability scale: Beyond standard usability testing. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting; SAGE Publications: Los Angeles, CA, 2013; pp. 187–191.
  43. Grier, R.A. How high is high? A meta-analysis of NASA-TLX global workload scores. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting; Sage Publications: Los Angeles, CA, 2015; pp. 1727–1731.
  44. Krepki, R.; Blankertz, B.; Curio, G.; Müller, K.R. The Self-Paced Brain–Computer Interface: Possible Applications to Motor Rehabilitation. Journal of Neural Engineering 2007, 4, 346. [Google Scholar] [CrossRef]
  45. Levenson, M.R.; Bourke, A.; Harper, D.; Matarić, M. Feasibility of Cognitive-Motor Exergames in Geriatric Inpatient Rehabilitation: A Pilot Randomized Controlled Study. JMIR Serious Games 2021, 9, e27027. [Google Scholar] [CrossRef]
  46. Juliano, J.M.; Schweighofer, N.; Liew, S.L. Increased cognitive load in immersive virtual reality during visuomotor adaptation is associated with decreased long-term retention and context transfer. Journal of NeuroEngineering and Rehabilitation 2021. [CrossRef]
  47. Chiari, L.; Dozza, M.; Cappello, A.; Horak, F.B.; Macellari, V.; Giansanti, D. Difficulty Adaptation in a Competitive Arm Rehabilitation Game Using Real-Time Control of Arm Electromyogram and Respiration. IEEE Transactions on Neural Systems and Rehabilitation Engineering 2017, 25, 1823–1832. [Google Scholar] [CrossRef]
  48. Hyzy, M.; Bond, R.; Mulvenna, M.; Bai, L.; Dix, A.; Leigh, S.; Hunt, S.; et al. System usability scale benchmarking for digital health apps: meta-analysis. JMIR mHealth and uHealth 2022, 10, e37290. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Proposed system with its components.
Figure 2. Designed wearable device with the 3D printed enclosure and bracelet.
Figure 3. Block diagram of the electronic connections and the communication between the wearable device and the PC.
Figure 4. Final integration and components of the rehabilitation system.
Figure 5. AttrakDiff Word-Pair Diagram for the Upper Limb Rehabilitation System Evaluation.
Figure 6. System type representation according to the AttrakDiff evaluation.
Figure 7. Average scores by dimension (mental, physical, temporal, effort, performance, frustration) from the NASA RTLX questionnaire.
Figure 8. Comparison of average SUS scores by adjective ratings and overall acceptability range.
Figure 9. Additional questions to the SUS questionnaire.
Figure 10. General system evaluation according to the perception of the participants.
Table 1. Objects present in the Fish scenario.
Level   Desired objects       Undesired objects
1       Starfish              Rocks
2       Starfish, Bubbles     Rocks
3       Starfish, Bubbles     Rocks, Sharks
Collision outcome: colliding with a desired object increases the number of hits; colliding with an undesired object increases the number of errors.
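The level progression and collision outcomes in Table 1 can be sketched as a small scoring routine. This is an illustrative reconstruction, not the authors' implementation; the class and object names are assumptions.

```python
# Hypothetical level configuration and collision scoring for the Fish
# scenario, inferred from Table 1; names are illustrative only.
DESIRED = {1: {"starfish"}, 2: {"starfish", "bubble"}, 3: {"starfish", "bubble"}}
UNDESIRED = {1: {"rock"}, 2: {"rock"}, 3: {"rock", "shark"}}

class FishScore:
    def __init__(self, level: int):
        self.level = level
        self.hits = 0
        self.errors = 0

    def on_collision(self, obj: str) -> None:
        # Desired objects increase hits; undesired objects increase errors.
        if obj in DESIRED[self.level]:
            self.hits += 1
        elif obj in UNDESIRED[self.level]:
            self.errors += 1
```

A level-3 session would then count a starfish collision as a hit and a shark collision as an error.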
Table 2. Fish movement based on upper limb joint actions.
Movements            Left UL              Right UL
Wrist
  Ulnar deviation    Moves to the left    Moves to the right
  Radial deviation   Moves to the right   Moves to the left
  Flexion            Moves downward       Moves downward
  Extension          Moves upward         Moves upward
Elbow
  Flexion            Moves backward       Moves backward
  Extension          Moves forward        Moves forward
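The joint-to-fish mapping in Table 2 amounts to assigning each movement a displacement axis, with the lateral axis mirrored between limbs. The sketch below illustrates this under assumed function, axis, and label names; it is not the authors' code.

```python
# Illustrative mapping from upper-limb joint actions to fish displacement,
# following Table 2. Axes: x left(-1)/right(+1), y down(-1)/up(+1),
# z backward(-1)/forward(+1). All names here are assumptions.
def fish_direction(movement: str, limb: str) -> tuple:
    """Return a unit (x, y, z) displacement for the fish avatar."""
    lateral = {"ulnar deviation": -1, "radial deviation": 1}
    if movement in lateral:
        sign = lateral[movement]
        # Mirrored for the right upper limb: ulnar deviation moves the
        # fish left for the left UL but right for the right UL (Table 2).
        return (sign if limb == "left" else -sign, 0, 0)
    vertical = {"wrist flexion": -1, "wrist extension": 1}
    if movement in vertical:
        return (0, vertical[movement], 0)
    depth = {"elbow flexion": -1, "elbow extension": 1}
    if movement in depth:
        return (0, 0, depth[movement])
    return (0, 0, 0)  # unrecognized movement: no displacement
```

Note that only the wrist deviations are mirrored; flexion/extension of the wrist and elbow produce the same displacement for both limbs.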
Table 3. Developed Virtual Scenarios.
Scenario                   Movements
Bubbles: Intermediate level.
  • Pinch (metacarpophalangeal flexion and slight interphalangeal flexion).
  • Forearm pronation-supination.
Maze: Easy level.
  • Shoulder flexion-extension.
  • Elbow flexion-extension.
  • Metacarpophalangeal flexion-extension.
Fish: Intermediate level.
  • Ulnar and radial deviation.
  • Wrist flexion-extension.
  • Elbow flexion-extension.
Table 4. Vibrotactile feedback provided by the wearable device based on user actions.
Scenario Actions
Bubbles Popping bubbles, collision with bubbles, depositing in containers.
Maze Collision with spheres, collecting diamonds, collision with the back wall.
Fish Collision with desired and undesired objects, workspace limit.
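The event-to-vibration mapping in Table 4 could be realized as a small lookup that encodes each game event into a command frame for the wearable device. The byte protocol, pattern IDs, and event names below are illustrative assumptions; the paper does not specify the actual wireless protocol.

```python
# Hypothetical encoding of game events into vibrotactile commands for the
# wearable device. Event names follow Table 4; the frame format and
# pattern IDs are assumptions for illustration only.
VIBRATION_PATTERNS = {
    "pop_bubble": 1,          # Bubbles: popping a bubble
    "collide_bubble": 2,      # Bubbles: collision with a bubble
    "deposit_container": 3,   # Bubbles: depositing in a container
    "collide_sphere": 2,      # Maze: collision with spheres
    "collect_diamond": 1,     # Maze: collecting a diamond
    "collide_back_wall": 4,   # Maze: collision with the back wall
    "collide_desired": 1,     # Fish: collision with a desired object
    "collide_undesired": 2,   # Fish: collision with an undesired object
    "workspace_limit": 4,     # Fish: reaching the workspace limit
}

def encode_command(event: str, intensity: int = 255) -> bytes:
    """Build a 3-byte frame: header byte, pattern ID, intensity (0-255)."""
    pattern = VIBRATION_PATTERNS[event]
    return bytes([0xAA, pattern, max(0, min(255, intensity))])

# In the game loop, the frame would be written to the wireless link, e.g.:
# link.write(encode_command("pop_bubble"))
```

Distinct pattern IDs per action class are what would let users tell success events (e.g., popping a bubble) apart from error events (e.g., hitting a rock), as probed by the "Vibration" item in Table 5.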
Table 5. Questions and statements included in the SUS questionnaire table.
Question Keywords
Is the information provided by the virtual environment clear? Clear information
Do the on-screen movements make sense in relation to my own? Coherence
I liked the game environments. Pleasant VE
Did you experience any physical discomfort or pain while using the system? Physical discomfort
Did you experience any visual or auditory discomfort while using the system? Other discomfort
I successfully completed the activities. Satisfaction
I felt motivated to finish the games. Motivation
I think that vibration modes are distinguished according to the actions performed. Vibration
Table 6. SUS Results – Group 1.
User        1     2     3     4     5     6     7     8     9     10    11    12    13
SUS Score   87.5  82.5  70    52.5  85    52.5  62.5  72.5  82.5  77.5  82.5  70    90
Average: 74.42
Table 7. SUS Results – Group 2.
User        1     2     3     4     5     6     7     8     9
SUS Score   75    82.5  82.5  85    85    70    82.5  80    85
Average: 80.83
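For reference, the per-user scores in Tables 6 and 7 follow the standard SUS scoring rule (Brooke, 1996): odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is scaled by 2.5 to a 0–100 range. A minimal implementation:

```python
# Standard SUS scoring: 10 Likert items (1-5), odd items positively
# worded, even items negatively worded, result scaled to 0-100.
def sus_score(responses):
    """responses: list of 10 answers (1-5) for SUS items 1..10 in order."""
    assert len(responses) == 10
    total = 0
    for i, r in enumerate(responses, start=1):
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

def group_average(scores):
    """Mean SUS score for a group, rounded to two decimals."""
    return round(sum(scores) / len(scores), 2)
```

Applying `group_average` to the 13 scores of Table 6 yields 74.42, and to the 9 scores of Table 7 yields 80.83, matching the reported averages.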
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.