Preprint
Article

This version is not peer-reviewed.

Mapping Blended Learning Activities to Students’ Digital Competence in VET

Submitted: 16 November 2025

Posted: 18 November 2025


Abstract
The blended learning format can help students develop their digital literacy skills, but the design of the blended learning model and the student factors important to this development remain undefined. The purpose of this research was to explore the relationship between blended learning design models and digital literacy skills in a sample of 106 upper vocational education and training (VET) students. The data analysis examined the relationships between activities, competences, and prior experience with blended learning. Student engagement with collaborative, task-based instructional designs was positively associated with digital competence (for example, collaborative project work and regular use of quizzes with collaboration tools, online communication, and technology for learning). In contrast, the association between live session use and pre-recorded video use and digital competence was weaker than for other teaching approaches. Use of Virtual Reality/Augmented Reality and interactive video was positively associated with students' use of digital tools but not with their perceptions of online safety or content creation. Students with previous experience of blended learning reported higher developmental competence (content creation; learning/research) than those without such experience. The results also showed that VET students can be grouped into three categories based on their technical field and experience with blended learning. Overall, the findings suggest that structured collaboration and formative assessment should be emphasized in the implementation of the blended learning model.
Subject: Social Sciences – Education

1. Introduction

The effectiveness of blended learning in developing digital competences among students is influenced by several key factors, as highlighted across multiple studies. Firstly, the quality of technology integration plays a crucial role, with research indicating that the perceived quality of technology integration significantly predicts students’ digital competencies and engagement, more so than the mere frequency of technology use [1]. Additionally, Arranz-García and Secades [2] found that project-based blended learning improved students’ specific, general, and systemic digital skills, though it did not enhance their interpersonal digital skills. The integration of face-to-face and online components is also essential, with the quality of face-to-face activities supporting higher-order processing, social interaction, and engagement, which are critical for effective blended learning [3].
Additional studies have examined the quantitative effects of blended learning on students’ digital competency. Martínez-Alcalá et al. [4] found that a blended workshop increased students’ digital literacy scores, while Sánchez et al. [5] reported that a Moodle-based blended course improved both procedural and attitudinal digital competencies but had no effect on cognitive competency. Flexible blended models have also been shown to support problem-solving and related digital skills [6]. Overall, these findings indicate that practice-based and engaging blended learning consistently increases students’ procedural, attitudinal, and problem-solving digital competencies.
Finally, Feng et al. [7] found that perceived teacher support plays a crucial role in shaping students’ ICT self-efficacy and their engagement in online English learning. The study highlights that when students perceive strong and consistent support from their teachers, their confidence in using ICT tools increases, which in turn strengthens their engagement in online learning.
In terms of self-efficacy, technical skills, digital literacy, and adaptability, studies show that students with prior experience in blended and/or online learning rate their digital competence much more favorably than those experiencing blended learning for the first time. For example, Marsh and Johnson [8] reported that students with prior experience rated themselves as more confident and demonstrated superior navigation skills compared to students without such experience; similarly, García-Ortega and Galán-Cubillo [9] found that students with prior experience faced fewer challenges in adapting to digital tools.
Intrinsic motivation is linked to digital competence in settings where technology-enhanced, authentic learning environments meet students’ psychological needs. In one study, vocational students in a mobile augmented reality car maintenance program experienced features such as scaffolding and real-time feedback that were moderately correlated with motivational dimensions like relevance, indicating that digital engagement underpinned this intervention. Although digital competence was not measured directly here, the use of digital tools implies baseline skill engagement. In another study set in a project-based engineering capstone, students who experienced environments emphasizing autonomy, competence, and relatedness reported enhanced acquisition of 21st-century skills, including ICT competence. This study details a reciprocal process in which the development of new skills further meets intrinsic motivational needs, thereby fostering continued competence growth. Together, the studies support that in vocational education and training, intrinsic motivation and digital competence acquisition are mutually reinforcing when learning contexts are designed to integrate technology, offer timely instructional support, and promote authentic, self-directed experiences.
The literature indicates that a higher degree of intrinsic motivation has been associated with a higher level of digital competence in most educational settings using various types of digital tools for learning. Posekany et al. [10] documented an increase in students’ perceived competence and relatedness in a university-level digital empowerment course; Amin et al. [11] reported a strong correlation between digital competence and academic motivation/lifelong learning among nursing undergraduates. Altınpulluk et al. [12] also indicated a statistically significant but relatively small association between self-directed learning and intrinsic motivation in a MOOC study, and Pratiwi et al. [13] found that digital communication competence was the most important predictor of motivation, explaining approximately 69.7% of the variance. The role of mediating and/or moderating factors in this relationship also appears to be significant. In some cases, self-directed learning and academic motivation have served as mediators in the relationships between other variables and intrinsic motivation; however, the strength of the relationship can be moderated by several variables, including field of study, gender, technology access, and prior digital experience. As mentioned earlier, the methodologies used in this research varied widely, including validated Self-Determination Theory measures and researcher-created scales, as well as pre-post comparison, cross-sectional survey, and mixed-methods designs. Regardless of methodology, a consistent pattern emerges: in the contexts studied, students’ levels of intrinsic motivation are positively related to their levels of digital competence.
In addition to exploring the relationship between the implementation of blended learning and students’ perceptions of developing digital competence, the present study focuses on three main areas of investigation. First, it investigates the correlations between students’ participation in specific blended learning activities and the development of specific digital competences. While much of the previous research has focused on general trends in students’ development of digital competence, findings suggest that different pedagogical approaches employed in blended learning environments may foster different dimensions of digital competence. Second, the study explores whether prior experience with blended learning moderates the amount of self-reported growth in digital competence, specifically whether prior exposure to blended learning models increases students’ perceptions of their own competency growth. Third, the study examines the relationship between students’ intrinsic motivation and their perceptions of growing digital competence, drawing upon evidence that motivational factors significantly impact digital literacy outcomes and technology use. By investigating these interconnected questions, the study aims to illuminate the processes through which blended learning environments promote students’ digital competence and to identify factors that may affect this relationship. Accordingly, the following research questions guided the present study:
(1) How do specific blended learning activities correlate with the development of specific digital competences?
(2) How do students with no prior blended learning experience differ in their self-reported digital competence development compared to those with prior experience?
(3) Is there a positive association between students’ intrinsic motivation and their perceived development of digital competences?
(4) To what extent can distinct clusters of students be identified based on their digital competence profiles, and how are these clusters associated with demographic characteristics and teaching approaches used in the courses?

2. Materials and Methods

2.1. Participants

A convenience sampling method was used for this study. A teacher from each of the seven VET programs participating in the pilot study sent survey links to their students. Participation was voluntary, and 106 students completed the survey. Of these, 98 were male, 7 were female, and one student preferred not to identify as either male or female. The average age of respondents was 17.1 years. Given the typical demographics of the VET programs involved (e.g., Computer Technician, Auto Technician, Electrical Engineering Technician, and Mechanical Engineering Technician), a predominantly male sample was expected. All participants were enrolled in an upper-secondary technical program: at the time of data collection, 56 were in their 3rd year, 36 in their 2nd year, 11 in their 1st year, and 3 in their 4th year.

2.2. Measures

Students completed a survey consisting of two sections: the first collected demographic information (gender, age, name of the school, educational program) and current level of study; the second collected data about the student’s experience with blended learning (teaching methods, workload perceptions, types of learning activities, and motivation).

2.2.1. Teaching Approaches

The 10-item scale measured how frequently commonly used teaching methods were employed. For each item, students rated on a five-point scale how often a method of instruction was used in their class (from “0—Never” to “4—Almost Always”). Table 1 presents descriptive statistics for the question ‘During the blended learning pilot, how often did your course(s) include the following?’, with items ordered by mean.
According to Table 1, the teaching methods students reported as most frequent during the BL implementation were live online meetings (e.g., via Zoom) and teacher-created instructional video presentations, while the least frequent was receiving online feedback from the teacher.

2.2.2. Digital Competences Improvements

Students’ digital competence improvement was measured with a checklist based on the DigCompEdu framework, capturing their perception of how their digital competence improved through the blended learning experience. Students marked all of the nine areas of digital competence in which they believed they had developed skills during the blended learning experience: (1) Using digital tools and software; (2) Understanding online safety and privacy; (3) Organizing digital files and folders; (4) Collaborating using online collaboration tools; (5) Communicating with others online; (6) Creating, publishing, and sharing digital content; (7) Using technology for learning and research; (8) Understanding digital citizenship and ethics; and (9) Solving common technical problems with computers. Students could select more than one category. All responses were converted into numerical codes (0 = Not Selected; 1 = Selected). The results are reported in Table 2.
Table 2 shows that students most often improved practical digital skills through blended learning, especially in using digital tools and software (71.3%), online communication (67.8%), and collaboration tools (65.5%). Fewer reported progress in creating content (41.4%) or understanding digital ethics (40.2%), suggesting stronger gains in operational than in critical or ethical competences.
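The checkbox responses described above can be converted into the 0/1 indicator coding used in the analyses. A minimal sketch with pandas, assuming an illustrative semicolon-separated export format; the column names and data are invented, not the study's:

```python
import pandas as pd

# Hypothetical raw export (names are illustrative, not the study's):
# one row per student, one cell listing the competence areas ticked.
raw = pd.DataFrame({
    "student_id": [1, 2, 3],
    "competences": [
        "digital_tools;online_safety",
        "digital_tools;collaboration;communication",
        "",  # no boxes ticked
    ],
})

areas = ["digital_tools", "online_safety", "file_organization",
         "collaboration", "communication", "content_creation",
         "tech_for_learning", "digital_citizenship", "troubleshooting"]

# One 0/1 indicator column per competence area (0 = Not Selected, 1 = Selected).
for area in areas:
    raw[area] = raw["competences"].str.split(";").apply(
        lambda sel, a=area: int(a in sel))

print(raw[["student_id", "digital_tools", "collaboration"]])
```

Summing the indicator columns per student then yields a competence count of the kind used later as the Total Digital Competence Index.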

2.2.3. Prior Blended Learning Experience

We assessed students’ prior experience with blended learning using a single dichotomous question: “Have you ever been involved in blended learning before?” Students selected either “Yes” or “No”. Responses were coded as 0 (No) and 1 (Yes) in the database. Results are shown in Table 3.
According to Table 3, just over half of the students (54.7%) reported having no prior experience with blended learning, whereas 45.3% indicated previous participation in blended learning environments.

2.2.4. Student Motivation

The impact of teaching techniques on student motivation was evaluated using several scales from the Intrinsic Motivation Inventory (IMI), an instrument validated in numerous prior studies. Four of the seven IMI scales were used, with slight adjustments. The “Interest/Enjoyment” (IMI-I) scale contained four self-report items (e.g., “I would call BL very interesting”) and measured overall interest in the material. The “Effort/Importance” (IMI-E) subscale comprised four items (e.g., “I put a lot of time into my BL course”) and measured how much effort students invested and how important the activity was to them. The “Perceived Competence” (IMI-C) subscale consisted of four self-report items (e.g., “After doing BL for a little while, I felt pretty confident”) and is treated as a positive predictor of both self-reported and behavioral intrinsic motivation. The “Value/Usefulness” (IMI-V) subscale contained four items (e.g., “I believe the things I did in BL could have some worth”) and measured the degree to which the activity was seen as valuable and relevant. Each item was rated on a five-point Likert-type scale from 1 (“strongly disagree”) to 5 (“strongly agree”).
Table 4 summarizes the students’ ratings on each IMI scale; the data indicate that the blended learning activities were motivating and strengthened students’ confidence in performing the tasks. The mean for Effort/Importance was slightly below those of the other three scales but still above the scale midpoint, indicating that students generally experienced the courses positively.

2.3. Data Analysis

For RQ1, Pearson correlations were used to analyze the relationship between how frequently students participated in the blended learning activities and whether they developed the various digital competences. Participation frequency was rated on the five-point scale described above (0 = Never to 4 = Almost always), whereas each digital competence item was coded as 0 = Not selected or 1 = Selected; the Pearson correlation between a graded and a binary variable is equivalent to the point-biserial correlation. All analyses were conducted in SPSS version 21, following standard procedures for bivariate correlations, significance testing, and output interpretation.
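The activity-by-competence correlation can be sketched outside SPSS as well. A minimal illustration with SciPy, using simulated data (not the study's) in which the binary competence flag is loosely driven by activity frequency:

```python
import numpy as np
from scipy import stats

# Illustrative data, not the study's: frequency of group-project work on the
# 0-4 scale and a 0/1 flag for the "collaboration tools" competence.
rng = np.random.default_rng(0)
frequency = rng.integers(0, 5, size=106)   # 0 = Never ... 4 = Almost always
competence = (frequency + rng.integers(0, 3, size=106) >= 4).astype(int)

# Pearson's r between a graded and a binary variable equals the
# point-biserial correlation.
r, p = stats.pearsonr(frequency, competence)
print(f"r = {r:.2f}, p = {p:.4g}")
```

Because the simulated flag depends on frequency, the resulting r is positive, mirroring the direction of the associations reported in Table 5.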
For RQ2, chi-square tests of independence were applied to investigate whether the development of digital competences differed significantly between students with and without prior blended learning experience. In addition to examining overall digital competence, each of the nine competences was analyzed individually as a separate binary outcome (selected vs. not selected), with prior experience as the grouping variable (prior experience = 1, no prior experience = 0). Expected frequencies were examined to verify the assumption of at least five observations per cell; where this assumption was violated, Fisher’s Exact Test was used instead. Cramér’s V was computed as an effect size to quantify the magnitude of group differences.
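The chi-square procedure with the Fisher fallback and Cramér's V effect size can be sketched as follows; the 2x2 counts are invented for illustration, not taken from the study:

```python
import numpy as np
from scipy import stats

# Illustrative 2x2 table (counts are invented, not the study's data):
# rows = prior BL experience (no / yes); cols = competence (not selected / selected).
table = np.array([[18, 40],
                  [28, 20]])

chi2, p, dof, expected = stats.chi2_contingency(table, correction=False)

# Fall back to Fisher's exact test when any expected count is below 5.
if (expected < 5).any():
    _, p = stats.fisher_exact(table)

# Cramér's V as an effect size; for a 2x2 table this reduces to |phi|.
n = table.sum()
v = np.sqrt(chi2 / (n * (min(table.shape) - 1)))
print(f"chi2({dof}) = {chi2:.2f}, p = {p:.4f}, Cramér's V = {v:.2f}")
```

Checking `expected` before choosing the test mirrors the assumption check described above.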
To address RQ3, both descriptive and inferential statistics were used to evaluate the relationship between students’ intrinsic motivation (IMI) and their development of digital competences. First, scores for each of the four IMI subscales (Interest/Enjoyment, Perceived Competence, Value/Usefulness, and Effort/Importance) were computed by averaging the corresponding items, reverse-coding negatively worded items beforehand. Reliability for each subscale was estimated with Cronbach’s alpha. Next, a Total Digital Competence Index was created by counting the number of competences each student selected. Pearson correlations were then used to explore the bivariate relationships between each IMI subscale and the Total Digital Competence Index. Finally, hierarchical multiple regression was used to model the Total Digital Competence Index as a function of the four IMI subscales and to identify which subscale had the greatest predictive power for digital competence development.
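The scale-construction steps above (reverse coding, Cronbach's alpha, and the competence index) can be sketched as follows; all data are simulated and the variable names are illustrative:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_variances / total_variance)

# Reverse-coding a negatively worded item on a 1-5 scale: 1<->5, 2<->4.
item = np.array([1, 2, 5, 4, 3])
reversed_item = 6 - item

# Illustrative 4-item subscale (1-5 Likert) built around a shared component
# so the items inter-correlate; the data are simulated, not the study's.
rng = np.random.default_rng(1)
latent = rng.integers(1, 6, size=(106, 1))
subscale = np.clip(latent + rng.integers(-1, 2, size=(106, 4)), 1, 5)

alpha = cronbach_alpha(subscale)

# Total Digital Competence Index: count of the 0/1 competence flags selected.
flags = rng.integers(0, 2, size=(106, 9))
index = flags.sum(axis=1)   # ranges from 0 to 9
print(f"alpha = {alpha:.2f}")
```

Because the simulated items share a latent component, alpha comes out high; with real item responses the coefficient reflects actual inter-item consistency.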
Hierarchical cluster analysis was used to address RQ4, dividing the sample into distinct groups based on students’ development of the nine digital skills listed in Q10a-Q10i. Original missing-value codes (-1, -3) were recoded to system-missing, and students with missing data were deleted listwise. The analysis was completed in SPSS 21 using the average linkage (between-groups) method and squared Euclidean distances, a combination suitable for clustering binary data in this version of the software. The cluster solution was evaluated by inspecting the dendrogram and the agglomeration schedule; the number of clusters was identified from the largest increases in the fusion coefficients and the visual separation in the dendrogram. Cluster membership was retained for subsequent group comparisons.
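The same clustering setup (average linkage on squared Euclidean distances over binary indicators) can be sketched with SciPy; the 87x9 matrix is simulated, not the study's data:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

# Illustrative binary matrix (simulated, not the study's data):
# 87 students x 9 competence indicators (0 = not selected, 1 = selected).
rng = np.random.default_rng(2)
X = rng.integers(0, 2, size=(87, 9)).astype(float)

# Squared Euclidean distances with average (between-groups) linkage,
# mirroring the SPSS settings described above.
d = pdist(X, metric="sqeuclidean")
Z = linkage(d, method="average")

# Z[:, 2] holds the fusion coefficients of the agglomeration schedule;
# a large jump between successive steps suggests the cut point.
jumps = np.diff(Z[:, 2])

# Cut the tree into three clusters, as in the reported solution.
labels = fcluster(Z, t=3, criterion="maxclust")
print(np.bincount(labels)[1:])   # cluster sizes
```

In practice the choice of three clusters would follow from inspecting `jumps` and the dendrogram rather than being fixed in advance.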

3. Results

This section presents the study’s findings in relation to each of the four research questions: (1) Correlation between online learning activities and digital competence development; (2) Comparisons among participants who have had previous blended learning experiences; (3) Relationship between intrinsic motivation and digital competence; (4) Cluster analysis of digital competence profiles.

3.1. Correlations Between Blended Learning Activities and Digital Competence Development (RQ1)

Table 5 displays the Pearson correlations between frequency of online learning activities and students’ self-reported digital competence indicators.
The findings in Table 5 suggest that interactive and collaborative learning formats have the strongest positive associations with developing digital competence. Specifically, group projects showed strong positive correlations with several competences, including collaboration tools (r = .59, p < .001), online communication (r = .61, p < .001), and using digital tools (r = .31, p < .001). Likewise, frequent quizzes were positively correlated with using digital tools (r = .36, p < .001), collaboration tools (r = .28, p < .01), and technology for learning (r = .25, p < .01), indicating that structured engagement and feedback enhance students’ ability to apply their digital skills. The two immersive formats (VR/AR and interactive video) showed mixed associations with digital competence. While both were positively correlated with using digital tools (r = .31, p < .001; r = .28, p < .01, respectively), they showed small or negative correlations with both online safety (r = -.18, p < .05; r = -.12, ns) and content creation (r = -.21, p < .01; r = -.20, p < .05), indicating that novelty alone is not sufficient to produce competence. Traditional or passive formats (live sessions and pre-recorded videos) displayed low or non-significant associations across the competence areas. Overall, the results indicate that active, task-based formats, particularly those offering collaboration (e.g., group projects) and assessment (e.g., quizzes), are most strongly associated with students’ digital competence across a variety of areas.

3.2. Differences in Competence Development by Prior Blended Learning Experience (RQ2)

To assess whether prior exposure to blended learning is related to students’ self-assessed digital competence, Table 6 presents the percentage of students reporting confidence in each digital skill, comparing those with and without prior blended learning experience.
The results show consistent and often significant differences between the two groups. Students without prior blended learning experience reported higher confidence in several basic and tool-related competences, such as using digital tools and software (82.7% vs. 54.3%, χ² = 6.91, p < .01), understanding online safety and privacy, using online collaboration tools, and troubleshooting computer problems. Conversely, students with blended learning experience scored higher on more advanced, creative, or learning-oriented competences—particularly creating and publishing digital content (57.1% vs. 30.8%) and using technology for learning and research (70.3% vs. 45.3%).
Although several of the χ² values indicate statistically significant differences, the overall pattern suggests that previous blended learning may encourage deeper, learner-centered uses of technology, while those without such experience appear more confident in routine technical operations.

3.3. Relationship Between Intrinsic Motivation and Perceived Digital Competence Growth (RQ3)

To examine the relationship between students’ overall digital competence and their intrinsic motivation components, Pearson correlation analyses were conducted between the total Digital Competence Index and each intrinsic motivation subscale (Table 7).
Table 7 presents the associations between the Digital Competence Index and the intrinsic motivation subscales. Results indicated strong positive associations between students’ digital competence and several motivational components. Value/Usefulness showed the highest correlation with overall competence (r = .76, p < .001), indicating that students who see digital skills as useful or valuable tend to report higher competence. Perceived Competence showed a strong positive association with the competence index (r = .59, p < .001), indicating a relationship between students’ self-efficacy and their reported digital competence. Moderate but statistically significant associations were found for Effort/Importance (r = .53, p < .001) and Interest/Enjoyment (r = .30, p < .001), suggesting that motivation to invest time and attention in digital activities accompanies higher competence. The motivation subscales were also highly inter-correlated (notably Perceived Competence and Interest/Enjoyment, r = .70, p < .001), suggesting that students who are confident in their ability to learn digitally also enjoy the learning process. Collectively, the results point to a reciprocal relationship between competence and motivation: students who enjoy, value, and believe they can use digital skills are substantially more likely to demonstrate greater digital proficiency.

3.4. Student Clusters Based on Digital Competence Profiles and Related Course Factors (RQ4)

The hierarchical cluster analysis identified three student categories that differed in their demographic, educational, and experiential attributes. Cross-tabulation tests explored whether cluster membership was associated with other categorical variables: gender, technical subject area, study level (year), and previous blended learning experience. The distribution of students across these variables is shown in Table 8.
There was no significant association between gender and cluster membership, χ²(2, N = 87) = 0.09, p = .954, indicating that the gender distribution was consistent across clusters; gender was therefore excluded as an interpretive label for cluster membership. However, a significant relationship was identified between cluster membership and students’ technical discipline, χ²(4, N = 87) = 19.18, p < .001, Cramér’s V = .33: students in mechanical and mechatronics disciplines were over-represented in Cluster 3 (90%), while students in electrical and computer disciplines were more represented in Clusters 1 and 2. Study year showed no significant association with cluster membership, χ²(2, N = 87) = 1.12, p = .572, suggesting that students at different stages of their programs were distributed relatively evenly. In contrast, prior blended learning experience differed significantly by cluster, χ²(2, N = 87) = 11.65, p = .003, Cramér’s V = .37: 90% of Cluster 3 students reported some form of prior blended learning experience, compared with approximately one-third of Cluster 1 and Cluster 2 students.
Cluster names were determined through a synthesis of students’ academic progress, technical area of study, and prior blended learning engagement, interpreted as indicators of blended learning competence. Cluster 1, termed Advancing Practitioners, primarily consisted of upper-level students with mid-range technical specialization and limited prior blended learning experience, indicating developing competence. Cluster 2, the largest and most diverse cluster, was labeled Emerging Practitioners; it consisted primarily of first-year students with minimal exposure to blended learning, indicating foundational competence. Cluster 3, comprising predominantly upper-level mechanical/mechatronics students with extensive prior blended learning experience, was labeled Experienced (High Competence) Practitioners.
Collectively, the above-mentioned clusters provide evidence of a continuum of development in blended learning competence from emerging (Cluster 2) to advancing (Cluster 1) to experienced (Cluster 3). The differences observed across the clusters are in terms of educational maturity, technical specialization, and prior blended learning participation rather than demographic differences.
To gain insight into how different participant groups integrate the use of individual instruction techniques within the context of the blended learning course, the frequencies of various teaching approaches utilized across three competence-based clusters were examined using ANOVA (see Table 9).
As shown in Table 9, the three competence-based clusters reported broadly similar frequencies for most teaching methods. Most mean scores range from 1.4 to 2.6 on the 0-4 scale, implying that the strategies were used only occasionally. Several items showed no statistically significant differences between clusters, suggesting that those activities were used at similar frequencies across the groups.
Greater variability emerged for other strategies. Significant effects were found for live sessions (F = 2.91, p < .05), expert talks (F = 3.15, p < .05), and group or online projects (F = 4.13, p < .01). These results indicate that participants classified as high-competence practitioners tended to encounter more interactive and collaborative forms of instruction than the other two groups. Although the differences are small, this may point to an evolving shift from teacher-centered delivery toward more participative instruction as students move up the competence scale.

4. Discussion

The current study assessed how various blended learning activities (e.g., quizzes, group projects, VR/AR, interactive video), prior experience with blended learning formats, and students’ intrinsic motivation relate to students’ digital competence, and whether distinct competence profiles correspond to different instructional approaches. In doing so, it addressed a significant gap in the literature by relating specific pedagogical activities to multivariate indicators of digital competence within a single cohort, rather than treating “use of technology” as a generic category. Additionally, by combining correlational data, group comparisons, and cluster analysis, the study provided a more detailed description of how participatory design and motivational factors interact to influence competence development.

4.1. Participatory Activities and Competence Development

Participatory activities generally align with overall competence development. Designs that promote collaborative and task-based interaction (notably group projects and frequent low-stakes quizzes) show the greatest number of positive correlations with multiple competence indicators (collaboration tools, communication, technology for learning). These findings are supported by meta-analytic evidence showing that both active and collaborative learning produce improved performance and transferable skills in higher education [14,15] and by research on the “testing effect,” which demonstrates that frequent practice and feedback enhance applied knowledge and strategic tool usage [16]. These studies collectively suggest that digital competences are best developed when students are required to co-create products, coordinate tasks, and iteratively access and utilize information, conditions that are inherent to well-designed group work and formative assessments.

4.2. Impact of Immersive Technologies on Competence

Immersive technologies such as VR, AR, and interactive video show positive relationships with digital tool use but do not consistently benefit all aspects of learning: neither online safety nor content creation exhibited significant positive associations, and prior research warns that highly immersive media may increase cognitive load or divert focus from metacognitive regulation and transfer [17,18]. However, educational applications of augmented reality have been found to significantly enhance learners’ motivation, confidence, attention, creativity, and satisfaction by offering immersive and interactive experiences, which not only increase engagement but also support the development of digital competences and self-directed learning [19]. These results suggest that while novel or interactive media can rapidly boost tool fluency, explicit scaffolding, such as guidance for responsible use and structured authoring tasks, is essential to translate novelty into meaningful competence development.

4.3. Effect of Prior Blended Learning Experience

Prior blended learning experience distinguishes “routine” from “developmental” competences. Students without prior blended learning experience reported significantly higher confidence in routine, tool-based operational competences (e.g., basic software use, troubleshooting), whereas students with prior experience reported higher levels of advanced, learning-oriented competences (e.g., content creation, using technology for learning/research). These patterns are consistent with models that differentiate technical/operational skills from the strategic and creative aspects of digital competence [20,21]. They also support evidence that authentic, repeated integration of technology into course assignments produces a transition from procedural know-how to goal-directed, disciplinary use [22]. The present findings further suggest that prior blended learning experience acts as a developmental lever, encouraging students to move beyond “tool handling” towards generative application.
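The group contrasts underlying this pattern (Table 6) are chi-square tests of independence. As an illustrative sketch, the first row of Table 6 (“Using digital tools and software”: 54.3% of 35 students with prior experience vs. 82.7% of 52 without) can be reconstructed as a 2×2 contingency table; the cell counts below are back-calculated from the reported percentages and are therefore an approximation of the raw data, not the authors’ dataset.

```python
# Reconstructing the chi-square test for the first row of Table 6.
# Counts are back-calculated from reported percentages (approximate):
# 54.3% of n = 35 (prior experience) -> 19; 82.7% of n = 52 (no prior) -> 43.
from scipy.stats import chi2_contingency

#            reported competence   not reported
observed = [[19, 16],   # prior blended learning experience (n = 35)
            [43,  9]]   # no prior experience (n = 52)

# scipy applies Yates' continuity correction to 2x2 tables by default
chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")  # chi2 = 6.91, as in Table 6
```

With these reconstructed counts, the Yates-corrected statistic matches the χ² = 6.91 reported in Table 6, suggesting the published tests used the continuity correction.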

4.4. Interdependence of Motivation and Competence

Motivation and competence appear to be interdependent. The Digital Competence Index correlated most strongly with effort/importance, followed by interest/enjoyment, perceived competence, and perceived value/usefulness (Table 7). According to Self-Determination Theory, perceived value (identified regulation) and efficacy beliefs interact to predict long-term engagement and performance [23]. Meta-analytic research has consistently linked intrinsic and identified motivation to performance and persistence in cognitively demanding tasks [24]. Collectively, the present pattern suggests a mutually reinforcing cycle: as students come to see the utility of digital skills and to view themselves as competent in applying them, they invest more effort and derive more interest, which further strengthens their competence. Instructionally, this points to the value of combining value-affirming activities of direct relevance with structured success experiences.
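The correlations in Table 7 are standard Pearson coefficients between a summed competence index and IMI subscale means. A minimal sketch of this computation is shown below; the data are synthetic and the variable names (`competence_index`, `effort`) are illustrative stand-ins, not the study’s measures.

```python
# Pearson correlation between a digital-competence index and a
# motivation subscale. Synthetic data; names are illustrative only.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(7)
effort = rng.normal(3.5, 0.8, size=94)                    # e.g., effort/importance scores
competence_index = 0.5 * effort + rng.normal(0, 0.7, 94)  # correlated by construction

r, p = pearsonr(competence_index, effort)
print(f"r = {r:.2f}, p = {p:.4f}")
```

Because the synthetic index is built from the subscale plus noise, the sketch recovers a positive correlation by construction; on real data, the coefficient and its significance would of course depend on the measures themselves.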

4.5. Competence Clusters and Interactive Teaching Approaches

Competence clusters correlate (to some degree) with more interactive teaching approaches. The three competence clusters—emerging, advancing, experienced—most clearly differed in terms of prior blended learning experience and disciplinary background; experienced practitioners reported slightly greater use of interactive/collaborative approaches (group projects). Although effects were weak, they are directionally consistent with research linking stronger self-regulation and technology-supported collaboration to enhanced engagement in blended learning environments [25]. Collectively, the present findings reinforce a developmental continuum: as competence and prior experience grow, learners tend toward or profit more from participatory designs.
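The three-group solution discussed above can be illustrated with a standard partitioning approach. The article does not restate its clustering algorithm in this section, so the choice of k-means, the synthetic data, and all names below are assumptions for demonstration only, not the authors’ procedure.

```python
# Illustrative sketch of a three-cluster solution on standardized
# competence indicators. k-means, the synthetic data, and all names
# here are assumptions for demonstration, not the study's method.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)
# Stand-in for 87 students x 9 binary competence indicators (cf. Table 2 items)
competences = rng.integers(0, 2, size=(87, 9)).astype(float)

X = StandardScaler().fit_transform(competences)
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
labels = kmeans.labels_

print(np.bincount(labels))  # cluster sizes (synthetic data, not the paper's)
```

On real data, the resulting labels could then be cross-tabulated against prior blended learning experience and technical field, as in Table 8.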
The main purpose of this study was to triangulate multiple analytic lenses (correlations, group comparisons, clustering) within a coherent framework, enabling inferences about activity–competence alignment and motivational covariation. It operationalizes digital competence across several domains rather than relying on a single proxy, improving construct coverage. With these advantages in mind, the study also has some limitations. First, the cross-sectional, self-report design precludes causal claims and may inflate activity–competence associations due to common-method variance. Second, several cells—particularly in cluster-by-discipline cross-tabs—were small, limiting precision and generalizability. Finally, the single-cohort, domain-specific context may limit external validity to other institutions or disciplines.

4.6. Implications and Future Directions

The results of this study suggest that, when designing learning experiences with digital tools and multimedia, instructors should prioritize two design elements to strengthen students’ overall digital competences. The first is structured collaboration that clearly defines each participant’s role in the collaborative process and the artifacts (i.e., materials) they will work with. The second is frequent opportunities for feedback on performance, which help students develop both the technical skills needed to operate digital tools and broader digital competences (e.g., problem solving and critical thinking). For example, when students interact through virtual reality (VR)/augmented reality (AR) or video, the instructor should provide additional scaffolding on how to act safely and ethically within those environments. Instructors should also teach students to create digital content by adding creation-oriented assignment types and rubrics that guide reflection, extending the benefits beyond basic tool operation. Finally, instructors should consider integrating value-affirming frames into instruction and feedback to strengthen students’ sense of efficacy and reinforce the motivation-competence spiral.
Future research should use longitudinal or experimental designs to investigate causal relationships between specific instructional design features (e.g., the interdependence structure of group work, the schedule of retrieval practice) and distinct competence domains. It should also examine mediators of these relationships (e.g., perceived value, self-efficacy) and potential contextual moderators (e.g., discipline, year of study). Fine-grained learning analytics (e.g., trace data from Learning Management Systems (LMS) or video platforms) would supplement self-report measures with empirical evidence and help identify the specific mechanisms (e.g., quality of collaboration, timing of feedback) through which instructional designs influence the development of digital competences.

5. Conclusions

This research adds to current understanding in several ways: participatory, feedback-rich activities show the strongest associations with multiple dimensions of digital competence; prior blended learning experience is associated with more developmentally oriented, learning-focused dimensions of digital competence; motivation and competence are closely intertwined; and students with higher levels of competence are more likely to engage in collaborative pedagogy. These findings allow researchers and learning designers to derive more refined design principles for blended courses and show that coupling active learning with motivational scaffolds can accelerate learners’ progression from operational fluency to generative, discipline-specific digital practices.

Author Contributions

Conceptualization, D.M.R. and M.R.; methodology, D.M.R. and M.R.; validation, D.M.R. and M.R.; formal analysis, M.R.; investigation, D.M.R. and M.R.; resources, D.M.R. and M.R.; data curation, M.R.; writing—original draft preparation, D.M.R.; writing—review and editing, D.M.R. and M.R.; visualization, D.M.R. and M.R.; supervision, D.M.R.; project administration, D.M.R. and M.R.; funding acquisition, D.M.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by EEA Grants, grant number ATP213.

Institutional Review Board Statement

Ethical approval was not required, as educational opinion surveys in Slovenia do not fall under ethics committee review. Nonetheless, we upheld the highest ethical standards: data were collected anonymously, participation was voluntary, and responses remained confidential.

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

Data will be made available upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. T. Consoli, M.-L. Schmitz, C. Antonietti, P. Gonon, A. Cattaneo, and D. Petko, “Quality of technology integration matters: Positive associations with students’ behavioral engagement and digital competencies for learning,” Educ. Inf. Technol., vol. 30, no. 6, pp. 7719–7752, Apr. 2025. [CrossRef]
  2. O. Arranz-García and V. A. Secades, “Development of Project-Based Learning (PBL) in Blended Learning Mode for the Acquisition of Digital Competence,” in Proceedings of the International Conference on e-Learning 2019, IADIS Press, July 2019, pp. 291–295. [CrossRef]
  3. J. Buhl-Wiggers, A. Kjærgaard, and K. Munk, “A scoping review of experimental evidence on face-to-face components of blended learning in higher education,” Stud. High. Educ., vol. 48, no. 1, pp. 151–173, 2022. [CrossRef]
  4. C. I. Martínez-Alcalá et al., “Digital Inclusion in Older Adults: A Comparison Between Face-to-Face and Blended Digital Literacy Workshops,” Front. ICT, vol. 5, Aug. 2018. [CrossRef]
  5. A. Sánchez et al., “Development of digital competence for research,” Appl. Syst. Innov., vol. 5, no. 4, p. 77, 2022. [CrossRef]
  6. D. Keržič, A. Aristovnik, N. Tomaževič, and L. Umek, “Evaluating the impact of e-learning on students’ perception of acquired competencies in a university blended learning environment,” J. E-Learn. Knowl. Soc., vol. 14, no. 3, pp. 65–76, 2018. [CrossRef]
  7. L. Feng, L. He, and J. Ding, “The Association between Perceived Teacher Support, Students’ ICT Self-Efficacy, and Online English Academic Engagement in the Blended Learning Context,” Sustainability, vol. 15, no. 8, p. 6839, 2023. [CrossRef]
  8. D. Marsh and C. Johnson, “The Laureate English Program Taking a Research Informed Approach to Blended Learning,” High. Learn. Res. Commun., vol. 3, no. 1, pp. 45–55, 2013. [CrossRef]
  9. B. Garcia-Ortega and J. Galan-Cubillo, “How to improve students’ experience in blending learning? Evidence from the perceptions of students in a Postgraduate Master’s Degree,” WPOM-Work. Pap. Oper. Manag., vol. 12, no. 2, pp. 1–15, June 2021. [CrossRef]
  10. A. Posekany, G. Nöhrer, D. Haselberger, and F. Kayali, “Analyzing Students’ Motivation for Acquiring Digital Competences,” in 2023 IEEE Frontiers in Education Conference (FIE), IEEE, Oct. 2023, pp. 1–9. [CrossRef]
  11. S. M. Amin et al., “Nursing education in the digital era: the role of digital competence in enhancing academic motivation and lifelong learning among nursing students,” BMC Nurs., vol. 24, no. 1, May 2025. [CrossRef]
  12. H. Altinpulluk, H. Kilinc, G. Alptekin, Y. Yildirim, and O. Yumurtaci, “Self-Directed Learning and Intrinsic Motivation Levels in MOOCs,” Open Prax., vol. 15, no. 2, pp. 149–161, 2023. [CrossRef]
  13. S. R. Pratiwi, A. Malik, and T. S. Hadi, “The Influence of Digital Communication Competence on Student Motivation,” Lekt. J. Ilmu Komun., vol. 8, no. 2, pp. 1–12, 2025. [CrossRef]
  14. S. Freeman et al., “Active learning increases student performance in science, engineering, and mathematics,” Proc. Natl. Acad. Sci. U. S. A., vol. 111, no. 23, 2014, doi: 10/gctkrm.
  15. L. Springer, M. E. Stanne, and S. S. Donovan, “Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: A meta-analysis,” Rev. Educ. Res., vol. 69, no. 1, pp. 21–51, 1999. [CrossRef]
  16. H. L. Roediger and J. D. Karpicke, “Test-Enhanced Learning: Taking Memory Tests Improves Long-Term Retention,” Psychol. Sci., vol. 17, no. 3, pp. 249–255, Mar. 2006. [CrossRef]
  17. G. Makransky and L. Lilleholt, “A structural equation modeling investigation of the emotional value of immersive virtual reality in education,” Educ. Technol. Res. Dev., vol. 66, no. 5, pp. 1141–1164, 2018. [CrossRef]
  18. J. Radianti, T. A. Majchrzak, J. Fromm, and I. Wohlgenannt, “A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda,” Comput. Educ., vol. 147, p. 103778, Apr. 2020. [CrossRef]
  19. M. Z. Iqbal, E. Mangina, and A. G. Campbell, “Current Challenges and Future Research Directions in Augmented Reality for Education,” Multimodal Technol. Interact., vol. 6, no. 9, p. 75, Sept. 2022. [CrossRef]
  20. L. Ilomäki, S. Paavola, M. Lakkala, and A. Kantosalo, “Digital competence—An emergent boundary concept for policy and educational research,” Educ. Inf. Technol., vol. 21, no. 3, pp. 655–679, 2016. [CrossRef]
  21. E. van Laar, A. J. A. M. van Deursen, J. A. G. M. van Dijk, and J. de Haan, “The relation between 21st-century skills and digital skills: A systematic literature review,” Comput. Hum. Behav., vol. 72, pp. 577–588, July 2017. [CrossRef]
  22. J. Tondeur, K. Aesaert, B. Pynoo, J. van Braak, N. Fraeyman, and O. Erstad, “Developing a validated instrument to measure preservice teachers’ ICT competencies: Meeting the demands of the 21st century,” Br. J. Educ. Technol., vol. 48, no. 2, pp. 462–472, 2017, doi: 10/f9tcj8.
  23. R. M. Ryan and E. L. Deci, “Self-determination theory and the facilitation of intrinsic motivation, social development, and well-being,” Am. Psychol., vol. 55, no. 1, pp. 68–78, 2000. [CrossRef]
  24. C. P. Cerasoli, J. M. Nicklin, and M. T. Ford, “Intrinsic motivation and extrinsic incentives jointly predict performance: A 40-year meta-analysis,” Psychol. Bull., vol. 140, no. 4, pp. 980–1008, 2014. [CrossRef]
  25. J. Broadbent and W. L. Poon, “Self-regulated learning strategies & academic achievement in online higher education learning environments: A systematic review,” Internet High. Educ., vol. 27, pp. 1–13, Oct. 2015. [CrossRef]
Table 1. Descriptive statistics for the scale of teaching approaches.
Teaching Approaches N M SD
Live online sessions (e.g., Zoom, Teams) 101 2.50 1.18
Pre-recorded teacher presentations or videos 100 2.17 1.21
Expert presentations (live or recorded) 102 2.08 1.21
Online group projects or presentations 101 2.05 1.13
Small group interactions or breakout rooms 100 2.03 1.23
Frequent quizzes or short assignments 100 1.98 1.23
VR or AR technology activities 100 1.83 1.22
Interactive video activities 102 1.62 1.24
Online vocational skill-building coursework 100 1.44 1.25
Online instructor feedback and guidance 100 1.21 1.31
Note: Min. = 0; Max. = 4.
Table 2. Digital Competence Areas Improved Through Blended Learning (N = 87).
Digital Competence N %
Using digital tools and software (e.g., Office, Google Docs) 62 71.3
Understanding online safety and privacy 39 44.8
Managing digital files and folders 55 63.2
Using online collaboration tools (e.g., Google Drive, Teams) 57 65.5
Communicating effectively online (e.g., email, chat) 59 67.8
Creating and publishing digital content (e.g., blogging, videos) 36 41.4
Using technology for learning and research 50 55.6
Understanding digital citizenship and ethics 35 40.2
Troubleshooting common computer problems 51 58.6
Table 3. Prior Experience with Blended Learning.
Prior Experience with Blended Learning N %
Yes 48 45.3
No 58 54.7
Total 106 100
Table 4. Descriptive statistics for the student’s motivation scale.
Perceived Motivation (Scales) N M SD
Live online sessions (e.g., Zoom, Teams) 94 3.66 0.85
Pre-recorded teacher presentations or videos 94 3.44 0.81
Expert presentations (live or recorded) 94 3.66 0.82
Online group projects or presentations 94 3.59 0.88
Note: Min. = 1; Max. = 5.
Table 5. Pearson Correlations Between Online Learning Activities and Digital Competence Indicators.
Online Learning Activity 1 2 3 4 5 6 7 8 9
Live sessions .14 .00 .11 .18* .02 .02 -.04 .15 .00
Pre-recorded videos .00 .01 .01 -.07 -.05 .06 -.08 -.04 -.12
Expert talks .02 .15 -.07 .00 -.12 .09 -.01 .16 .12
Group projects .31*** .15 .02 .59*** .61*** .08 .09 .20* .06
Small-group interaction .06 -.16 -.03 .11 -.08 -.16 .03 .00 .00
Frequent quizzes .36*** -.05 .20* .28** .10 .12 .25** .18 .02
VR/AR activities .31*** -.18* -.08 .10 -.04 -.07 -.20* .17 .02
Interactive video .28** -.12 -.10 .02 .01 -.21** -.20* .15 .11
Challenging coursework -.10 -.02 .08 -.10 -.05 .30** .13 .12 -.08
Instructor feedback -.02 -.23** .07 -.05 .01 .05 .06 -.04 -.12
Note. N ranged from 86 to 90 across correlations. *p < .05; **p < .01; ***p < .001. 1 - Using digital tools; 2 - Online safety; 3 - File management; 4 - Collaboration tools; 5 - Online communication; 6 - Digital content creation; 7 - Tech for learning; 8 - Digital citizenship; 9 - Troubleshooting.
Table 6. Percentage of Students Reporting Digital Competence by Prior Blended Learning Experience.
Digital Competence Indicator Yes (n = 35) No (n = 52) χ²
Using digital tools and software 54.30% 82.70% 6.91**
Understanding online safety and privacy 31.40% 53.80% 3.39*
Managing digital files and folders 54.30% 69.20% 1.42
Using online collaboration tools 51.40% 75.00% 4.15*
Communicating effectively online 77.10% 61.50% 1.67
Creating and publishing digital content 57.10% 30.80% 4.96*
Using technology for learning and research 70.30% 45.30% 4.54*
Understanding digital citizenship and ethics 22.90% 51.90% 6.19**
Troubleshooting common computer problems 40.00% 71.20% 7.14**
Note. Percentages reflect the proportion of students who selected each digital competence item within each group. Chi-square tests assess group differences. Valid N ranged from 87 to 90. *p < .05; **p < .01; ***p < .001.
Table 7. Pearson Correlations Between Total Digital Competence Index and Intrinsic Motivation Subscales.
Variable 1 2 3 4
1. Index of Competences 1
2. Interest/Enjoyment .30*** 1
3. Effort/Importance .44*** .53*** 1
4. Perceived Competence .27** .70*** .59*** 1
5. Value/Usefulness .23** .78*** .47*** .76***
Note. *p < .05; **p < .01; ***p < .001. Sample sizes ranged from N = 90 to N = 94.
Table 8. Distribution of Students Across Clusters by Key Variables (N = 87).
Student Characteristics Cluster 1: Advancing Practitioners Cluster 2: Emerging Practitioners Cluster 3: Experienced Practitioners
Gender
Male 21 (91.3%) 50 (92.6%) 9 (90.0%)
Female 2 (8.7%) 4 (7.4%) 1 (10.0%)
Technical Field
Mechanical & Mechatronic 4 (17.4%) 14 (25.9%) 9 (90.0%)
Electrical & Computing 17 (73.9%) 34 (63.0%) 1 (10.0%)
Creative & Communication Tech. 2 (8.7%) 6 (11.1%) 0 (0.0%)
Study Year
1–2 years 8 (34.8%) 24 (44.4%) 3 (30.0%)
3–4 years 15 (65.2%) 30 (55.6%) 7 (70.0%)
Prior Experience
Yes 8 (34.8%) 18 (33.3%) 9 (90.0%)
No 15 (65.2%) 36 (66.7%) 1 (10.0%)
Table 9. Descriptive Statistics and ANOVA Results for Teaching Approaches by Competence-Based Cluster (N = 87).
Teaching approach Cluster 1: Advancing Practitioners M (SD) Cluster 2: Emerging Practitioners M (SD) Cluster 3: Experienced Practitioners M (SD) F(2, 84)
Live sessions (Zoom, Teams) 1.65 (1.23) 1.81 (1.21) 1.60 (1.58) 0.21
Teacher video lectures 2.22 (1.24) 2.15 (1.12) 2.40 (1.27) 0.20
Expert talks 1.61 (1.16) 1.43 (1.20) 1.40 (1.58) 0.18
Group or online projects 2.00 (1.09) 2.43 (0.94) 1.80 (0.92) 2.67
Breakout room interaction 2.17 (1.19) 2.13 (1.12) 1.60 (1.58) 0.93
Live sessions (Zoom, Teams) 2.36 (1.29) 2.98 (0.92) 3.00 (1.05) 2.91*
Teacher video lectures 1.74 (1.42) 2.04 (1.49) 1.00 (1.33) 2.21
Expert talks 2.43 (1.34) 2.30 (1.09) 1.40 (0.84) 3.15*
Group or online projects 1.78 (1.31) 2.19 (1.08) 3.00 (0.82) 4.13**
Breakout room interaction 2.57 (1.08) 2.63 (1.03) 3.00 (1.33) 0.60
Note. Scores range 0–4; higher values = more frequent use of each teaching approach. *p < .05; **p < .01; ***p < .001.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.