Preprint

This version is not peer-reviewed.

Primary Teachers’ Use of AI: Practices, Professional Learning, and Integration Barriers

Submitted:

24 November 2025

Posted:

25 November 2025


Abstract
This qualitative case study investigated primary school educators’ use of artificial intelligence (AI) in their teaching and professional practices. It offers a unique perspective on grassroots experimentation, teacher barriers, and ethical dilemmas in a context of minimal policy direction. Fifteen educators participated through semi-structured interviews, concurrent think-aloud protocols, and a review of lesson plans. Data were interpreted through an expanded Technology Acceptance Model incorporating elements of the Unified Theory of Acceptance and Use of Technology (UTAUT) and the AI-TPACK framework to account for individual, institutional, and contextual factors. Findings indicate that educators employed AI tools for lesson planning, content differentiation, assessment, interactive media development, and self-directed professional learning. However, adoption was constrained by several barriers, including inadequate infrastructure, fragmented and non-experiential training, ethical ambiguity, and a lack of school-based technical and pedagogical support. Educators were further constrained by time pressures, cultural resistance, and uneven leadership, with most relying on personal initiative rather than systemic support. The Concerns-Based Adoption Model (CBAM) was not part of the a priori research design but proved a suitable framework for describing how educators’ use of AI evolves over time. The study’s results are summarised in a five-pillar framework for responsible and sustainable AI integration, comprising systemic governance, equitable access, ongoing embedded training, curriculum and pedagogy alignment, and stakeholder engagement. Although grounded in the Cypriot context, the findings are relevant to education systems around the world seeking to incorporate AI meaningfully into teaching and learning.
Subject: 
Social Sciences  -   Education

1. Introduction

Education is experiencing a rapid technological revolution in teaching, technology, and administrative work, one that offers new opportunities for innovation and improved learning. At the same time, this transformation intensifies educators’ workloads, decreases job satisfaction, and increases techno-stress (Gabbiadini et al., 2023; Molino et al., 2020).
An important component in this progression is Artificial Intelligence (AI) as it offers automation, more personalized learning, and more inclusive learning experiences (Chen et al., 2020; Lu et al., 2021). Educators all over the world are experimenting with AI in different areas, including assessment, lesson planning, content generation, and feedback. Nevertheless, meaningful integration remains limited due to educators’ training gaps, ethical considerations, inadequate infrastructure, and a lack of institutional support (Zawacki-Richter et al., 2019; Holmes et al., 2022).
European initiatives, such as the Digital Education Action Plan 2021-2027 and the Digital Decade 2030, outline objectives related to AI literacy, digital inclusion, and ethical frameworks (European Commission, 2023). Implementation varies across nations, however: digitally advanced countries such as Estonia and Finland have made substantial progress in integrating AI into education, while others, including Cyprus, remain in the early stages of adoption. In this context, Cyprus can be considered an illustrative and revealing case study. Although a National AI Strategy was adopted in 2020 (Deputy Ministry of Research, Innovation and Digital Policy, Cyprus), a dedicated advisory task force was established only in January 2025, indicating a notable gap between strategy and the capacity to implement it. Moreover, the curriculum has not been updated to include AI, its incorporation into teaching materials remains minimal, and teacher training has been ad hoc and inconsistent.
Nevertheless, the establishment of a national Advisory Committee on AI in Education (Republic of Cyprus Governance, 2025) suggests increasing interest at the policy level and makes the need for guidance and support systems urgent. Recent legislation, however, paints a mixed picture of the Ministry’s objectives. For instance, the “Regulations for the Operation of Public Secondary Schools (Amendment) 2024” (R.A.A. 168/2024), issued on 1 November 2024, bans the use of mobile phones in public schools. Under the new regulation, students are required to switch off their devices and keep them in a secure location, except in cases of emergency or explicit educational use (Republic of Cyprus, 2024). Similarly, distance learning is permitted only in extraordinary circumstances, such as pandemics or natural disasters. In sum, no legislation supports the regular use of AI and other advanced digital tools in classroom practice, reflecting an urgent need for legislation that addresses the ethical use of AI, teacher training, infrastructure, and data privacy.
The national primary school curriculum includes a dedicated “Digital Competence” unit within the “New Technologies” module for Grades E’ and ST’, but its content is limited, pedagogical guidance is minimal, and resources are scarce. A recent curriculum update (Ministry of Education, 2024) focuses primarily on the use of technological materials rather than the development of digital competence. Consequently, even in Grades E’ and ST’, meaningful reinforcement of digital competence has not been achieved in practice, while the lower grades (A’-D’) have no structured digital competence indicators at all.
Empirical evidence from Kasinidou et al. (2025) indicates that Cypriot teachers demonstrate intermediate levels of digital skills, moderate levels of AI literacy, and generally neutral attitudes toward AI, suggesting that policy changes alone are insufficient without deeper training, updated pedagogical approaches, adequate resources, and effective implementation. Additionally, Karousiou’s (2025) qualitative study with Cypriot school leaders shows that infrastructure, professional development, curriculum, and bureaucratic inertia continue to slow the digital transformation.
This paper therefore examines the current state of AI implementation in primary school classrooms in Cyprus. Although set in the Cypriot context, its findings reflect broader concerns about outdated policy frameworks, educators’ grassroots innovation, and the challenges of integrating AI in education. The study provides empirical evidence from an under-researched context, contributing to the literature on AI integration in primary education, and offers insights for policy, teacher professional learning, and curriculum reform.

2. Purpose of the Study

This study was initiated in November 2024 and seeks to map the current use of Artificial Intelligence (AI) tools by primary school educators in Cyprus, offering insights into educators’ practices, perceptions, and challenges prior to the national rollout of an AI strategy.
It addresses the following research questions:
1. How do educators use AI tools in their professional and teaching practices?
2. What factors influence the adoption of AI tools by educators?
3. What impact does the use of AI tools have on their professional and teaching practices?

3. Methodology

This study employed a qualitative case study design to investigate the integration of AI tools into Cypriot primary school educators’ professional and teaching practices. This design was chosen for its suitability in capturing complex, context-sensitive experiences within environments where technological readiness varies significantly.
Purposive and convenience sampling strategies were utilized for participant recruitment. Initially, 24 educators expressed interest and enrolled in the study, but only 15 completed both the training sessions and classroom implementation phases. Participants represented a range of teaching experience, levels of digital literacy, and geographic locations, providing a diverse set of perspectives.
Data were collected through three main instruments:
  • One pre-use interview with each participant, exploring initial perceptions, expectations, and concerns regarding AI tools, and one post-use interview gathering experiences, reflections, and any challenges encountered.
  • Concurrent think-aloud sessions during educators’ use of AI tools for lesson preparation, capturing participants’ reflections and decision-making processes.
  • One lesson plan from each educator, providing tangible evidence of pedagogical application, since participants designed and implemented AI-supported lessons.
All data collection was completed by April 2025.

Data Analysis

Data were analyzed using a thematic content analysis approach, combining inductive coding with categorization informed by an expanded Technology Acceptance Model (TAM) framework. The process followed Bengtsson’s (2016) four-step framework: decontextualization, recontextualization, categorization, and compilation.
Initial open coding was conducted manually by the lead researcher using printed transcripts and digital spreadsheets of all collected data, which were read and reread carefully to identify emerging ideas and patterns. Codes were systematically recorded, organized, and compared across participants and data types. The coding process followed a structured sequence of three interrelated techniques: open coding, axial coding, and selective coding (Strauss, 1987; Strauss & Corbin, 1990). To complement the qualitative analysis, separate tables were created to record the number of segments corresponding to each coding category emerging from the data. This quantitative representation was considered important because, in combination with the principle of iteration (Spiggle, 1994), it enhanced the transparency and reliability of the analysis. It also highlighted the relative significance of the categories, enabled comparisons across data, and contributed to triangulation by combining qualitative interpretation with quantitative evidence. Themes were developed through repeated comparison, constant refinement, and grouping of semantically related codes, combining inductive insights with theory-informed categorization grounded in the expanded Technology Acceptance Model (TAM) framework. The same approach was applied systematically to each of the three research questions.
To enhance the credibility of the findings, participants were invited to review and reflect on their own interview transcripts and concurrent think-aloud protocols. This member checking process (Lincoln & Guba, 1985; Birt et al., 2016) allowed them to verify the accuracy of the data and provide clarifications where necessary, thereby ensuring that the emerging interpretations authentically represented their experiences. Additionally, peer debriefing with colleagues was undertaken at key stages of the analysis to enable critical reflection. To promote dependability, an audit trail of all research activities, including interviews, think-alouds, and classroom implementation, was documented with dates to reflect the scale and duration of fieldwork and to help refine the developing themes.
Finally, triangulation across multiple data sources was employed to support the robustness of the findings and reinforce their dependability, confirmability, and transferability, contributing to a transparent and rigorous qualitative inquiry. The combination of interviews, concurrent think-aloud sessions, and lesson plans enhanced the credibility, richness, and depth of the analysis (Denzin, 1978; Carter et al., 2014).

Theoretical Framework Integration

This study is based on the Technology Acceptance Model (TAM) (Davis, 1989). To capture broader organizational and contextual factors, TAM was extended with constructs from the Unified Theory of Acceptance and Use of Technology (UTAUT), specifically facilitating conditions, social influence, and leadership style. A pedagogical dimension was added from the AI-TPACK framework (Ning et al., 2024), which highlights the interplay between AI-related technological knowledge, teaching strategies, and subject content.
Although not incorporated into the initial coding scheme, the Concerns-Based Adoption Model (CBAM) (Hall & Hord, 1987) proved useful during interpretation: the patterns of teacher usage aligned with CBAM’s Levels of Use, providing deeper insight into varying levels of adoption. Thus, CBAM was applied only at the interpretive stage, not for coding. Collectively, TAM, selected UTAUT constructs, AI-TPACK, and CBAM provided a multi-level perspective on the individual, pedagogical, and institutional factors shaping AI use in Cypriot primary schools.

4. Results

Building on emerging patterns of educators’ pedagogical and professional engagement, the following sections explore how educators applied AI tools in both professional and classroom settings. To contextualize the thematic findings, the study first categorized participants’ overall engagement levels with AI, offering a foundation for understanding the breadth and depth of use across the sample. Thematic analysis identified two primary domains of application: (1) pedagogical use of AI as a classroom teaching aid and (2) professional use of AI to support educators’ own learning and development.

4.1. Levels of Use of Artificial Intelligence Tools by Educators

The analysis identified five levels of engagement with AI tools, from non-use to systematic integration. Across both interviews, findings indicate a gradual shift toward more intentional use, with some participants exploring possibilities and others embedding AI into daily practice. Table 1 outlines these levels and provides selected quotes to illustrate progression and variation.
Beyond general engagement levels, educators reported using AI tools in specific pedagogical contexts. These applications are explored in Thematic Unit 1.

4.2. Use of AI Tools by Educators in Their Professional and Pedagogical Practice

4.2.1. Thematic Unit 1: Use of Artificial Intelligence Tools as Teaching Aids

This thematic unit examines how educators employed AI tools to support instructional practices. The data reveal two primary areas of pedagogical application: teaching preparation and the promotion of student creativity.

Use of Artificial Intelligence Tools for Teaching Preparation

The use of AI tools in lesson preparation was categorized into seven key practices, summarized in Table 2. Educators primarily employed these tools to generate and differentiate instructional content, simplify texts, produce assessments, and create multimodal or interactive materials. A clear upward trend was observed from the first to the second interview phase, especially in lesson planning and the creation of visual and gamified resources. This evolution highlights the growing perceived value of AI in enhancing both instructional efficiency and content accessibility.

Artificial Intelligence as a Tool for Enhancing Student Creativity: Group and Individual Practices

Educators also used AI tools to design lessons that foster student creativity (Table 3). Based on the data, these practices fall into two primary categories: (a) group-based creation and reflection, and (b) individualized, differentiated applications. In group settings, teachers integrated tools such as ChatGPT and Bing Image Creator into project-based activities for collaborative writing, co-designing images, and shared reflection, which enhanced participation, critical thinking, and creative expression. In individual contexts, teachers used AI tools for writing, visualising concepts, and exploring new ideas, because the flexibility and accessibility of the tools made it easier to tailor instruction to individual needs, increase student engagement, and support students with different abilities or language proficiency levels.
Beyond classroom application, participants also highlighted the role of AI tools in supporting their professional learning, further explored in Thematic Unit 2.

4.2.2. Thematic Unit 2: Use of Artificial Intelligence for Teacher Professional Development

Thematic analysis also revealed five aspects of how educators engaged with AI tools for self-directed learning, instructional support, and reflective practice. The following subthemes were identified inductively:
  • Exploration and skill development: all 15 educators used AI to explore and develop practical skills through trial-and-error experimentation.
  • On-demand support: AI acted as an assistant available around the clock for retrieving information, creating resources, and planning lessons, especially under time pressure.
  • Personalisation: educators adapted AI to suit individual learners, lesson objectives, and language levels, demonstrating growing confidence in differentiated instruction.
  • Interactivity and dialogic practices: educators engaged in conversations with chatbots, created activities such as quizzes, and wrote prompts for game-based tasks in an effort to facilitate student interaction and agency.
  • Critical reflection and metacognition: educators posed questions about ethics, accuracy, and the longer-term implications for the teaching profession, actively weighing the benefits and risks of integrating AI into their teaching.
Table 4. AI Tools as a Means of Teacher Professional Development.
Subcategory Examples
Exploration, Experimentation, Skill Development “I’ve tried a bit to ask the AI to simplify some texts. But that’s it—not too much.” (Int1P2)
“So, today I’ll try to use ChatGPT to give me ideas for the first day of school after the holiday break.” (TAP2P3)
“I feel that I explored some tools specifically designed for school use, like Magic School, Canva, and ChatGPT.” (Int2P9)
On-Demand Assistance and Content Generation “When I started, the first text it gave me was very close to what I had in mind, and I worked on it, revised it, and in the end, I managed to get exactly what I wanted. It saved me a lot of time.” (Int1P24)
“I want it to include both open- and closed-ended questions, creative activities, questions that promote critical thinking, and differentiated tasks.” (TAP1P19)
“And after I wrote, I asked [ChatGPT] to generate a text so we could compare it with the students’ own. I also asked them to provide the comparison, and for me, that was really helpful.” (Int2P14)
Personalized Exploration and Tool Adaptation “Even though I’m not the kind of person who won’t try something out of fear that something might go wrong… I will try it, but with caution.” (Int1P14)
“I’d say it’s interesting. You can even add music. Okay. Add [adds music]. Let’s see, is it here? [searches for music] Oh, it also has a cursor, what does that help us do?” (TAP3P17)
“It’s the same for us educators, this is something new for us too. There’s been an explosion of this thing in recent years, and for many of us it’s unfamiliar. We also need some guidance.” (Int2P19)
Interactive and Dialogic Practices “Yes, I believe we need to prepare the students, and they won’t be prepared unless they use these tools and learn to critique them.” (Int1P9)
“Let’s say thank you to it as well. [Laughs and reads] You’re welcome. [comments] ChatGPT is polite too. Perfect.” (TAP1P11)
“I’ve already suggested several tools to my colleagues, and we said we’d sit down in a meeting to go over them.” (Int2P2)
Critical Reflection & Metacognition “I believe they will be useful once we learn how to use them properly, so that they help us instead of wasting our time.” (Int1P3)
“Basically, I didn’t manage to do what I wanted. I’m closing.” (TAP3P5)
“So, it’s really important to know what you’re asking it to generate, to give the right prompts, let’s say, so it produces what you actually want. You shouldn’t just accept whatever it gives you as-is.” (Int2P7)
While participants expressed enthusiasm for exploring AI tools and acknowledged their potential in both classroom practice and professional learning, the study also surfaced several critical barriers to meaningful integration. The following section presents the key challenges and concerns identified during the interviews, ranging from systemic limitations to ethical and pedagogical dilemmas.

4.3. Barriers to the Implementation of AI Tools in the Classroom

This section presents key challenges identified during the second round of interviews, following educators’ classroom implementation of AI tools. Participants consistently emphasized the lack of a systemic and institutional support structure for AI integration in schools.

4.3.1. Thematic Area 1: Lack of Systemic and Organizational Support

Systemic and Organizational Gaps in Support Frameworks

Participants (n=13) consistently emphasised the lack of policies, official guidelines, and instructional resources on technology use provided by the Ministry of Education. As one participant observed, “Right now, the way things are, it all depends on the goodwill of each school and each teacher for these matters to actually work in schools” (Int2P4).
In addition, they pointed to insufficient time within the school timetable to experiment with or embed new technologies. As one participant reflected, “And of course, they’ll need to reduce the curriculum load.” (Int2P10). Furthermore, the absence of dedicated curriculum space for students’ digital skills development was highlighted as a significant obstacle to meaningful technology integration: “They still don’t have the skills for me to say, ‘OK, save it to your USB.’” (Int2P14).
Concerns were also raised about the lack of ethical guidelines, data privacy protocols, and formal mechanisms for monitoring digital practices: “[...] how far I can allow students to use artificial intelligence. I mean, for someone who doesn’t want to take risks in life, there’s no legal framework in place either” (Int2P9). The use of outdated teaching materials and lack of account management systems and data infrastructure were further noted as significant barriers to effective AI adoption.
Table 5. Systemic and Organizational Gaps in Support Frameworks.
Thematic Subcategory Description Examples
Lack of Parent Information and Involvement Schools and the Ministry provide no communication to parents about AI use. “The Ministry also needs to have convinced the parents that they should accept these tools.” (Int2P11)
Lack of Policy and Official Guidelines No formal policies, circulars, or resources from the Ministry of Education. “Right now, the way things are, it all depends on the goodwill of each school and each teacher for these matters to actually work in schools.” (Int2P4)
Lack of Structures for Student Digital Competency Development Limited infrastructure or policy to support students’ digital skill development. “They still don’t have the skills for me to say, ‘OK, save it to your USB.” (Int2P14)
Absence of Incentives or Recognition for AI Adoption No incentives or recognition for educators integrating AI into their teaching. “The most important thing, Maria, is that I, as a person, want to do it. If I want to do it, if I say I want to do it, then it’ll work out. If I don’t want to, for whatever reason, I won’t do it. It won’t work. So, it’s motivation. It’s all about motivation.” (Int2P24)
Time Constraints and Curriculum Pressure Rigid timetables leave little room to try new tools. “And of course, they’ll need to reduce the curriculum load.” (Int2P10)
Lack of Account Management Systems and Data Infrastructure No systems or accounts in place to support access to AI platforms. “I’d like the accounts to be ready in advance, so that the teacher doesn’t have to set them up.” (Int2P11)
Lack of Evaluation or Monitoring of AI Use No systems to track or assess the impact of AI tools. “Now, if the inspectors come and request it, and even impose it, let’s say, by asking to see a lesson conducted in the lab, I believe the work will not be done properly. There might be a slight shift, but it will not be substantial.” (Int2P9)
Absence of a Legal or Regulatory Framework Missing legal and ethical guidelines for safe AI use in education. “[...] how far I can allow students to use artificial intelligence. I mean, for someone who doesn’t want to take risks in life, there’s no legal framework in place either.” (Int2P9)
Lack of Curriculum Time for Digital Skills Digital skills are not included in the official curriculum. “Should the curriculum change? Should AI be introduced? We’re going to have a lot of work.” (Int2P11)
Outdated Teaching Materials Existing materials fail to reflect current technology use. “They want, let’s say, within the curriculum or the lessons themselves, to have an accompanying handbook for the teacher.” (Int2P17)
Note: Participant codes refer to responses from the second semi-structured interview.
These systemic gaps often intersected with practical limitations in infrastructure and access, as described in the following section.

Insufficient Infrastructure and Restricted Access to AI Tools

In the interviews, the majority of participants (12 out of 15) explicitly stated that they still faced a lack of digital infrastructure and limited access to essential tools and technologies. They reported unequal or inadequate access to equipment (Int2P11), lack of access to devices due to institutional policies (Int2P2), and challenges in using specialized AI tools because of licensing or technical limitations (Int2P15).
Table 6. Insufficient Infrastructure and Restricted Access to AI Tools.
Subcategory Description Examples
Lack of Technological and Support Structures Unequal or inadequate technical infrastructure “If only we had, I don’t know… okay… a device for each student, a screen? Yes, [so they have tablets], or a computer lab.” (Int2P11)
Institutional Barriers to Equipment Access Restrictions due to funding shortages, bureaucracy, or procedural delays “The computer lab is currently occupied, it’s being used as a classroom.” (Int2P2)
Inequities and Lack of Access to Licensed Tools Limited access to premium tools due to licensing costs or restrictions “You can’t expect a teacher to move forward and be an educator of tomorrow, and at the same time clip their wings. You need to give them incentives, support them. So just unlock the subscriptions, surely that would cost less overall. A teacher shouldn’t have to pay 20 here and 30 there. Why?” (Int2P15)
Note: Participant quotes are drawn from the second round of semi-structured interviews. All subcategories were derived through inductive thematic coding.
These limitations were further compounded by the broader absence of systemic integration of digital technologies into day-to-day educational practices, as explored in the following section.

Absence of Institutional Integration of Technologies into Educational Culture and Practice

All fifteen participants highlighted the absence of a supportive institutional culture or framework to promote the adoption of AI in teaching practice. Eight educators, in particular, described limited support or interest from school leadership, suggesting an overall administrative reluctance toward educational innovation: “The school administration didn’t get involved with me, neither if I did it, nor if I didn’t.” (Int2P9). Twelve participants reported the absence of technical and infrastructural support at the school level as reflected in one teacher’s account: “…that I actually have to go to the principal’s office to get the tablets, charge them, take them home, set them up myself, and make sure no other teacher wants them, which happened, as some didn’t. I also had to prepare the students in advance, because they had never used tablets at school before.” (Int2P2).
Moreover, eleven participants pointed out that digital technologies are still not embedded within the pedagogical culture of schools. In parallel, thirteen educators referred to insufficient time for lesson preparation using new tools: “[...] if I don’t know a tool, it will take up time, which I don’t have, so I have to take that time from somewhere else in order to use it, to learn it.” (Int2P21). The absence or inadequacy of institutionalised support roles, such as technology coordinators or mentors, was also reported by ten educators, underscoring the lack of sustainable support mechanisms.
Table 7. Absence of Institutional Integration of Technologies into Educational Culture and Practice.
Subcategory Description Examples
Absence of Institutional Integration of Technologies into Educational Culture and Practice Lack of a structured institutional framework and school-wide culture supporting AI or digital tool adoption “No [we don’t have a school policy for the use of technology].” (Int2P9)
Lack of Interest or Support from School Leadership Limited engagement, encouragement, or involvement from school leaders in promoting technology-enhanced teaching “The school administration didn’t get involved with me, neither if I did it, nor if I didn’t.” (Int2P9)
Lack of Technological and Support Structures at the School Level Absence of infrastructure such as computer labs, technical staff, or ICT support in daily school operations “…that I actually have to go to the principal’s office to get the tablets, charge them, take them home, set them up myself, and make sure no other teacher wants them, which happened, as some didn’t. I also had to prepare the students in advance, because they had never used tablets at school before.” (Int2P2)
Lack of Integration of Technologies into School Culture and Practice Technologies not embedded into regular teaching routines, planning, or school-wide initiatives “If I had a colleague like you, Maria, who could excite me, who would say ‘you know, we’re going to start this thing, try something new,’ I would dare to do it. But I’m on my own.” (Int2P15)
Lack of Available Personal Time for Collaboration, Experimentation, and Professional Development Educators lack designated or flexible time during the workday for co-planning, digital experimentation, or skill-building “…but I need time to try out the other ones too. They seem really interesting, especially the one that makes videos and... I just don’t have time.” (Int2P14)
Lack of Personal Time for Lesson Preparation Educators are time-constrained due to workload and find it difficult to prepare lessons using new digital tools “[...] if I don’t know a tool, it will take up time, which I don’t have, so I have to take that time from somewhere else in order to use it, to learn it.” (Int2P21)
Absence or Inadequacy of Institutionalized Support Roles in Schools Lack of formal roles such as digital mentors, technology coaches, or coordinators to support implementation “They should select educators who are actively involved, train them properly, and then send one of them to each school, but they need to actually work, not just sit around for two hours, while someone else who doesn’t even have those hours ends up knowing more than them.” (Int2P10)
Note: Participant codes refer to responses from the second round of semi-structured interviews unless otherwise indicated. Quotes are translated from Greek and lightly edited for clarity. Examples shown are indicative and reflect patterns identified through thematic analysis.

Inadequate Functional, Experiential Training and Support for Educators

A recurring theme emerging from participants’ interviews was the limited availability of meaningful and practical training to support the integration of Artificial Intelligence (AI) into teaching. They expressed frustration over the absence of structured, sustained, and classroom-relevant training opportunities. As one participant explained, “I believe that, besides the seminars, they should also train us in a more targeted way, anyway…” (Int2P3).
Eleven participants highlighted the lack of trainers with appropriate technical or pedagogical expertise, as well as the absence of guidance mechanisms within or beyond the school environment. One educator remarked, “There should be someone you can turn to at any time. I think in secondary schools there’s a teacher responsible for the computers who’s always available to help, at least that’s what I hear from my friends. We don’t have that kind of support in primary schools.” (Int2P19).
Participants further emphasised the importance of practical, experience-based learning opportunities that are rooted in authentic classroom contexts. As one teacher reflected, “The programs the Ministry offered this year on the topic were more like presentations. This presentation style doesn’t really get you to… I believe that if… I don’t know how you came into my path. I think if I hadn’t come across you, I’d still be watching YouTube videos.” (Int2P9). Finally, twelve participants drew attention to the absence of clear pedagogical direction for the meaningful use of AI in teaching. As one educator noted, “And in terms of how to use it in the classroom. We definitely need both guidance and some practical strategies” (Int2P5).
Table 8. Inadequate Functional, Experiential Training and Support for Educators.
Thematic Subcategory Examples
Lack of systematic and functional training “I believe that, besides the seminars, they should also train us in a more targeted way, anyway…” (Int2P3)
“Well, first of all, the Ministry itself needs to take the initiative and provide training. Because right now, those of us who are using these tools, apart from ChatGPT, which I believe more or less everyone is using, these more…” (Int2P7)
Lack of specialized trainers and guidance structures “There should be someone you can turn to at any time. I think in secondary schools there’s a teacher responsible for the computers who’s always available to help, at least that’s what I hear from my friends. We don’t have that kind of support in primary schools.” (Int2P19)
Need for experiential, targeted, and guided training “The programs the Ministry offered this year on the topic were more like presentations. This presentation style doesn’t really get you to… I believe that if… I don’t know how you came into my path. I think if I hadn’t come across you, I’d still be watching YouTube videos.” (Int2P9)
Lack of pedagogical guidance for technology integration “And in terms of how to use it in the classroom. We definitely need both guidance and some practical strategies.” (Int2P5)
Note: Participant codes refer to responses from the second round of semi-structured interviews. All quotes have been translated from Greek and lightly edited for clarity. The examples presented are illustrative and reflect recurring patterns identified through thematic analysis.

4.3.2. Thematic Area 2: Ethical and Pedagogical Concerns of Educators

Ethical and pedagogical concerns related to the integration of Artificial Intelligence (AI) into classroom practice were also evident in participants’ narratives. These concerns, which emerged during the second round of interviews, were grouped into five key thematic areas, as summarized in Table 9. One of the main concerns related to the protection of personal data and student safety. As one participant questioned, “Yes, or if we’re going to use photos and things that will be in the classroom again. I think, I’m not sure, are we covered in all aspects?” (Int2P3).
Ten educators also voiced apprehension that increasing reliance on AI might undermine the teacher’s professional role (Int2P4; Int2P9). They expressed concern that automation might reduce the depth of learning by encouraging students to depend on AI-generated answers rather than developing independent cognitive and problem-solving skills. Furthermore, almost all participants highlighted the risk that frequent use of AI could erode students’ critical thinking skills and the authenticity of their work. As one participant cautioned, “You need to be careful to check that what it tells you is actually correct, you have to test it first. It shouldn’t be like, I typed it in, it gave me something, and that’s it.” (Int2P7).
Participants also noted resistance from parents, particularly in conservative or religious communities. As one educator explained, “Also, the Ministry needs to have convinced the parents that they should accept these tools, so that we don’t have to deal with parental backlash.” (Int2P11). In addition, participants highlighted significant disparities in access to AI tools and infrastructure across schools. One teacher emphasised the need for greater equity: “From the start, it needs to ensure the strengthening of these networks, labs, tablets, there needs to be, in any case, some equality among schools when it comes to this technological matter” (Int2P2).

5. Discussion

This section discusses the study’s findings in relation to the thematic categories presented earlier, highlighting how Cypriot primary school teachers engaged with AI tools and how their practices evolved over time.

5.1. Levels of Educators’ Engagement with Artificial Intelligence Tools

The pattern of AI tool use among educators was clear and aligned with the “Levels of Use” framework from the Concerns-Based Adoption Model (CBAM) (Hall & Hord, 1987). Initially, many participants were at the Orientation or Preparation stages, characterized by limited exposure and initial experimentation (Int1P2; Int1P3; Int1P13). One participant admitted: “I tried, but I didn’t use them in class. I played around a bit” (Int1P19). These early engagement patterns are consistent with studies from Malaysia (Paramasveran & Nasri, 2018) and Jordan (Gasaymeh, 2017), where educators’ lack of awareness and institutional support meant that most teachers were unable to move beyond the exploratory stage.
By the second round of interviews, findings revealed that many participants had progressed to the Potential Integration and Systematic Use stages, demonstrating more deliberate and sustained use of AI tools. One teacher explained, “I definitely use them... especially in my preparation, it helps a lot” (Int2P3), and another added, “Now that I’ve learned about them, of course I’ll continue using them” (Int2P14), reflecting her satisfaction. A third example illustrates how teachers began using AI for targeted instructional purposes: “I created texts in ChatGPT with specific instructions and literary style, it helped my students tremendously” (Int1P24). This upward shift supports findings by Kimmons et al. (2020), who argue that structured support and experimentation opportunities positively influence meaningful technological integration. It also resonates with the PICRAT model, which describes a transition from passive, substitution-based uses of technology toward more transformative, student-centered ones. Evidence of this shift was seen in examples where educators involved their students in AI-related activities: “We had our first class activity with AI tools. The students were excited!” (Int1P10).
These findings are further supported by the AI-TPACK framework (Ning et al., 2024), illustrating a progression from basic technological knowledge (AI-TK) to fully contextualized integration (AI-TPACK). Relatedly, one participant described using AI beyond classroom contexts: “I use them for my academic work, not just the classroom” (Int2P21), while another reflected on the need for balance between reliance and critical thinking: “It’s in my life now, I rely on it, though I try not to forget how to think critically myself” (Int2P7). These usage patterns resonate with Rogers’ Diffusion of Innovations Theory (Rogers et al., 2008), which emphasizes the role of peer learning, trialability, and perceived value in technology adoption. Another educator explained: “I want to explore more… when I feel safe and have time, I’ll use them more” (Int2P2), illustrating how contextual factors, such as available time, perceived safety, and resource constraints, shape willingness to engage with AI (Int2P2; Int2P15).
Finally, these findings are also consistent with Cyprus’s position in the DESI 2.0 index (Vuorikari et al., 2022), reflecting a moderate level of digital competence and underscoring the ongoing need for targeted professional development and policy support. Comparisons with countries such as Estonia and Finland suggest that sustained investment in training ecosystems and well-designed policy frameworks can accelerate AI integration into classroom practice. However, although grassroots experimentation is a valuable driver of innovation, systemic support remains essential for scaling and sustaining meaningful use.

5.2. Uses of AI Tools by Educators in Their Professional and Teaching Practice

The analysis revealed a diverse and evolving range of ways in which educators engaged with AI tools across their professional and teaching practice. Participants described using AI tools such as ChatGPT, Gemini, and MagicSchool primarily for lesson preparation, including tasks such as content creation and differentiation to support diverse student needs. Many simplified texts, generated grade-appropriate content, and produced tailored materials, practices that align with principles of inclusive and differentiated instruction (Guilbault et al., 2025; Zagami, 2024). As one participant shared, “I can tell the artificial intelligence to create a text that would fit, let’s say, fifth grade or first or second grade, and that includes a lot of adjectives.” (Int1P4). Others used AI tools to create structured paragraphs, vocabulary activities, and simplified exercises for students with specific needs (Int2P13; Int1P17; Int2P19). However, concerns about oversimplification, inaccuracies in AI outputs, and potential overreliance on AI highlighted ongoing tensions between efficiency and pedagogical depth (Elstad, 2024; Freyer et al., 2024; Guidroz et al., 2025).
AI tools were also adopted for formative assessment and resource development. Participants reported using platforms like MagicSchool, Canva, and QuestionWell to generate quizzes, comprehension exercises, and assessment rubrics. They created multiple-choice questions (TAP3P3), printable worksheets (Int2P10), and personalised assessments for learners with difficulties (Int2P15). Yet, questions regarding the appropriateness and quality of AI-generated content, especially in subjects like mathematics (Int2P24), echo broader concerns about automated assessment (Jack et al., 2024; Mogavi et al., 2022).
Beyond preparation and assessment, some educators explored gamified approaches using tools like Kahoot, Quizizz AI, and Quizlet to enhance engagement through interactive formats. They designed digital escape rooms, comprehension games, and storytelling tasks (Int2P10; Int2P13). While these strategies often boosted motivation (Zawacki-Richter et al., 2019), some educators cautioned that excessive competitiveness could undermine deeper learning processes (Int2P19).
AI also enabled multimodal and creative instruction, with tools such as Canva, Pictory AI, and InVideo used to develop storyboards, animations, and comics for subjects including science and writing (TAP1P10; TAP3P7). These practices align with Mayer’s (2009) multimedia learning theory and support the idea of AI as a pedagogical co-designer (Haase & Hanel, 2023). Nevertheless, effective integration of such practices depended on educators’ digital literacy and media awareness (Tan et al., 2025), suggesting the importance of ongoing professional learning.
AI-assisted lesson planning also emerged as a key theme. Participants used AI to generate activity ideas and align content with the outdated curriculum, often streamlining preparation processes (TAP1P19; Int2P3). While this automation eased their preparation, some participants expressed concern that it might erode reflective, pedagogically grounded planning processes (Perrotta & Selwyn, 2020), highlighting an ongoing tension between efficiency and in-depth instructional design.
Importantly, collaborative and creative student-centred applications were also evident. Educators co-developed digital content with their students, transforming written texts into videos, illustrations, or songs using tools like Bing Image Creator and Suno (Int2P3; TAP1P4). Students contributed prompts and design ideas (Int2P10), fostering agency and participation aligned with student-centered pedagogies (Craft, 2003; Griva, 2024; Zha et al., 2024). At the same time, these practices required educators to possess strong digital competencies and engage critically with technology (Holmes et al., 2019). Student-led projects, ranging from writing and visual design to multimedia production, demonstrated AI’s potential to support creative expression while reducing fear of failure (Int2P5; Int1P4). However, participants also raised concerns about students’ increasing dependence on AI and emphasised the importance of integrating media literacy into teaching (Zhai et al., 2024).
Professional development emerged as another central dimension of teachers’ engagement with AI. Many participants described self-directed experimentation with tools like ChatGPT, Canva, and Suno, reflecting experiential learning patterns (Kolb, 2014) and intrinsic motivation. These findings support prior research on teacher-led adaptation to emerging technologies (Çelik & Baturay, 2024), highlighting educators’ agency in shaping their professional growth. They generated visuals, simplified content, or brainstormed activities (Int2P2; TAP1P5). While these practices enhanced their efficiency and responsiveness, participants cautioned that such uses risk superficial engagement if not embedded within reflective and pedagogically grounded approaches (Tan et al., 2025; Zhai et al., 2024). At the same time, participants stressed the value of personalised support, informal mentorship, and collaborative learning communities in reducing anxiety and building confidence (Int2P2; Int1P3). Adaptive AI prompts and feedback were valued for their ability to align with individual teaching styles (Int2P4; Int2P7). Such practices reflect wider evidence that personalised guidance and social learning enhance sustainable technology integration and teacher agency (Nguyen et al., 2025; Tobin et al., 2024).
Finally, critical reflection surfaced as an integral part of educators’ engagement with AI. Participants raised ethical concerns, expressed fears of possible professional deskilling, and emphasized the need to protect creativity and human judgment in an era of increasing automation (Int2P7; Int1P4). In this sense, AI served not only as a pedagogical tool but also as a trigger for deeper inquiry, reinforcing the need for ethical and reflective practice in digital education (Pangrazio & Selwyn, 2021). Ensuring that innovation is sustainable and scalable will require systemic investment in professional development, infrastructure, and digital equity (Schophuizen & Kalz, 2020).

5.3. Barriers to the Implementation of AI Tools in the Classroom

Despite educators’ growing interest in AI, during implementation they faced a range of interconnected barriers (systemic, pedagogical, infrastructural, and cultural), similar to those reported internationally (Holmes et al., 2022; Ghimire & Edwards, 2024; Selwyn et al., 2020). These constraints shaped not only how educators integrated AI into their teaching but also how sustainable such innovation is likely to be over time.
Lack of Policy and Institutional Guidance: A recurring theme across participants’ interviews was the absence of national policy or strategic direction regarding AI integration. Educators consistently reported operating without clear frameworks, ethical protocols, or ministry support. As one participant noted, “Right now, everything depends on the goodwill of each school, of each teacher…” (Int2P4). Another stated, “We’re on our own. No training, no materials” (Int2P5). Several questioned the Ministry’s position altogether: “We don’t even know if the Ministry has ethical concerns… does it allow us? To what extent?” (Int2P11). These findings reflect Holmes et al.’s (2022) warning that, without institutional safeguards, AI integration remains fragmented, inconsistent, and reliant on individual initiative rather than systemic planning. This policy gap also intensified disparities between schools and amplified educators’ sense of isolation.
Moreover, they align with the “Cyprus 2025 Digital Decade Country Report” (Digital Decade, 2025), which highlights that, although the country has strong gigabit connectivity and extensive 5G coverage, only 49.46% of the population possesses at least basic digital skills, below the EU average of 56%. Furthermore, the “Cyprus Education Profiles 2021” report notes that, although the 2020 Strategy for the Digital Transformation of Education and the 2020 Policy on Digital Education are in place, their implementation, particularly regarding school infrastructure, remains uneven and lacks a focus on AI.
Parental Concerns and Community Resistance: Cultural and religious sensitivities presented further barriers, particularly when parental skepticism was left unaddressed (Int2P3; Int2P11; Int2P19). Many participants argued that clearer ministry-led communication is essential to build trust and legitimacy around AI use. These concerns echo Selwyn et al. (2020), who highlight how a lack of institutional support frequently leaves educators to navigate community reactions and concerns alone.
Infrastructure and Access Inequalities: Significant disparities in infrastructure and access further limited AI adoption. Some schools had functional labs, while others lacked even basic resources such as projectors, Wi-Fi, or reliable devices (Int2P2; Int2P5; Int2P15; Int2P20). Educators often described improvising with outdated equipment or bringing their own devices to class: “I have to carry the laptops upstairs myself, and sometimes there’s no internet or battery” (Int2P10). Another educator described stark inconsistencies: “A teacher moves from a school with full tablets to another with only five, or the opposite. There should be equality with funds and support” (Int2P2). Such inequalities contradict OECD (2021) and UNESCO (2025) recommendations, which identify equitable infrastructure investment as a prerequisite for meaningful technology integration.
Licensing and Subscription Barriers: Access to digital tools was often restricted by fees and subscription models, forcing educators to rely on limited free versions or pay out of pocket (Int2P2; Int2P3; Int2P13; Int2P15). As one educator noted, “I started creating something, but I couldn’t finish it because the free trial ran out” (Int2P13). Another voiced frustration over the financial burden: “It’s not sustainable to expect educators to pay €20 here and €30 there. If you want us to be tomorrow’s educators, don’t clip our wings, give us support” (Int2P15). These financial constraints reinforce UNESCO’s (2023) concerns that commercialised EdTech ecosystems exacerbate inequality by restricting access to advanced features.
Student Digital Literacy Gaps: Contrary to the common narrative about younger students being “digital natives”, many struggled with basic digital skills such as typing or logging in (Int2P9; Int2P14; Int2P21). This finding aligns with research highlighting that uneven digital competence can limit the effective use of AI-enhanced activities (Pitts et al., 2025; Feng & Xue, 2023). It also forced teachers to devote additional instructional time to foundational skills before meaningful AI integration was possible.
Invisible Labor and Lack of Incentives: Educators also reported investing significant time beyond working hours to explore tools and prepare materials (Int2P2; Int2P7; Int2P13; Int2P15; Int2P19). In the absence of institutional recognition or incentives, such efforts were often perceived as unsustainable, echoing calls for structural support such as time allowances, professional credits, or compensation (Alam et al., 2025; Ding et al., 2024).
Time Constraints and Curriculum Misalignment: Time pressures and rigid curricula further limited opportunities for experimentation. Many teachers described a lack of flexibility to experiment with AI tools or integrate new pedagogical approaches (Int2P3; Int2P5; Int2P19). One reported insufficient time to integrate new tools meaningfully: “The Ministry should reduce the curriculum first. There’s not enough time to cover what’s in the books, let alone extra things” (Int2P3). These challenges echo critiques from L. Huang (2023) and Holmes et al. (2022), who argue that meaningful innovation requires curriculum reform and dedicated space for experimentation.
Technical and Administrative Burdens: A major obstacle reported by participants involved the technical and administrative workload associated with AI tool use. Educators were often responsible for managing accounts, troubleshooting platforms, and ensuring data privacy, tasks that diverted their time and energy from teaching (Int2P7; Int2P9; Int2P17). One educator shared, “I spent unbelievable hours just creating the accounts” (Int2P7), while another expressed discomfort using personal email addresses for school-related platforms (Int2P9). These findings highlight persistent governance gaps and show how, in the absence of systemic support, educators shoulder the administrative weight of technology adoption (Holmes et al., 2022).
Absence of Support Structures in Schools: Educators frequently described working in isolation, with limited pedagogical or technical support available within their schools (Int2P3; Int2P5; Int2P9; Int2P19). Even when ICT coordinators were present, their expertise or willingness to collaborate was inconsistent: “I asked the ICT coordinator, but they didn’t have the expertise, or the willingness to figure it out with me” (Int2P9). Routine tasks like logging in or connecting devices consumed valuable teaching time (Int2P4; Int2P15). Participants strongly advocated for the introduction of in-school advisors or mentors: “There should be a consultant at the school, someone we can call when we want to use the computers” (Int2P3). This gap was particularly pronounced in primary schools, where formal technical support roles are rarely established (Int2P19; Int2P21).
Opportunities for peer collaboration were also limited: “Only one or two colleagues actually use the laptops, even though they were bought for everyone” (Int2P15). Some reported receiving discouraging feedback from peers, such as “Don’t bother, you won’t finish the curriculum” or “Stay in your lane” (Int2P20). Another participant reflected on the impact of collaboration: “If I had a colleague like you, Maria, to excite me into starting something new, I would dare. But I’m alone” (Int2P15). These statements align with Ertmer and Ottenbreit-Leftwich (2010), who argue that successful technology integration depends on both individual agency and institutional support.
Fragmented Culture and Leadership Gaps: School leadership emerged as another decisive factor shaping AI adoption. Some school leaders actively enabled innovation by allocating time, resources, or external training opportunities: “Our principal gives us complete freedom to experiment” (Int2P17). Others were indifferent: “They didn’t get involved, whether I did it or not didn’t matter” (Int2P9).
Participants emphasized the value of collaborative learning cultures. Suggestions included designating experienced educators to train peers: “Train a few educators and assign one to each school, who will actually work, not just sit in the office” (Int2P10), and organising brief weekly sessions to share tools such as Canva (Int2P14). However, a lack of shared vision and leadership cohesion often discouraged such initiatives, increasing the risk of burnout, a concern echoed by Harris and Jones (2019).

5.4. Educators’ Ethical and Pedagogical Concerns Regarding AI Use

The integration of Artificial Intelligence (AI) tools into primary education was approached by participants with caution. While many acknowledged AI’s potential to enhance creativity, support content differentiation, and increase student engagement, their reflections also revealed ethical and pedagogical concerns. Four key themes emerged from these reflections: data protection, professional identity, authenticity of learning, and sociocultural resistance.
Privacy and Data Protection: Participants’ most frequent concerns related to privacy and data protection. They expressed discomfort with platforms that required student emails, photographs, or even their own mobile numbers. One participant explained: “I felt unsafe… it asked for my phone number, and I didn’t want to give it” (Int2P9). The absence of institutional guidance, particularly from the Ministry of Education, contributed to uncertainty about their legal responsibilities and potential risks; they felt left to navigate complex issues of privacy, consent, and safety on their own. Another participant stressed the need for clear rules: “We need a legal framework to stay at this level and not let things get out of hand” (Int2P4), reflecting broader calls in the literature for robust AI ethics policies for education (Holmes et al., 2022).
Professional Identity and the Role of the Educator: Another concern was that AI might undermine the professional role of educators. Fears of reduced student effort and the potential deskilling of educators were recurrent themes. As one participant noted, “It’s a bit scary to think our role might be obsolete in a few years” (Int2P10), while another asked, “Will students even need to try anymore?” (Int2P4). Such reflections align with broader debates about the changing nature of teacher expertise in AI-enhanced classrooms (D. T. K. Ng et al., 2022).
Authenticity of Learning and Pedagogical Intent: Educators emphasised that AI could be a powerful tool for fostering critical thinking and deeper learning when used with clear pedagogical intent. Some described using AI to model inquiry, stimulate imagination, and encourage perspective-taking. One teacher explained, “If we teach them to use it properly, it can develop their imagination and metacognition” (Int2P2). This dual perspective, viewing AI as both a risk and a resource, reflects findings by Ifenthaler et al. (2024), who argue that the outcomes of AI use depend on teacher mediation and thoughtful task design.
Sociocultural Resistance and Community Dynamics: Finally, sociocultural resistance, particularly from parents, emerged as both a barrier and a concern. Several educators reported parental hesitations, often rooted in conservative or religious beliefs, about the use of AI in schools. “Some parents don’t want their kids exposed to this at all” (Int2P19), one educator explained. Others emphasized the importance of institutional communication in building trust: “We need to convince parents first” (Int2P11). These concerns align with Selwyn et al. (2021), who highlight the importance of community context in shaping the acceptance of digital innovations. Without transparent communication and inclusive dialogue, AI risks being perceived as externally imposed rather than educationally purposeful.

6. Framework for Pedagogical Integration of AI in Primary Education

Our research findings suggest a five-pillar model for implementing AI in Cypriot primary education in a meaningful, equitable, and pedagogically sound way. Each pillar represents a necessary dimension for sustained, long-term implementation:
  • Clear policy and governance: The Ministry of Education should develop and communicate clear ethical, legal, and curricular guidelines for the use of AI in schools. This policy should include provisions for the protection of students’ data, the roles and responsibilities of stakeholders, and clear procedures for the selection, implementation, monitoring, and evaluation of AI tools. A strong regulatory framework will reassure educators and ensure that AI tools are used in a safe, responsible, and beneficial manner to support learning.
  • Equity of access and infrastructure: Equitable use of AI depends on adequate access and infrastructure in all schools. Schools need reliable internet connections, sufficient hardware, up-to-date devices, and platforms accessible to all students, regardless of socioeconomic background, geographical location, or leadership style.
  • Funded resources and support: The Ministry of Education should fund licensed AI tools for all schools and protect time in the timetable for educators to experiment, co-plan, and explore digital tools. Appointing and empowering digital coordinators in schools will support day-to-day implementation, alleviate the burden on individual educators, and ensure a coherent approach across classrooms. Professional learning should be continuous, practice-based, and situated in real classroom work rather than delivered as one-off workshops. Training should develop educators’ practical skills, AI literacy, ethical awareness, and pedagogical integration through authentic scenarios. Peer mentoring, professional learning communities, and structured reflection will help teachers incrementally build their confidence and expertise over time.
  • Pedagogically aligned classroom integration: AI should be integrated into cross-curricular activities such as writing, visual communication, and language development. Educators should be supported to adopt student-centered, constructivist approaches that encourage critical thinking and creativity. To make this actionable, the Ministry should provide open-access lesson templates, adaptable resources, and concrete examples of effective classroom use.
  • Community trust and engagement: Successful integration requires trust and collaboration that extend beyond the classroom. The Ministry should engage parents, caregivers, and local communities in open dialogue about the role and purpose of AI in education, and proactively address ethical and cultural concerns. This will help build shared understanding and ownership, smooth implementation, and sustain momentum.

7. Limitations of the Study

This study has several limitations that should be acknowledged. First, it involved a small sample of 15 primary school educators in Cyprus, so the findings cannot be generalised to the wider teaching population. All participants were experienced, with more than ten years of teaching, which may limit the transferability of the findings to novice teachers, who may differ in their attitudes, confidence, and practices when integrating AI into their teaching.
The data collection period was relatively short (November 2024-April 2025), which may have limited the opportunity to capture long-term developments or sustained changes in practice. The study focused mainly on teachers’ perspectives on their first steps in integrating AI tools into classroom practice. It does not include the perspectives of students or school leaders, and it did not include classroom observations, which limited opportunities for triangulation and excluded the institutional perspective. The study did not aim to compare the effectiveness of specific AI tools; however, it offers a timely snapshot of teachers’ early engagement with AI during a critical transitional phase, just before the formal implementation of Cyprus’s national AI strategy, and provides useful insights into early adoption processes.

8. Recommendations for Future Research

Future studies should adopt longitudinal designs to examine how the integration of AI in teaching develops over time, as policies are enacted, curricula evolve, and systems mature. The evidence base should be expanded to include students’ perspectives, classroom observations, and learning outcomes, to reveal how AI affects student engagement, creativity, critical thinking, and achievement, as well as teacher practice. School leaders’ and policymakers’ perspectives should also be incorporated to explore how culture, governance, infrastructure, and support shape the adoption and sustainability of AI.
A multistakeholder perspective will provide insights to strengthen policy design and inform system-wide innovation. Cross-national or regional comparisons should also be conducted to explore how cultural context, infrastructure, and policy environments shape AI integration, and mixed-methods or experimental designs should be employed to provide a more robust evaluation of AI tools and their pedagogical impact. In addition, the potential of teacher education and professional development to build AI literacy, including ethical reasoning, reflective practice, and instructional design, should be investigated, examining not just the content of training but how it translates into classroom practice, to support responsible scaling. Finally, tool-specific studies of platforms such as ChatGPT, Gemini, and adaptive tutoring systems would identify their affordances, usability, and limitations across age groups and subject domains; such evidence can inform procurement decisions, training priorities, and practical implementation strategies.

Institutional Review Board Statement

All procedures complied with institutional ethical standards and participants provided informed consent.

References

  1. Çelik, F., and M. H. Baturay. 2024. Technology and innovation in shaping the future of education. Smart Learning Environments 11, 1: 54. [Google Scholar] [CrossRef]
  2. Chen, L., P. Chen, and Z. Lin. 2020. Artificial Intelligence in Education: A Review. IEEE Access 8: 75264–75278. [Google Scholar] [CrossRef]
  3. Craft, A. 2003. Creativity Across the Primary Curriculum: Framing and Developing Practice. Routledge. [Google Scholar] [CrossRef]
  4. Davis, F. D. 1989. Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology. MIS Quarterly 13, 3: 319. [Google Scholar] [CrossRef]
  5. Deputy Ministry of Research, Innovation and Digital Policy. 2020. National Strategy for Artificial Intelligence—Gov.cy. Available online: https://www.gov.cy/dmrid/en/documents/national-strategy-for-artificial-intelligence/.
  6. European Commission. n.d. Cyprus 2025 Digital Decade Country Report. Shaping Europe’s Digital Future. Available online: https://digital-strategy.ec.europa.eu/en/factpages/cyprus-2025-digital-decade-country-report.
  7. Ding, A.-C. E., L. Shi, H. Yang, and I. Choi. 2024. Enhancing teacher AI literacy and integration through different types of cases in teacher professional development. Computers and Education Open 6: 100178. [Google Scholar] [CrossRef]
  8. Elstad, E. 2024. AI in Education: Rationale, Principles, and Instructional Implications. arXiv arXiv:2412.12116. [Google Scholar] [CrossRef]
  9. European Commission. 2023. Digital Education Action Plan (2021–2027)—European Education Area. November 23. Available online: https://education.ec.europa.eu/focus-topics/digital-education/action-plan.
  10. Feng, L., and S. Xue. 2023. Using the DigCompEdu Framework to Conceptualize Teachers’ Digital Literacy. Education Journal 12, 3: 103–108. [Google Scholar] [CrossRef]
  11. Freyer, N., H. Kempt, and L. Klöser. 2024. Easy-read and large language models: On the ethical dimensions of LLM-based text simplification. Ethics and Information Technology 26, 3: 50. [Google Scholar] [CrossRef]
  12. Gabbiadini, A., G. Paganin, and S. Simbula. 2023. Teaching after the pandemic: The role of technostress and organizational support on intentions to adopt remote teaching technologies. Acta Psychologica 236: 103936. [Google Scholar] [CrossRef] [PubMed]
  13. Gasaymeh, A. M. 2017. Faculty Members’ Concerns about Adopting a Learning Management System (LMS): A Developing Country Perspective. Eurasia Journal of Mathematics, Science and Technology Education 13, 11: 7527–7537. [Google Scholar] [CrossRef] [PubMed]
  14. Ghimire, A., and J. Edwards. 2024. From Guidelines to Governance: A Study of AI Policies in Education. arXiv arXiv:2403.15601. [Google Scholar] [CrossRef]
  15. Griva, A. 2024. The effect of using ai in education: An empirical study conducted in Greek private high schools. Available online: http://hdl.handle.net/11025/57643.
  16. Guidroz, T., D. Ardila, J. Li, A. Mansour, P. Jhun, N. Gonzalez, X. Ji, M. Sanchez, S. Kakarmath, M. M. Bellaiche, M. Á. Garrido, F. Ahmed, D. Choudhary, J. Hartford, C. Xu, H. J. S. Echeverria, Y. Wang, E. Shaffer, and Q. Duong. 2025. LLM-based Text Simplification and its Effect on User Comprehension and Cognitive Load. arXiv arXiv:2505.01980. [Google Scholar] [CrossRef]
  17. Guilbault, K. M., Y. Wang, and K. M. McCormick. 2025. Using ChatGPT in the Secondary Gifted Classroom for Personalized Learning and Mentoring. Gifted Child Today 48, 2: 93–103. [Google Scholar] [CrossRef]
  18. Haase, J., and P. H. P. Hanel. 2023. Artificial muses: Generative Artificial Intelligence Chatbots Have Risen to Human-Level Creativity. arXiv arXiv:2303.12003. [Google Scholar] [CrossRef]
  19. Hall, G. E., and S. M. Hord. 1987. Change in Schools: Facilitating the Process. SUNY Series in Educational Leadership. Albany, NY: SUNY Press. [Google Scholar]
  20. Harris, A., and M. Jones. 2019. Leading professional learning with impact. School Leadership & Management 39, 1: 1–4. [Google Scholar] [CrossRef]
  21. Holmes, W., M. Bialik, and C. Fadel. 2019. Artificial Intelligence In Education: Promises and Implications for Teaching and Learning. Center for Curriculum Redesign. Available online: http://udaeducation.com/wp-content/uploads/2019/05/Artificial-Intelligence-in-Education.-Promise-and-Implications-for-Teaching-and-Learning.pdf.
  22. Holmes, W., K. Porayska-Pomsta, K. Holstein, E. Sutherland, T. Baker, S. B. Shum, O. C. Santos, M. T. Rodrigo, M. Cukurova, I. I. Bittencourt, and K. R. Koedinger. 2022. Ethics of AI in Education: Towards a Community-Wide Framework. International Journal of Artificial Intelligence in Education 32, 3: 504–526. [Google Scholar] [CrossRef]
  23. Huang, L. 2023. Ethics of Artificial Intelligence in Education: Student Privacy and Data Protection. Science Insights Education Frontiers 16, 2: 2577–2587. [Google Scholar] [CrossRef]
  24. Ifenthaler, D., R. Majumdar, P. Gorissen, M. Judge, S. Mishra, J. Raffaghelli, and A. Shimada. 2024. Artificial Intelligence in Education: Implications for Policymakers, Researchers, and Practitioners. Technology, Knowledge and Learning 29, 4: 1693–1710. [Google Scholar] [CrossRef]
  25. Jack, E., C. Alexander, and E. M. Jones. 2024. Exploring the impact of gamification on engagement in a statistics classroom. arXiv arXiv:2402.18313. [Google Scholar] [CrossRef]
  26. Karousiou, C. 2025. Navigating challenges in school digital transformation: Insights from school leaders in the Republic of Cyprus. Educational Media International 62, 1: 54–76. [Google Scholar] [CrossRef]
  27. Kasinidou, M., S. Kleanthous, and J. Otterbacher. 2025. Cypriot teachers’ digital skills and attitudes towards AI. Discover Education 4, 1: 1. [Google Scholar] [CrossRef]
  28. Kimmons, R., C. R. Graham, and R. E. West. 2020. The PICRAT Model for Technology Integration in Teacher Preparation. Contemporary Issues in Technology and Teacher Education (CITE Journal) 20, 1. [Google Scholar]
  29. Kolb, D. A. 2014. Experiential learning: Experience as the source of learning and development. FT Press. Available online: https://books.google.com/books?hl=en&lr=&id=jpbeBQAAQBAJ&oi=fnd&pg=PR7&dq=info:cBzPyc88vIgJ:scholar.google.com&ots=Vp7PlU-XMa&sig=lHWEU-Wc4NLcwUb3ODR3urtrwyQ.
  30. Lu, O. H. T., A. Y. Q. Huang, D. C. L. Tsai, and S. J. H. Yang. 2021. Expert-Authored and Machine-Generated Short-Answer Questions for Assessing Students Learning Performance. Educational Technology & Society 24, 3: 159–173. [Google Scholar]
  31. Mayer, R. E. 2009. Multimedia learning, 2nd ed. Cambridge University Press. [Google Scholar] [CrossRef]
  32. Ministry of Education. 2024. Ανάρτηση επικαιροποιημένων Αναλυτικών Προγραμμάτων [Posting of Updated Curricula]—Gov.cy. August 28. Available online: https://www.gov.cy/paideia-athlitismos-neolaia/anartisi-epikairopoiimenon-analytikon-programmaton/.
  33. Mogavi, R. H., B. Guo, Y. Zhang, E.-U. Haq, P. Hui, and X. Ma. 2022. When Gamification Spoils Your Learning: A Qualitative Case Study of Gamification Misuse in a Language-Learning App. arXiv arXiv:2203.16175. [Google Scholar] [CrossRef]
  34. Molino, M., E. Ingusci, F. Signore, A. Manuti, M. L. Giancaspro, V. Russo, M. Zito, and C. G. Cortese. 2020. Wellbeing Costs of Technology Use during Covid-19 Remote Working: An Investigation Using the Italian Translation of the Technostress Creators Scale. Sustainability 12, 15: 5911. [Google Scholar] [CrossRef]
  35. Deputy Ministry of Research, Innovation and Digital Policy. n.d. Digital Skills – National Action Plan 2021–2025—Gov.cy. Retrieved October 7, 2025. Available online: https://www.gov.cy/dmrid/en/documents/digital-skills-national-action-plan-2021-2025/.
  36. Ng, D. T. K., J. K. L. Leung, M. J. Su, I. H. Y. Yim, M. S. Qiao, and S. K. W. Chu. 2022. AI Literacy Education in Early Childhood Education. In AI Literacy in K-16 Classrooms. Edited by D. T. K. Ng, J. K. L. Leung, M. J. Su, I. H. Y. Yim, M. S. Qiao and S. K. W. Chu. Springer International Publishing: pp. 63–74. [Google Scholar] [CrossRef]
  37. Nguyen, T. N. N., T. T. Tran, N. H. A. Nguyen, H. P. Lam, H. M. S. Nguyen, and N. A. T. Tran. 2025. The Benefits and Challenges of AI Translation Tools in Translation Education at the Tertiary Level: A Systematic Review. International Journal of TESOL & Education 5, 2: 132–148. [Google Scholar] [CrossRef]
  38. Ning, Y., C. Zhang, B. Xu, Y. Zhou, and T. T. Wijaya. 2024. Teachers’ AI-TPACK: Exploring the Relationship between Knowledge Elements. Sustainability 16, 3: 1–23. [Google Scholar] [CrossRef]
  39. Paramasveran, R. a/l, and N. M. Nasri. 2018. Teachers’ Concerns on the Implementation and Practices of i-THINK with Concern Based Adoption Model (CBAM). Creative Education 9, 14: 2183–2191. [Google Scholar] [CrossRef]
  40. Perrotta, C., and N. Selwyn. 2020. Deep learning goes to school: Toward a relational understanding of AI in education. Learning, Media and Technology 45, 3: 251–269. [Google Scholar] [CrossRef]
  41. Pitts, G., V. Marcus, and S. Motamedi. 2025. Student Perspectives on the Benefits and Risks of AI in Education. arXiv arXiv:2505.02198. [Google Scholar] [CrossRef]
  42. Republic of Cyprus. 2024. Regulatory Administrative Act No. 168/2024. Official Gazette of the Republic. Available online: http://cylaw.org/KDP/2024.html.
  43. Republic of Cyprus Governance. 2025. Στρατηγική Προώθησης της Τεχνητής Νοημοσύνης στην Κύπρο [Strategy for the Promotion of Artificial Intelligence in Cyprus]. Republic of Cyprus Governance. Available online: https://www.diakivernisi.gov.cy/gr/epikairotita/stratigiki-prowthisis-tis-texnitis-noimosynis-stin-kypro.
  44. Rogers, E. M., A. Singhal, and M. M. Quinlan. 2008. Diffusion of Innovations. In An Integrated Approach to Communication Theory and Research, 2nd ed. Routledge. [Google Scholar]
  45. Schophuizen, M., and M. Kalz. 2020. Educational innovation projects in Dutch higher education: Bottom-up contextual coping to deal with organizational challenges. International Journal of Educational Technology in Higher Education 17, 1: 36. [Google Scholar] [CrossRef]
  46. Selwyn, N., L. Pangrazio, S. Nemorin, and C. Perrotta. 2020. What might the school of 2030 be like? An exercise in social science fiction. Learning, Media and Technology 45, 1: 90–106. [Google Scholar] [CrossRef]
  47. Tan, X., G. Cheng, and M. H. Ling. 2025. Artificial intelligence in teaching and teacher professional development: A systematic review. Computers and Education: Artificial Intelligence 8: 100355. [Google Scholar] [CrossRef]
  48. Tobin, B., M. Farren, and Y. Crotty. 2024. Impacting teaching and learning through collaborative reflective practice. Educational Action Research, 1–21. [Google Scholar] [CrossRef]
  49. Vuorikari, R., S. Kluzer, and Y. Punie. 2022. DigComp 2.2: The Digital Competence Framework for Citizens - With new examples of knowledge, skills and attitudes. In JRC Publications Repository. [Google Scholar] [CrossRef]
  50. Zagami, J. 2024. AI Chatbot Influences on Preservice Teachers’ Understanding of Student Diversity and Lesson Differentiation in Online Initial Teacher Education. International Journal on E-Learning 23, 4: 443–455. [Google Scholar] [CrossRef]
  51. Zawacki-Richter, O., V. I. Marín, M. Bond, and F. Gouverneur. 2019. Systematic review of research on artificial intelligence applications in higher education – where are the educators? International Journal of Educational Technology in Higher Education 16, 1: N.PAG. [Google Scholar] [CrossRef]
  52. Zha, S., Y. Qiao, Q. Hu, Z. Li, J. Gong, and Y. Xu. 2024. Designing Child-Centric AI Learning Environments: Insights from LLM-Enhanced Creative Project-Based Learning. arXiv arXiv:2403.16159. [Google Scholar] [CrossRef]
  53. Zhai, C., S. Wibowo, and L. D. Li. 2024. The effects of over-reliance on AI dialogue systems on students’ cognitive abilities: A systematic review. Smart Learning Environments 11, 1: 28. [Google Scholar] [CrossRef]
Table 1. Level of Engagement with AI tools.
Type of Use Interview 1 (Participants) Interview 2 (Participants)
Non-Use “I haven’t used them myself so far” (Int1P2) “I’ve heard how you can use it, I just haven’t used it yet.” (Int1P3) N/A
Exploratory Use “In general, ChatGPT and I are just getting to know each other.” (Int1P11)
“I experimented and I continue to experiment.” (Int1P21)
N/A
Application Oriented Use N/A “I’ll definitely be using them [...] during preparation [...]” (Int2P3)
“Okay, for lesson preparation, of course it’s useful.[...]” (Int2P17)
Potential Integration/Use N/A “Yes, I will continue with it. I want to learn more about the tools you recommended, to explore them myself first and find ideas to build on. [...]” (Int1P2)
“I’m considering it, yes. I think it’s a necessary evil. [...]” (Int2P4)
“I want to continue. If I have the same conditions as this year, I’ll definitely continue.” (Int2P5)
Systematic Use “Okay, we use ChatGPT quite often… also Magic School, Ideogram, and Muse.ai for creating images, DALL·E, and… I’m forgetting the name of another one that’s also for text generation” (Int1P4)
“Lately, I’ve started using ChatGPT more intensively.” (Int1P7)
“Yes, today we already had our first [lesson using Artificial Intelligence tools] with the little ones [students] […]” (Int1P10)
“Definitely, it’s done — they’ve now become part of my life. Although it scares me a little, in the sense that I now consider it… especially ChatGPT… I consider it essential. [...]” (Int2P7)
“Of course, I use them here for my PhD, academically. Not just for that… let alone for my everyday teaching practice.” (Int2P2)
“Yes, absolutely. As a person, as a teacher, and as a mom.” (Int2P24)
Note: Quotes are selected for clarity and representativeness. Some categories include more examples due to the depth of participant responses. N/A indicates “not applicable”.
Table 2. Use of Artificial Intelligence Tools for Teaching Preparation.
Subcategory Description of Use Indicative Examples
Creation and adaptation (differentiation) of teaching material Simplifying texts, creating differentiated content, and adapting materials to learning levels “Then I gave them the different texts I had created with the help of ChatGPT, which corresponded to the three learning levels.” (Int2P13)
Text simplification for language support Producing simpler or grade- appropriate texts with targeted vocabulary “And I can ask the AI to create a text that would fit, let’s say, fifth grade or first or second grade, and that includes a lot of adjectives.” (Int1P4)
Creation of worksheets, quizzes, and assessments Rapid creation of worksheets and questions using AI-based tools “…I need a worksheet, there’s no reason to waste time. I go to ChatGPT or Magic School, input my text, and immediately get a worksheet.” (Int2P9)
Interactive media and gamified activities Use of game-based platforms (e.g., Quizizz, Kahoot) to promote engagement “Some game-like questions were used for text comprehension through Kahoot. Not Kahoot, Quizlet.” (Int2P13)
Creation of assessments using AI Using AI to design differentiated or adaptive assessment tools “This is the third attempt, the third artificial intelligence tool. It’s Quizziz AI. I chose it as a third, different option aimed at creating assessment quizzes.” (TAP3P15)
Creation of multimodal and visual material to support teaching Generation of videos, images, or presentations to enhance understanding and engagement “I once made a little video for a lesson, and the kids really liked it.” (Int2P19)
Support in lesson planning AI suggested lesson ideas, full plans, and activities tailored to student needs “…it gives me ideas, and then I build on those ideas, I start building something else, and then a new idea comes to me…” (Int2P7)
Note: Subcategories reflect themes identified through qualitative coding of participant interviews and TAPs.
Table 3. AI-Supported Student Creativity: Group and Individual Practices.
Subcategory Examples
Group Creation and Reflection “What I tried with the kids, and it worked, let’s say, was this: how I would say it and how Artificial Intelligence would say it. We essentially compared our own text with the one generated by the AI.” (Int1P4)
“We had [Gemini] make predictions, I think, if I remember correctly. We gave it, let’s say, some words to generate predictions about the text, and then we compared them with the students’.” (Int1P7)
“We do this with the students, you enter a description, and it gives you an image, quite a specific one. Or for text, you can input a paragraph, I mean a paragraph of text, and it generates images for you.” (Int2P4)
“We also made a song about the [March 25th] heroes. Using Suno.” (Int2P10)
“…I used [ChatGPT] first in the whole class so they could see that I type to it, describe it, and then, when I click for it to generate the image, before we went down to the lab, and then I would give it corrections, ask it to redo it, and then we went down to the lab.” (Int2P14)
Individual Use and Personalized Support “I said I had used Canva last year to create a poster in Math that the students wanted to make about grouping problems.” (Int1P10)
“I also used it based on the children’s descriptions to create something. [...] so a video was created that we shared with the parents and played in class.” (Int1P21)
“…a lot with images, because we were doing compositions, and based on the descriptions the students gave, it generated their little image for them to include under their writing in their notebooks.” (Int2P2)
“...we did an activity with the younger students about coins, they had to write a description using Magic Media, to write the description.” (Int2P7)
Note: Subcategories reflect patterns identified through thematic coding of participant interviews. The table illustrates how educators integrated AI to support creative expression in both collaborative and individualized learning contexts.
Table 9. Ethical and Pedagogical Concerns of Educators.
Subcategory Examples
Ethical concerns about the protection of personal data and student safety “Yes, or if we’re going to use photos and things that will be in the classroom again. I think, I’m not sure, are we covered in all aspects?” (Int2P3)
“[...] it needs to be used within an environment that is safe for the children. There should be something that is controlled at the school level and appropriate for the age of the students we have.” (Int2P5)
Fear of teacher replacement / devaluation of learning “Doubts, okay, we all generally have doubts about artificial intelligence, about how far it’s going to go, how much we can... it serves us now, but you also think about how much AI can serve us, I mean, more in terms of ethics. And when you have children, you think about that aspect too.”(Int2P4)
“[...] what really concerns me about the danger related to Artificial Intelligence is that we might reach a point where we no longer create anything ourselves, and everything will be ready-made, and we’ll start to consider the AI-generated product aesthetically pleasing? That’s something that worries and scares me.” (Int2P9)
Concerns about the authenticity of learning and critical thinking “You need to be careful to check that what it tells you is actually correct, you have to test it first. It shouldn’t be like, I typed it in, it gave me something, and that’s it.” (Int2P7)
“As for involving students in this area, okay, that’s where it requires a more careful consideration of your approach, a good understanding of your students’ abilities, what you’ve taught them so far, what their capabilities are. And I need to think carefully about where my role as a teacher ends and where this new kind of work begins.” (Int2P10)
Parental reactions and cultural/religious sensitivities “Also, the Ministry needs to have convinced the parents that they should accept these tools, so that we don’t have to deal with parental backlash.” (Int2P11)
“…It’s not just the kids, it’s also the parents. Because as a teacher, you have a lot to deal with if you fall into a trap, let’s say.[...] There are parents who don’t want this happening. Some are very religious, or… There are people who genuinely don’t want their children getting too involved in all this.” (Int2P19)
Equity of opportunities between schools “From the start, it needs to ensure the strengthening of these networks, labs, tablets, there needs to be, in any case, some equality among schools when it comes to this technological matter.” (Int2P2)
“The Ministry needs to manage this technological equipment issue... Not all schools have someone who’s involved or knowledgeable about it.” (Int2P4)
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.