Submitted: 14 October 2025
Posted: 15 October 2025
Abstract
Keywords:
1. Introduction
- FACS (Facial Action Coding System) for scientifically validated decoding of subtle facial cues;
- Artificial intelligence (AI) for automated analysis and adaptive, real-time feedback;
- Virtual reality (VR) for immersive, controlled environments in which teachers can safely experiment with emotionally complex classroom situations.
- the ability-based model of emotional intelligence (Mayer et al., 2016);
- design-based research (DBR) methodologies (McKenney & Reeves, 2012);
- simulation-based teacher-training environments such as SimSchool (Gibson et al., 2007; Clift & Choi, 2020).
| Study | FACS decoding | AI real-time feedback | VR immersion | Pedagogical integration | Ethical safeguards |
| --- | --- | --- | --- | --- | --- |
| Chiu et al. (2023) | ✘ | ✔ | ✘ | ✘ | ✘ |
| Rodríguez-Andrés et al. (2022) | ✘ | ✘ | ✔ | Partial | ✘ |
| Makransky & Mayer (2022) | ✘ | ✘ | ✔ | ✔ | ✘ |
| Zhang et al. (2023) | ✘ | ✔ | ✔ | Partial | ✘ |
| E-MOTE | ✔ | ✔ | ✔ | ✔ | ✔ |
2. FACS as a Tool for Decoding Emotional Micro-Expressions in Educational Contexts
| AU code & name | Muscle movement | Inferred emotional state | Potential pedagogical significance & response |
| --- | --- | --- | --- |
| AU1: inner brow raiser | Eyebrows raised inward | Worry, sadness, concentration | May indicate a student is struggling with a concept. A teacher could respond with a clarifying question, offer encouragement, or check for understanding. |
| AU4: brow lowerer | Eyebrows lowered/drawn together | Frustration, anger, intense focus | Signals rising frustration or cognitive load. This is a key moment to intervene with scaffolding, a short break, or by re-framing the task to prevent disengagement. |
| AU5: upper lid raiser | Upper eyelid raised, eye widening | Fear, surprise, vigilance | May signal alarm or anxious vigilance. A teacher could slow the pace, offer reassurance, or lower the perceived stakes of the task. |
- Sadness often involves AU1, AU4, and AU15 (lip corner depressor), signalling a need for empathetic connection or support.
- Genuine happiness or engagement is reflected in the concurrent activation of AU6 (cheek raiser) and AU12 (lip corner puller), the "Duchenne smile," which can serve as positive feedback for a teacher's instructional approach.
- Boredom or disengagement is often signaled by AU17 (chin raiser) and AU25 (lips part), indicating a need to increase the lesson's pace, interactivity, or relevance.
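As a minimal illustration of how the AU configurations above could be operationalized, the following Python sketch maps the AU combinations named in the text to their inferred states. The exact-set matching and the label strings are assumptions for illustration; a deployed system would use probabilistic, frame-level models rather than lookup tables.

```python
# Illustrative mapping of FACS Action Unit (AU) combinations to the inferred
# states discussed above. Pattern sets mirror the examples in the text.
AU_PATTERNS = {
    frozenset({"AU1", "AU4", "AU15"}): "sadness (possible need for support)",
    frozenset({"AU6", "AU12"}): "genuine happiness (Duchenne smile)",
    frozenset({"AU17", "AU25"}): "boredom / disengagement",
}

def infer_state(active_aus):
    """Return the inferred states whose full AU pattern is active."""
    active = set(active_aus)
    return [state for pattern, state in AU_PATTERNS.items()
            if pattern <= active]

print(infer_state(["AU6", "AU12"]))  # Duchenne smile detected
```

Because the check is subset-based, a face showing extra AUs (e.g., AU1+AU4+AU15 plus AU25) still matches the sadness pattern, which is closer to how blended expressions occur in practice.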
3. Integrating AI with VR into Teacher Training
3.1. Operational Workflow of the AI-FACS-VR Module
3.2. Advanced Use Cases of the AI-FACS-VR Integration
- Practicing inclusive classroom management: VR enables the creation of highly realistic simulations that replicate the complexities of diverse classrooms. These immersive environments foster reflective awareness by allowing teachers to observe their own behavioral responses in emotionally charged situations.
  - Scenarios may include mediating interpersonal conflicts or de-escalating a frustrated student (practicing responses to sustained AU4 and AU7). Teachers can interact with avatars representing students with emotional regulation difficulties, allowing them to practice co-regulation strategies in a safe space.
  - Pedagogical transfer: the immediate feedback on AUs helps teachers calibrate their ability to detect early signs of disengagement (e.g., AU17) or anxiety (AU1+AU20), enabling them to make timely instructional decisions like reframing a task or offering validation before a student fully disengages.
- Developing culturally responsive cue recognition: simulations can be designed to model culturally specific nuances in emotional expression, training teachers to avoid misinterpretations. The AI feedback can be calibrated to different cultural datasets, making the teacher aware that the intensity or meaning of an AU configuration (like AU12 for a smile) can vary.
- Formative assessment of teacher noticing: the system provides aggregated, objective data on a teacher's "noticing" skills. For example, a post-session report could reveal that a teacher consistently missed subtle cues of confusion (AU1) in a particular student avatar, indicating a specific area for growth in their perceptual acuity.
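The feedback loop described above can be sketched in a few lines: compare the AUs the AI detected on a student avatar with the cues the teacher reported noticing, and emit missed-cue messages of the kind the framework envisions. The function name, the meaning table, and the message wording are illustrative assumptions, not part of any published E-MOTE API.

```python
# Hypothetical mapping from pedagogically relevant AUs to their meaning,
# following the examples given in the text (AU4 frustration, AU17
# disengagement, AU1 confusion/worry).
PEDAGOGICAL_MEANING = {
    "AU4": "frustration",
    "AU17": "disengagement",
    "AU1": "confusion or worry",
}

def feedback(detected, noticed, student):
    """Return feedback strings for cues the AI detected but the teacher missed."""
    msgs = []
    for au in detected:
        if au not in noticed and au in PEDAGOGICAL_MEANING:
            msgs.append(f"You missed a cue of {PEDAGOGICAL_MEANING[au]} "
                        f"({au}) in {student}")
    return msgs

print(feedback(["AU4", "AU17"], ["AU17"], "Student A"))
```

In a real system the `detected` set would come from frame-level AU classification and the `noticed` set from the teacher's in-VR responses, but the comparison logic would have this shape.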
4. Core Professional Competencies and Ethical Foundations
- Emotional attunement: the ability to accurately recognize students’ facial cues, including micro-expressions, with the support of AI-based detection systems, and to integrate this information to adapt communicative tone and teaching strategies in real time. This aligns with the ability model of emotional intelligence, which defines EI as a set of cognitive-emotional skills including perception, understanding, and regulation of emotions (Mayer et al., 2008; 2016).
- Empathic responsiveness: practiced through immersive VR scenarios that allow teachers to simulate context-sensitive reactions and emotionally supportive behaviors, a process shown to enhance learning outcomes through emotionally engaging and cognitively structured experiences (Makransky & Petersen, 2021a; Parong & Mayer, 2018), and supported by evidence on the role of presence and immersion in fostering psychological engagement in VR environments (Slater, 2018).
- Reflective growth: fostered through self-assessment dashboards and post-simulation analytics that translate micro-expression data into personalized development goals, in alignment with a vision of teacher learning as inquiry-based and situated within communities of practice (Cochran-Smith & Lytle, 1999).
- Ethical inclusion: grounded in Nussbaum's (2010) capabilities approach, which emphasizes justice, empathy, and respect for the dignity and potential of every learner.
5. Projected Pedagogical Impacts and Classroom Implications
6. Application Prospects and Systemic Integration
- Initial teacher training: universities could integrate E-MOTE modules to provide foundational training in emotional competence, combining theoretical FACS instruction with immersive VR simulations to build perceptual skills from the outset of a teacher's career.
- Ongoing professional development: for in-service teachers, E-MOTE could offer advanced modules focused on specific challenges, such as managing inclusive classrooms or recognizing cross-cultural emotional expressions, using the high-fidelity simulations for deliberate practice.
- Tool for reflective inquiry: the post-simulation dashboards provide objective data on a teacher's "noticing," making it a powerful tool for self-assessment and coaching within professional learning communities.
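The post-session "noticing" report mentioned above amounts to simple aggregation over logged cue events. The sketch below, with an assumed event format of `(avatar, AU, noticed)` tuples, shows how per-avatar hit rates could surface a pattern such as consistently missed AU1 cues in one student avatar.

```python
# Illustrative post-session aggregation: hit rate of noticed cues per
# (avatar, AU) pair. The event tuple format is an assumption for this sketch.
from collections import defaultdict

def noticing_report(events):
    """events: iterable of (avatar, au, noticed: bool) -> hit rate per (avatar, au)."""
    shown = defaultdict(int)
    hit = defaultdict(int)
    for avatar, au, noticed in events:
        shown[(avatar, au)] += 1
        if noticed:
            hit[(avatar, au)] += 1
    return {key: hit[key] / shown[key] for key in shown}

events = [("Avatar 3", "AU1", False), ("Avatar 3", "AU1", False),
          ("Avatar 3", "AU1", True), ("Avatar 1", "AU4", True)]
print(noticing_report(events))  # Avatar 3's AU1 cues are mostly missed
```

A dashboard would render these ratios over time, turning raw perceptual data into the personalized development goals described in Section 4.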
7. Limitations and Future Research Agenda
- Objective: to assess the accuracy and reliability of the AI-FACS-VR module in a controlled laboratory setting and establish its basic usability.
- Research priorities:
  1. Algorithmic accuracy: evaluate the performance of the FACS-decoding AI when applied to the facial animations of virtual student avatars, ensuring it can accurately identify pedagogically relevant AUs in real time.
  2. User experience (UX) and usability: conduct studies with pre-service teachers to assess the usability of the VR interface and the perceived usefulness of the feedback mechanisms (e.g., in-VR cues, post-simulation dashboard) using think-aloud protocols and standardized UX scales.
  3. Mitigating bias: initiate the development of culturally diverse, FACS-annotated datasets of virtual expressions to audit and mitigate algorithmic bias, ensuring the system does not misclassify emotions based on avatar ethnicity or gender (Mehrabi et al., 2021).
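The algorithmic-accuracy priority above implies frame-level comparison against FACS-annotated ground truth. A minimal sketch of such a check, computing per-AU precision, recall, and F1, is given below; real audits would use established metrics tooling and much larger annotated corpora, and the data here are invented for illustration.

```python
# Per-AU F1 of predicted AU sets against annotated ground truth, one set of
# active AUs per animation frame. Pure-stdlib sketch of a Phase 1 accuracy check.
def per_au_f1(ground_truth, predictions, au):
    """ground_truth, predictions: aligned lists of per-frame AU sets."""
    tp = fp = fn = 0
    for gt, pred in zip(ground_truth, predictions):
        if au in pred and au in gt:
            tp += 1
        elif au in pred:
            fp += 1  # predicted but not annotated
        elif au in gt:
            fn += 1  # annotated but missed
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return (2 * precision * recall / (precision + recall)
            if precision + recall else 0.0)

gt   = [{"AU4"}, {"AU4"}, set(), {"AU1"}]      # hypothetical annotations
pred = [{"AU4"}, set(), {"AU4"}, {"AU1"}]      # hypothetical model output
print(per_au_f1(gt, pred, "AU4"))  # 0.5
```

Reporting the metric per AU, rather than as a single aggregate, is what makes it possible to verify that pedagogically critical cues (e.g., AU4) are detected reliably and not masked by easier ones.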
- Objective: to evaluate the framework's impact on teacher learning and competency development in controlled training contexts.
- Research priorities:
  1. Impact on noticing skills: employ experimental designs with pre- and post-assessments to determine if training with E-MOTE leads to significant improvements in teachers' accuracy and speed in detecting micro-expressions in standardized video clips of students.
  2. Perceived utility and ethical concerns: explore teachers' perceptions of the tool's pedagogical integration, ethical implications, and alignment with professional values through mixed-methods studies (surveys, focus groups).
  3. Cross-cultural calibration: initiate cross-cultural validation by testing the framework across diverse educational contexts, examining how cultural norms influence the interpretation of AUs and adapting feedback mechanisms accordingly, in line with the OECD PISA Global Competence Framework (Organisation for Economic Co-operation and Development, 2019).
- Objective: to assess the transfer of trained skills to real classrooms and the long-term impact on teaching practice and student outcomes.
- Research priorities:
  1. Transfer to practice: conduct longitudinal studies that track teachers from VR training into their classrooms, using observational methods to see if improved perceptual acuity in simulation leads to more responsive and inclusive teaching behaviors in practice.
  2. Impact on classroom climate and student outcomes: investigate the downstream effects of E-MOTE training by measuring its correlation with improved classroom climate, student engagement, and student self-reports of well-being.
- Data privacy and governance: all phases must implement and refine the proposed ethical safeguards (granular consent, data minimization, pseudonymization) and subject them to independent ethical review.
- Teacher readiness and technostress: research must investigate the optimal support structures, such as structured professional development and technical coaching, needed to foster pedagogical fluency and reduce technostress.
- Health and safety: best practices for VR use (e.g., session limits, structured debriefings) must be empirically tested and integrated into the protocol to minimize simulator sickness and visual fatigue (Rebenitsch & Owen, 2016).
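The pseudonymization safeguard named above can be concretized with a short sketch: participant identifiers are replaced with keyed hashes before storage, so analytics can link a teacher's sessions without ever holding their identity. The key name, field names, and truncation length are assumptions for illustration; key management, consent records, and GDPR erasure workflows are out of scope here.

```python
# Minimal pseudonymization sketch: deterministic keyed hashing of participant
# IDs, so records remain linkable across sessions without exposing identity.
import hashlib
import hmac

SECRET_KEY = b"replace-with-managed-secret"  # assumption: stored outside the dataset

def pseudonymize(participant_id: str) -> str:
    """Keyed hash (HMAC-SHA256, truncated) standing in for the raw identifier."""
    return hmac.new(SECRET_KEY, participant_id.encode(),
                    hashlib.sha256).hexdigest()[:16]

# Hypothetical stored record: pseudonym plus minimized analytics fields only.
record = {"participant": pseudonymize("teacher-042"), "AU4_missed": 3}
print(record["participant"])  # same input always yields the same pseudonym
```

Using a keyed hash rather than a plain hash matters: without the secret key, an attacker who guesses an identifier cannot confirm the guess against the stored pseudonyms, which supports the data-minimization stance the safeguards require.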
8. Conclusions
References
- Adadi, A.; Berrada, M. Peeking inside the black-box: a survey on explainable artificial intelligence (XAI). IEEE Access 2018, 6, 52138–52160.
- Adolphs, R. Recognizing emotion from facial expressions: psychological and neurological mechanisms. Behavioral and Cognitive Neuroscience Reviews 2002, 1(1), 21–62.
- Adolphs, R. The social brain: neural basis of social behavior. Annual Review of Psychology 2009, 60, 693–716.
- Alfredo, R.; Echeverria, V.; Jin, Y.; Yan, L.; Swiecki, Z.; Gašević, D.; Martinez-Maldonado, R. Human-centred learning analytics and AI in education: a systematic literature review. Computers & Education: Artificial Intelligence 2024, 6(5), Article 100215.
- Azukas, M. E.; Kluk, D. Simulated teaching: an exploration of virtual classroom simulation for preservice teachers during the COVID-19 pandemic. In Exploring online learning through simulations and gaming; Graham, C. R., Krutka, D., Kimmons, S., Eds.; Springer, 2022; pp. 97–112.
- Badiee, F.; Kaufman, D. Design evaluation of a simulation for teacher education. SAGE Open 2015, 5(2), 2158244015592454.
- Banaji, M. R.; Greenwald, A. G. Blindspot: hidden biases of good people; Delacorte Press, 2013.
- Billingsley, G.; Smith, S.; Smith, S.; Meritt, J. A systematic review of immersive virtual reality in teacher education: current applications and future directions. Journal of Research on Technology in Education 2023a, 55(1), 106–128.
- Billingsley, B.; Bertram, C.; Nassaji, M. How should AI be taught in schools? Ethical and pedagogical issues to consider. Frontiers in Education 2023b, 8, Article 1145665.
- Blakemore, S. J.; Frith, U. The learning brain: lessons for education; Blackwell Publishing, 2005.
- Brackett, M. A.; Rivers, S. E.; Reyes, M. R.; Salovey, P. Enhancing academic performance and social and emotional competence with the RULER feeling words curriculum. Learning and Individual Differences 2012, 22(2), 218–224.
- Chen, C.; Jack, R. E. Discovering cultural differences through emotion perception, learning, and reverse correlation. Current Opinion in Psychology 2017, 17, 44–48.
- Chernikova, O.; Heitzmann, N.; Stadler, M.; Holzberger, D.; Seidel, T.; Fischer, F. Simulation-based learning in higher education: a meta-analysis. Review of Educational Research 2020, 90(4), 499–541.
- Chiu, T. K. F.; Lin, T.-J.; Lonka, K. A systematic review of research on artificial intelligence applications in higher education: learning effectiveness and future directions. Computers and Education: Artificial Intelligence 2023a, 4, 100142.
- Chiu, T. K. F.; Hew, K. F.; Ng, C. S. L. A systematic review of artificial intelligence applications in K–12 education: learning outcomes, learner characteristics, and pedagogical strategies. British Journal of Educational Technology 2023b, 54(2), 357–378.
- Chiu, T. K. F.; Lin, T.-J.; Chai, C. S.; Pak, R.; Zhan, Y. Artificial intelligence in education: a systematic review and future research directions. Computers and Education: Artificial Intelligence 2023c, 4, 100108.
- Christensen, R.; Knezek, G.; Tyler-Wood, T.; Gibson, D. SimSchool: an online dynamic simulator for enhancing teacher preparation. Journal of Technology and Teacher Education 2011, 19(3), 277–292.
- Clift, R. T.; Choi, S. The use of simulation in teacher education: a review of SimSchool and beyond. Teaching and Teacher Education 2020, 91, 103037.
- Cochran-Smith, M.; Lytle, S. L. Relationships of knowledge and practice: teacher learning in communities. Review of Research in Education 1999, 24(1), 249–305.
- Cohn, J. F.; Ekman, P. Observer-based measurement of facial expression with the Facial Action Coding System. In Handbook of Emotion Elicitation and Assessment; Coan, J. A., Allen, J. B., Eds.; Oxford University Press, 2005; pp. 111–134.
- Cook-Sather, A. Sound, presence, and power: "student voice" in educational research and reform. Curriculum Inquiry 2006, 36(4), 359–390.
- Damasio, A. R. Descartes' error: emotion, reason, and the human brain; G. P. Putnam’s Sons, 1994.
- Deale, D. F.; Pastore, R. S. Evaluation of SimSchool: an instructional simulation for preservice teachers. Journal of Educational Technology Systems 2014, 42(3), 255–268.
- Dimitropoulos, K.; Manitsaris, S.; Tsalakanidou, F. Capturing and analyzing affective features for teacher training on classroom management. IEEE Transactions on Affective Computing 2021, 12(3), 790–802.
- D’Mello, S.; Graesser, A. AutoTutor and affective AutoTutor: learning by talking with cognitively and emotionally intelligent computers that talk back. ACM Transactions on Interactive Intelligent Systems (TiiS) 2012, 2(4), 1–39.
- D’Mello, S.; Kory, J. A review and meta-analysis of multimodal affect detection systems. ACM Computing Surveys 2015, 47(3), 1–36.
- D’Mello, S. K.; Dieterle, E.; Duckworth, A. Advanced, analytic, and automated: data science and the future of learning assessment. Journal of Educational Psychology 2017, 109(7), 1010–1025.
- Durlak, J. A.; Weissberg, R. P.; Dymnicki, A. B.; Taylor, R. D.; Schellinger, K. B. The impact of enhancing students’ social and emotional learning: a meta-analysis of school-based universal interventions. Child Development 2011, 82(1), 405–432.
- Ekman, P.; Friesen, W. V. Cultural differences in facial expression of emotion. In Nebraska Symposium on Motivation; Cole, J., Ed.; University of Nebraska Press, 1972; Vol. 19, pp. 207–283.
- Ekman, P. An argument for basic emotions. Cognition & Emotion 1992, 6(3–4), 169–200.
- Ekman, P. Emotions revealed: recognizing faces and feelings to improve communication and emotional life; Times Books, 2003.
- European Parliament and Council of the European Union. Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 (General Data Protection Regulation). Official Journal of the European Union 2016, L119, 1–88.
- European Commission. Ethical guidelines on the use of artificial intelligence and data in teaching and learning for educators; Publications Office of the European Union, 2022.
- Gibson, D.; Aldrich, C.; Prensky, M., Eds. Games and simulations in online learning: research and development frameworks; IGI Global: Hershey, PA, 2007.
- Gibson, D. SimSchool: an online dynamic simulator for enhancing teacher preparation. International Journal of Learning Technology 2011, 6(2), 201–220.
- Gross, J. J. Emotion regulation: current status and future prospects. Psychological Inquiry 2015, 26(1), 1–26.
- Halberstadt, A. G.; Cooke, A. N.; Garner, P. W.; Hughes, S. A.; Oertwig, D.; Neupert, S. D. Racialized emotion recognition accuracy and anger bias of children’s faces. Emotion 2022, 22(3), 403–417.
- Hargreaves, A. The emotional practice of teaching. Teaching and Teacher Education 2000, 16(8), 811–826.
- Holmes, W.; Bialik, M.; Fadel, C. Artificial intelligence in education: promises and implications for teaching and learning, 1st ed.; Center for Curriculum Redesign, 2019.
- Hopper, S. B. Developing teacher know-how through play in SimSchool. International Journal of Teaching and Learning in Higher Education 2018, 30(1), 46–56.
- Howard-Jones, P. A. Neuroscience and education: myths and messages. Nature Reviews Neuroscience 2014, 15(12), 817–824.
- Huang, Y.; Li, H.; Fong, R. The application of artificial intelligence in virtual reality learning environments: a systematic review. Interactive Learning Environments 2021, 29(6), 1038–1057.
- Immordino-Yang, M. H.; Yang, X. F.; Damasio, H. Cultural modes of expressing emotions influence how emotions are experienced. Emotion 2016a, 16(7), 1033–1039.
- Immordino-Yang, M. H. Emotion, sociality, and the brain’s default mode network: insights for educational practice and policy. Policy Insights from the Behavioral and Brain Sciences 2016b, 3(2), 211–219.
- Jack, R. E.; Garrod, O. G. B.; Yu, H.; Caldara, R.; Schyns, P. G. Facial expressions of emotion are not culturally universal. Proceedings of the National Academy of Sciences 2012a, 109(19), 7241–7244.
- Jack, R. E.; Caldara, R.; Schyns, P. G. Internal representations reveal cultural diversity in expectations of facial expressions of emotion. Journal of Experimental Psychology: General 2012b, 141(1), 19–25.
- Jennings, P. A.; Greenberg, M. T. The prosocial classroom: teacher social and emotional competence in relation to student and classroom outcomes. Review of Educational Research 2009, 79(1), 491–525.
- Kröger, J. L.; Meißner, F.; Rannenberg, K. Granular privacy policies with privacy scorecards: transparency of privacy practices through privacy icons. In 2019 IEEE Security and Privacy Workshops (SPW); 2019; pp. 62–69.
- Liu, Y.; Wang, H. Teachers’ emotional perception and regulation in classroom interactions: implications for socio-emotional teacher education. Teaching and Teacher Education 2023, 125, 104021.
- Livingstone, M. Vision and art: the biology of seeing; Harry N. Abrams, 2002.
- Lucey, P.; Cohn, J. F.; Kanade, T.; Saragih, J.; Ambadar, Z.; Matthews, I. The extended Cohn-Kanade dataset (CK+): a complete dataset for action unit and emotion-specified expression. In Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops; 2010; pp. 94–101.
- Luckin, R.; Holmes, W.; Griffiths, M.; Forcier, L. B. R. Intelligence unleashed: an argument for AI in education; Pearson Education, 2016.
- Luckin, R. Machine learning and human intelligence: the future of education for the 21st century; UCL Press, 2018.
- Mayer, J. D.; Salovey, P.; Caruso, D. R. Emotional intelligence: new ability or eclectic traits? American Psychologist 2008, 63(6), 503–517.
- Mayer, J. D.; Salovey, P.; Caruso, D. R. The ability model of emotional intelligence: principles and updates. Emotion Review 2016, 8(4), 290–300.
- Makransky, G.; Petersen, G. B. The cognitive affective model of immersive learning (CAMIL): a theoretical framework for learning in virtual reality. Educational Psychology Review 2021a, 33, 937–958.
- Makransky, G.; Petersen, G. B. Investigating the process of learning with simulation-based virtual reality: a structural equation modeling approach. Computers & Education 2021b, 166, 104154.
- Makransky, G.; Mayer, R. E. Benefits of taking a virtual field trip in immersive virtual reality: evidence for the immersion principle in multimedia learning. Educational Psychology Review 2022, 34, 1771–1798.
- Matsumoto, D.; Ekman, P. American-Japanese cultural differences in intensity ratings of facial expressions of emotion. Motivation and Emotion 1989, 13(2), 143–157.
- Matsumoto, D.; Hwang, H. C. Culture and nonverbal behavior. In The Sage handbook of nonverbal communication; Manusov, V., Patterson, M. L., Eds.; Sage Publications, 2006; pp. 219–236.
- Matsumoto, D.; Hwang, H. S. Evidence for training the ability to read microexpressions of emotion. Motivation and Emotion 2011, 35(2), 181–191.
- McKenney, S.; Reeves, T. C. Conducting educational design research; Routledge, 2012.
- Mehrabi, N.; Morstatter, F.; Saxena, N.; Lerman, K.; Galstyan, A. A survey on bias and fairness in machine learning. ACM Computing Surveys 2021, 54(6), 1–35.
- Ntoutsi, E.; Fafalios, P.; Gadiraju, U.; Iosifidis, V.; Nejdl, W.; Vidal, M.-E.; Ruggieri, S.; Turini, F.; Papadopoulos, S.; Krasanakis, E.; Buchmann, E.; Monreale, A.; Pensa, R. G.; Dragoni, M.; Hitzler, P. Bias in data-driven artificial intelligence systems—an introductory survey. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery 2020, 10(3), e1356.
- Nussbaum, M. C. Creating capabilities: the human development approach; Harvard University Press, 2010.
- Organisation for Economic Co-operation and Development. OECD PISA global competence framework; OECD Publishing, 2019.
- Pantic, M.; Rothkrantz, L. J. M. Automatic analysis of facial expressions: the state of the art. IEEE Transactions on Pattern Analysis and Machine Intelligence 2000, 22(12), 1424–1445.
- Parong, J.; Mayer, R. E. Learning science in immersive virtual reality: effects of prior knowledge and learning activity. Journal of Educational Psychology 2018, 110(6), 785–797.
- Picard, R. W. Affective computing; MIT Press, 1997.
- Politou, E.; Alepis, E.; Patsakis, C. Forgetting personal data and revoking consent under the GDPR: challenges and proposed solutions. Journal of Cybersecurity 2018, 4(1), tyy001.
- Porter, S.; Ten Brinke, L. Secrets and lies: involuntary leakage in deceptive facial expressions as a function of emotional intensity. Journal of Nonverbal Behavior 2018, 42(1), 35–56.
- Rebenitsch, L.; Owen, C. Review on cybersickness in applications and visual displays. Virtual Reality 2016, 20, 101–125.
- Rodríguez-Andrés, D.; Juan, M. C.; Mollá, R.; Méndez-López, M. Virtual reality systems as tools for teacher training on emotional competence: a systematic review. Education and Information Technologies 2022, 27(4), 5053–5082.
- Siegel, D. J. The developing mind: how relationships and the brain interact to shape who we are, 3rd ed.; Guilford Press, 2020.
- Slater, M. Immersion and the illusion of presence in virtual reality. British Journal of Psychology 2018, 109(3), 431–433.
- Soranzo, A.; Newberry, M. The uncatchable smile in Leonardo da Vinci’s La Bella Principessa portrait. Vision Research 2015, 113, 78–86.
- Soranzo, A.; Newberry, M. Investigating the “Uncatchable Smile” in Leonardo da Vinci’s La Bella Principessa: a comparison with the Mona Lisa and Pollaiuolo’s Portrait of a Girl. Journal of Visualized Experiments (JoVE) 2016, (116), e54248.
- Soranzo, A. Another ambiguous expression by Leonardo da Vinci. Gestalt Theory 2022, 44(1–2), 41–60.
- Soranzo, A. The psychology of Mona Lisa’s smile. Scientific Reports 2024, 14(1), 12250.
- Sutton, R. E.; Wheatley, K. F. Teachers' emotions and teaching: a review of the literature and directions for future research. Educational Psychology Review 2003, 15(4), 327–358.
- Tian, Y.; Kanade, T.; Cohn, J. F. Facial expression recognition. In Handbook of face recognition; Li, S. Z., Jain, A. K., Eds.; Springer, 2011; pp. 487–519.
- Ting-Toomey, S. Communicating across cultures; Guilford Press, 1999.
- United Nations Educational, Scientific and Cultural Organization. Recommendation on the ethics of artificial intelligence; UNESCO Publishing, 2023.
- Urhahne, D.; Zhu, M. Accuracy of teachers’ judgments of students’ subjective well-being. Learning and Individual Differences 2015, 43, 226–232.
- Valbusa, F.; Tagliabue, L.; Vergani, L. Fostering social-emotional learning in higher education: the impact of a mindfulness-based training program on teachers' well-being and classroom climate. Frontiers in Psychology 2022, 13, 928341.
- Venetz, M.; Zurbriggen, C.; Schwab, S. What do teachers think about their students’ inclusion? Consistency of students’ self-reports and teacher ratings. Frontiers in Psychology 2019, 10, 1637.
- Wagemans, J.; Elder, J. H.; Kubovy, M.; Palmer, S. E.; Peterson, M. A.; Singh, M.; Von Der Heydt, R. A century of Gestalt psychology in visual perception: I. Perceptual grouping and figure–ground organization. Psychological Bulletin 2012, 138(6), 1172–1217.
- Zhang, Z.; Fort, J. M.; Giménez-Mateu, L. Facial expression recognition in virtual reality environments: challenges and opportunities. Frontiers in Psychology 2023, 14, 1280136.
- Zawacki-Richter, O.; Marín, V. I.; Bond, M.; Gouverneur, F. Systematic review of research on artificial intelligence applications in higher education – where are the educators? International Journal of Educational Technology in Higher Education 2019, 16(1), Article 39.
- Zeki, S. Inner vision: an exploration of art and the brain; Oxford University Press, 1999.
- Zurbriggen, C. L. A.; Nusser, L.; Krischler, M.; Schmitt, M. Teachers’ judgment accuracy of students’ subjective well-being in school: in search of explanatory factors. Teaching and Teacher Education 2023, 122, 104304.
| Dimension | SimSchool | E-MOTE |
| --- | --- | --- |
| Primary training focus | Classroom management and instructional decision-making using AI-driven learner profiles | Perceptual acuity and emotional responsiveness through validated micro-expression decoding, immersive VR, and AI-driven adaptive feedback |
| Underlying approach | Relies on scripted learner behaviors and pre-programmed scenarios, not real affective data | Data-driven emotional analytics using FACS to decode authentic, non-verbal cues in real-time, combined with competency-based training pathways |
| Feedback mechanism | Primarily post-simulation analysis of teacher actions on student macro-behaviors (e.g., "engagement" score) | Real-time, granular feedback on perceptual performance (e.g., "You missed a cue of frustration (AU4) in Student A"), provided both in-VR and via post-simulation analytics |
| Technological components | AI-driven behavioral simulation of students | Unified ecosystem: FACS, AI analytics, VR immersion, multimodal feedback, and culturally responsive calibration |
| Scope of innovation | A mature platform for simulating classroom dynamics and pedagogical strategies | Advances the field by targeting the foundational skill of emotional perception, uniting validated micro-expression analysis, immersive practice, and adaptive feedback in a single training ecosystem designed for scalability |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).