Preprint Article, Version 1 (preserved in Portico; this version is not peer-reviewed)

Prediction of Students' Adaptability Using Explainable AI in Educational Machine Learning Models

Version 1: Received: 14 May 2024 / Approved: 14 May 2024 / Online: 14 May 2024 (10:00:45 CEST)

How to cite: Nnadi, L. C.; Watanobe, Y.; Rahman, M. M.; John-Otumu, A. M. Prediction of Students' Adaptability Using Explainable AI in Educational Machine Learning Models. Preprints 2024, 2024050933. https://doi.org/10.20944/preprints202405.0933.v1

Abstract

As the educational landscape evolves, understanding and fostering student adaptability has become increasingly critical. This study presents a comparative analysis of explainable AI (XAI) techniques for interpreting machine learning models that classify student adaptability levels. Leveraging a robust dataset, we employed several machine learning algorithms, with a particular focus on Random Forest, which achieved 91% accuracy. Our study utilizes SHapley Additive exPlanations (SHAP), Local Interpretable Model-agnostic Explanations (LIME), Anchors, Accumulated Local Effects (ALE), and counterfactual explanations to reveal the contributions of individual features to adaptability predictions. 'Class Duration' and 'Financial Condition' consistently emerge as key factors, while the study also underscores the subtler effects of 'Institution Type' and 'Load-shedding'. This multi-faceted interpretability approach bridges the gap between machine learning performance and educational relevance, yielding a model that not only predicts but also explains the dynamic factors influencing student adaptability. The synthesized insights advocate for educational policies that accommodate socioeconomic factors, instructional time, and infrastructure stability to enhance student adaptability. The implications extend to informed, personalized educational interventions that foster an adaptable learning environment. This methodical research contributes to the responsible application of AI in education, promoting predictive and interpretable models for equitable and effective educational strategies.
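To illustrate the kind of workflow the abstract describes, the sketch below trains a Random Forest classifier on a student-adaptability dataset and computes SHAP feature attributions. It is a minimal illustration, not the authors' implementation: the file name `students_adaptability.csv`, the target column `Adaptivity Level`, and the model hyperparameters are assumptions for demonstration only.

```python
# Hedged sketch: Random Forest + SHAP on an assumed student-adaptability dataset.
# File name, column names, and hyperparameters are hypothetical.
import pandas as pd
import shap
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.read_csv("students_adaptability.csv")                 # hypothetical file
X = pd.get_dummies(df.drop(columns=["Adaptivity Level"]))     # one-hot encode categorical features
y = df["Adaptivity Level"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)
print(f"Test accuracy: {model.score(X_test, y_test):.2f}")

# Global feature attributions: which features (e.g., class duration,
# financial condition) drive the adaptability predictions.
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)
shap.summary_plot(shap_values, X_test)
```

The same fitted model could be passed to LIME, Anchors, or ALE explainers to obtain the complementary local and global views discussed in the paper.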

Keywords

Comparative Analysis; Educational Data Mining; Educational Predictive Modelling; Explainable AI (XAI); Feature Importance; Machine Learning Interpretability; Model Transparency; Predictive Analytics in Education; Student Adaptability; AI in Education Policy

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning
