Do Written Responses to Open-Ended Questions on Fourth-Grade Formative Assessments in Mathematics Help Predict Scores on End-of-Year Standardized Tests?
Abstract: Predicting long-term student learning is a critical task for teachers and for educational data mining. However, most models do not consider two situations typical of real-life classrooms. The first is that teachers develop their own questions for formative assessment; as a result, there is a huge number of possible questions, each of which is answered by only a few students. The second is that formative assessment often involves open-ended questions that students answer in writing. Such questions are highly valuable, but analyzing the responses automatically can be complex. In this paper, we address these two challenges. We analyzed 621,575 answers to closed-ended questions and 16,618 answers to open-ended questions by 464 fourth-graders from 24 low-SES schools. We constructed a classifier to detect incoherent responses to open-ended mathematics questions and then used it in a model to predict scores on an end-of-year national standardized test. We found that although students answered 36.4 times fewer open-ended questions than closed-ended questions, including features of the students' open responses in our model improved our prediction of their end-of-year test scores. To the best of our knowledge, this is the first time that a predictor of end-of-year test scores has been improved by using automatically detected features of answers to open-ended questions on formative assessments.
Keywords:
Subject: Social Sciences - Education
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.
Preprints.org is a free preprint server supported by MDPI in Basel, Switzerland.