Preprint
Article

Assessing Visual Science Literacy Using Functional Near-Infrared Spectroscopy with an Artificial Neural Network

This version is not peer-reviewed.

Submitted: 16 October 2021

Posted: 19 October 2021


Abstract
The primary barrier to understanding visual and abstract information in STEM fields is representational competence: the ability to generate, transform, analyze, and explain representations. The relationship between foundational visual literacy and domain-specific science literacy is known; however, how science literacy develops as a function of science learning is still not well understood, despite investigation across many fields. To improve students' representational competence and promote learning in science, visualization skills must first be identified. This project details the development of an artificial neural network (ANN) capable of measuring and modeling visual science literacy (VSL) from neurological measurements acquired with functional near-infrared spectroscopy (fNIRS). The developed model can classify levels of scientific visual literacy, allowing educators and curriculum designers to create more targeted and immersive classroom resources, such as virtual reality, that strengthen the fundamental visual tools of science.
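To make the general approach concrete, the sketch below shows one way an ANN classifier could map preprocessed fNIRS features to VSL levels. This is not the authors' model: the feature layout (mean HbO/HbR changes per channel), the three VSL classes, the network size, and the synthetic data are all illustrative assumptions, and scikit-learn's MLPClassifier stands in for whatever architecture the paper actually uses.

```python
# Minimal sketch (assumptions, not the paper's implementation): a feedforward
# ANN that classifies visual science literacy (VSL) level from fNIRS features.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import classification_report

# Hypothetical dataset: one row per participant/trial, columns are mean
# oxy-/deoxy-hemoglobin (HbO/HbR) changes per fNIRS channel.
rng = np.random.default_rng(0)
n_trials, n_channels = 120, 16
X = rng.normal(size=(n_trials, 2 * n_channels))   # HbO + HbR features
y = rng.integers(0, 3, size=n_trials)             # 0 = low, 1 = mid, 2 = high VSL

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, stratify=y, random_state=0)

# Standardize features, then fit a small fully connected network.
scaler = StandardScaler().fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0)
clf.fit(scaler.transform(X_train), y_train)

print(classification_report(y_test, clf.predict(scaler.transform(X_test))))
```

In practice, the raw fNIRS time series would need filtering, motion-artifact correction, and epoching before feature extraction; the sketch assumes those steps have already produced per-channel summary features.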
Keywords: 
Copyright: This open access article is published under a Creative Commons CC BY 4.0 license, which permits free download, distribution, and reuse, provided that the author and preprint are cited in any reuse.

