Submitted: 14 April 2025
Posted: 15 April 2025
Abstract
Keywords:
1. Introduction
1.1. Eye Tracking in Healthcare
1.2. Challenges and Limitations
1.3. State of the Art
1.4. Smartphone-Based Eye Tracking
1.5. Contribution
2. Methods
2.1. Experiment Definition
2.2. Algorithm Development
2.3. Algorithm Validation and Error Avoidance
3. Results
3.1. CHT_ACM versus CHT_TM
3.2. With Fingers versus Without Fingers Condition
3.3. Comparison Among Tasks
3.4. Comparison Among Iris Colours
4. Discussion
4.1. Limitations
5. Conclusion
Supplementary Materials
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| AD | Alzheimer's Disease |
| AI | Artificial Intelligence |
| ALS | Amyotrophic Lateral Sclerosis |
| BSREC | Biomedical and Scientific Research Ethics Committee |
| CHT | Circular Hough Transform |
| CHT_ACM | CHT plus Active Contour Model |
| CHT_TM | CHT plus Template Matching |
| CNN | Convolutional Neural Network |
| DPI | Dual Purkinje Image |
| EOG | Electro-Oculography |
| fMRI | Functional Magnetic Resonance Imaging |
| IOG | Infrared Oculography |
| MAE | Mean Absolute Error |
| MPE | Mean Percentage Error |
| PCC | Pearson Correlation Coefficient |
| RMSE | Root Mean Square Error |
| ROI | Region of Interest |
| VOG | Video-Oculography |
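Since both algorithm variants above (CHT_TM and CHT_ACM) start from the Circular Hough Transform, a minimal voting implementation may help fix ideas. This is an illustrative sketch only — the function name, the brute-force accumulator, and the synthetic "iris" edge are assumptions of this example, not the authors' implementation:

```python
import numpy as np

def hough_circle(edge_points, shape, radii, n_angles=120):
    """Brute-force Circular Hough Transform.

    Each edge point votes for every candidate centre (cx, cy) lying at
    distance r from it, for each candidate radius r. The peak of the
    3-D accumulator gives the best-supported circle.
    """
    h, w = shape
    acc = np.zeros((len(radii), h, w), dtype=np.int32)
    thetas = np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False)
    for y, x in edge_points:
        for ri, r in enumerate(radii):
            cx = np.rint(x - r * np.cos(thetas)).astype(int)
            cy = np.rint(y - r * np.sin(thetas)).astype(int)
            ok = (cx >= 0) & (cx < w) & (cy >= 0) & (cy < h)
            acc[ri, cy[ok], cx[ok]] += 1
    ri, best_cy, best_cx = np.unravel_index(np.argmax(acc), acc.shape)
    return best_cx, best_cy, radii[ri]

# Synthetic "iris" edge: a circle of radius 10 centred at (x=30, y=25).
ts = np.linspace(0.0, 2.0 * np.pi, 80, endpoint=False)
edges = [(int(round(25 + 10 * np.sin(t))), int(round(30 + 10 * np.cos(t))))
         for t in ts]
cx, cy, r = hough_circle(edges, shape=(60, 60), radii=range(8, 13))
```

In practice a production pipeline would use an optimised implementation (e.g. the gradient-based variant in OpenCV's `cv2.HoughCircles`) rather than this dense accumulator, which is cubic in the search space.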







| Technology | Advantages | Disadvantages |
|---|---|---|
| fMRI and other technology | For specific purposes | Without multiple eye measures |
| Eye tracking: EOG | Cheap, well established, readily available, high temporal resolution, no need for eyes to be open | Low accuracy and safety |
| Eye tracking: VOG (stationary) | High specifications | Needs stimuli (likely in research scenarios) |
| Eye tracking: VOG (mobile) | No need for head stability, real-world tasks, lightweight | Low data quality, low sampling rates, poor spatial resolution |
| Infrared cameras (pupil centre) | Not influenced by environmental light levels | Pupil change error (pupils do not dilate or constrict symmetrically around their centre) |
| Metric | CHT_TM | CHT_ACM |
|---|---|---|
| MAE for x | 1.432* | 1.774 |
| MAE for y | 1.752* | 2.079 |
| MPE for x | 1.059* | 1.290 |
| MPE for y | 1.213* | 1.423 |
| RMSE for x | 2.362* | 2.514 |
| RMSE for y | 2.554* | 3.069 |
| Pearson's r (p-value) for x | 0.993 (<0.0001) | 0.993 (<0.0001) |
| Pearson's r (p-value) for y | 0.990 (<0.0001) | 0.986 (<0.0001) |
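The agreement metrics reported in this and the following tables (MAE, MPE, RMSE and Pearson's r) can be computed from a predicted and a ground-truth coordinate series as sketched below. The normalisation used here for MPE (absolute error as a percentage of the ground-truth range) is an assumption of this example, since the paper's exact definition is not reproduced in this excerpt:

```python
import numpy as np

def agreement_metrics(pred, true):
    """MAE, MPE, RMSE and Pearson's r between two coordinate series."""
    pred = np.asarray(pred, dtype=float)
    true = np.asarray(true, dtype=float)
    err = pred - true
    mae = np.mean(np.abs(err))             # Mean Absolute Error
    # Mean Percentage Error, normalised here by the ground-truth range
    # (assumed definition; the original normalisation may differ).
    mpe = 100.0 * mae / (true.max() - true.min())
    rmse = np.sqrt(np.mean(err ** 2))      # Root Mean Square Error
    r = np.corrcoef(pred, true)[0, 1]      # Pearson Correlation Coefficient
    return mae, mpe, rmse, r
```

The p-values accompanying Pearson's r in the tables would come from a significance test such as `scipy.stats.pearsonr`, which returns both the coefficient and its p-value.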
| No finger | MAE | MPE | RMSE | PCC_r |
|---|---|---|---|---|
| x_CHT_TM | 0.467* | 0.304* | 0.748* | 0.999 |
| y_CHT_TM | 1.697 | 1.241 | 2.396 | 0.984 |
| x_CHT_ACM | 1.36 | 0.904 | 1.997 | 0.995 |
| y_CHT_ACM | 1.337 | 0.969 | 1.876 | 0.990 |

| Finger | MAE | MPE | RMSE | PCC_r |
|---|---|---|---|---|
| x_CHT_TM | 0.541 | 0.403 | 0.849 | 0.999 |
| y_CHT_TM | 0.739* | 0.476* | 0.995* | 0.997 |
| x_CHT_ACM | 0.901* | 0.653* | 1.252* | 0.998 |
| y_CHT_ACM | 1.009* | 0.653* | 1.760* | 0.993 |
| Vertical | MAE | MPE | RMSE | PCC_r |
|---|---|---|---|---|
| x_CHT_TM | 1.291 | 0.911 | 2.175 | 0.995 |
| y_CHT_TM | 1.659 | 1.153 | 2.190 | 0.992 |
| x_CHT_ACM | 1.665 | 1.147* | 2.237* | 0.996 |
| y_CHT_ACM | 2.015 | 1.367 | 2.786 | 0.984 |

| Horizontal | MAE | MPE | RMSE | PCC_r |
|---|---|---|---|---|
| x_CHT_TM | 1.020* | 0.726* | 1.553* | 0.997 |
| y_CHT_TM | 1.672 | 1.184 | 2.520 | 0.964 |
| x_CHT_ACM | 2.128 | 1.516 | 3.014 | 0.988 |
| y_CHT_ACM | 2.298 | 1.588 | 3.489 | 0.934 |

| Circular | MAE | MPE | RMSE | PCC_r |
|---|---|---|---|---|
| x_CHT_TM | 1.755 | 1.260 | 2.925 | 0.985 |
| y_CHT_TM | 2.054 | 1.386 | 3.039 | 0.993 |
| x_CHT_ACM | 1.747 | 1.273 | 2.416 | 0.990 |
| y_CHT_ACM | 2.009 | 1.347* | 3.086 | 0.993 |

| Fixation | MAE | MPE | RMSE | PCC_r |
|---|---|---|---|---|
| x_CHT_TM | 1.537 | 1.353 | 2.182 | 0.999 |
| y_CHT_TM | 1.276* | 0.922* | 1.665* | 0.947 |
| x_CHT_ACM | 1.428* | 1.186 | 2.246 | 0.996 |
| y_CHT_ACM | 1.997* | 1.427 | 2.685* | 0.850 |
|  | MAE | MPE | RMSE | PCC_r |
|---|---|---|---|---|
| x_CHT_TM | Horizontal/Circle | Horizontal/Fixation | Horizontal/Circle | Fixation/Circle |
| y_CHT_TM | Fixation/Circle | Fixation/Circle | Fixation/Circle | Circle/Fixation |
| x_CHT_ACM | Fixation/Horizontal | Vertical/Horizontal | Vertical/Horizontal | Vertical/Horizontal |
| y_CHT_ACM | Fixation/Horizontal | Circle/Horizontal | Fixation/Horizontal | Circle/Fixation |
| Subject 1 | MAE | MPE | RMSE | PCC_r |
|---|---|---|---|---|
| x_CHT_TM | 0.497* | 0.343* | 0.789* | 0.999 |
| y_CHT_TM | 1.325* | 0.945* | 1.974 | 0.992 |
| x_CHT_ACM | 1.182* | 0.807* | 1.746* | 0.996 |
| y_CHT_ACM | 1.210* | 0.847* | 1.832* | 0.993 |

| Subject 2 | MAE | MPE | RMSE | PCC_r |
|---|---|---|---|---|
| x_CHT_TM | 2.791 | 1.985 | 3.732 | 0.972 |
| y_CHT_TM | 2.814 | 1.759 | 3.686 | 0.973 |
| x_CHT_ACM | 2.528 | 1.802 | 3.291 | 0.981 |
| y_CHT_ACM | 3.689 | 2.329 | 4.695 | 0.924 |

| Subject 3 | MAE | MPE | RMSE | PCC_r |
|---|---|---|---|---|
| x_CHT_TM | 1.822 | 1.493 | 2.512 | 0.988 |
| y_CHT_TM | 1.415 | 1.143 | 1.957* | 0.991 |
| x_CHT_ACM | 2.148 | 1.712 | 2.823 | 0.990 |
| y_CHT_ACM | 2.037 | 1.583 | 2.734 | 0.980 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).