Article
Version 1
Preserved in Portico. This version is not peer-reviewed.
Implicit Calibration Using Probable Fixation Targets
Version 1: Received: 8 November 2018 / Approved: 12 November 2018 / Online: 12 November 2018 (05:02:54 CET)
A peer-reviewed article of this Preprint also exists.
Kasprowski, P.; Harezlak, K.; Skurowski, P. Implicit Calibration Using Probable Fixation Targets. Sensors 2019, 19, 216.
Abstract
Proper calibration of the eye movement signal registered by an eye tracker is one of the main challenges in popularizing eye trackers as yet another user input device. Classic calibration methods, which take time and impose unnatural behavior on users, have to be replaced by intelligent methods that are able to calibrate the signal without the users' conscious cooperation. Such an implicit calibration requires some knowledge about the stimulus a person is looking at and uses this information to predict probable gaze targets. The paper describes one possible method for performing implicit calibration: it starts by finding probable fixation targets (PFTs) and then uses these targets to build a mapping from the eye movement signal to a probable gaze path. Various algorithms that may be used for finding PFTs and building the mapping are presented in the paper, and errors are calculated using two datasets registered with two different types of eye trackers. The results show that, although implicit calibration currently provides results worse than the classic one, it may be comparable with it and sufficient for some applications.
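The mapping step described in the abstract can be illustrated with a minimal sketch. The snippet below is not the authors' implementation: it assumes fixations are given in uncalibrated eye-tracker coordinates and PFT positions in screen coordinates, and it fits a simple affine mapping by repeatedly assigning each fixation to its nearest PFT and re-solving a least-squares fit (an ICP-like simplification of the general idea); the function name and parameters are hypothetical.

```python
import numpy as np

def implicit_calibration(fixations, pfts, n_iter=10):
    """Hypothetical sketch of implicit calibration with probable fixation
    targets (PFTs): alternate between (1) mapping uncalibrated fixations
    to the screen with the current affine transform, (2) assigning each
    mapped fixation to its nearest PFT, and (3) refitting the transform
    by least squares against those assigned targets.

    fixations : (N, 2) array of uncalibrated fixation coordinates
    pfts      : (M, 2) array of probable fixation targets in screen coordinates
    returns   : (3, 2) affine matrix A so that [x, y, 1] @ A ~ screen position
    """
    # Homogeneous coordinates for the fixations: columns x, y, 1.
    X = np.hstack([fixations, np.ones((len(fixations), 1))])
    # Start from an identity-like mapping (eye coords passed through unchanged).
    A = np.eye(3, 2)

    for _ in range(n_iter):
        mapped = X @ A                                    # current gaze estimates
        # Distance from every mapped fixation to every PFT, shape (N, M).
        d = np.linalg.norm(mapped[:, None, :] - pfts[None, :, :], axis=2)
        targets = pfts[np.argmin(d, axis=1)]              # nearest PFT per fixation
        A, *_ = np.linalg.lstsq(X, targets, rcond=None)   # refit the mapping
    return A
```

The hard nearest-PFT assignment is only one design choice; because several candidate targets may compete for the same fixation, variants that weight candidates by their probability of being looked at (as the notion of "probable" fixation targets suggests) are a natural refinement, and the paper compares several such algorithms on the two datasets.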
Keywords
eye tracking; calibration; eye movement; optimization
Subject
Computer Science and Mathematics, Computer Science
Copyright: This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.