Preprint · Dataset · Version 1 · Preserved in Portico · This version is not peer-reviewed

Dataset for Eye-Tracking Tasks

Version 1: Received: 1 December 2020 / Approved: 2 December 2020 / Online: 2 December 2020 (08:00:46 CET)

How to cite: Rakhmatulin, I. Dataset for Eye-Tracking Tasks. Preprints 2020, 2020120047. https://doi.org/10.20944/preprints202012.0047.v1

Abstract

In recent years, many deep neural networks have been developed, but because of their large number of layers, training them requires long compute times and large datasets. It is now common to use pre-trained deep networks for a wide range of tasks, even simple ones for which such deep networks are not required. Well-known networks such as YOLOv3 and SSD are designed to detect and track many different object classes; as a result, their weights are heavy and their accuracy on a narrow, specific task is low. Eye-tracking tasks require detecting only a single object, the iris, within a given area, so it is logical to use a neural network dedicated to this task alone. The obstacle is the lack of suitable datasets for training such a model. In this manuscript, we present a dataset suitable for training custom convolutional neural network models for eye-tracking tasks. Using this dataset, anyone can independently pre-train convolutional neural network models for eye tracking. The dataset contains 10,000 annotated eye images at a resolution of 416 by 416 pixels, and the accompanying annotation table gives the coordinates and radius of the iris for each image. The manuscript can also serve as a guide for preparing datasets for eye-tracking devices.
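As a minimal sketch of how such annotations might be used, the snippet below loads an annotation table, overlays the labelled iris circle on one image, and converts the circle to a normalised single-class, YOLO-style bounding box. The file name and column names ("annotations.csv", "image", "x", "y", "r") are not specified in the preprint and are assumptions for illustration only; adjust them to match the actual dataset layout.

    # Sketch only: assumed CSV layout with columns image, x, y, r (pixels).
    import cv2
    import pandas as pd

    annotations = pd.read_csv("annotations.csv")   # one row per annotated eye image

    row = annotations.iloc[0]
    image = cv2.imread(row["image"])               # 416x416 image, as described in the abstract

    # Visualise the label: iris centre (x, y) and radius r, all in pixels.
    cv2.circle(image, (int(row["x"]), int(row["y"])), int(row["r"]), (0, 255, 0), 2)
    cv2.imwrite("annotated_example.png", image)

    # Convert the circle to a normalised YOLO-style box (class x_center y_center w h)
    # for use with a single-class detector.
    w = h = 416
    print(0, row["x"] / w, row["y"] / h, 2 * row["r"] / w, 2 * row["r"] / h)

The circle-to-box conversion simply takes the iris diameter as the box width and height; whether that is the intended training target depends on the detector architecture chosen.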

Keywords

eye tracking dataset; gaze tracking dataset; iris tracking dataset; CNN for eye-tracking; neural networks for eye-tracking

Subject

Engineering, Automotive Engineering



×
Alerts
Notify me about updates to this article or when a peer-reviewed version is published.
We use cookies on our website to ensure you get the best experience.
Read more about our cookies here.