Article
Version 1
This version is not peer-reviewed
Handling Incomplete Instance Annotations via Asymmetric Loss Function
Version 1: Received: 14 January 2021 / Approved: 15 January 2021 / Online: 15 January 2021 (15:44:51 CET)
How to cite: Chen, F.; Pound, M.; French, A. Handling Incomplete Instance Annotations via Asymmetric Loss Function. Preprints 2021, 2021010300.
Abstract
Annotating training data is a time-consuming and labor-intensive process in deep learning, especially for images with many objects present. In this paper, we propose a method that allows deep networks to be trained on data with reduced numbers of annotations per image in heatmap regression tasks (e.g. object detection and counting), by applying an asymmetric loss function. In a real scenario, this reduction of annotations can be imposed by the researchers (e.g. asking the annotators to label only 50% of what they see in each image), or can potentially counteract labels unintentionally missed by the annotators. To demonstrate the effectiveness of our method, we conduct experiments in two domains, crowd counting and wheat spikelet detection, using different deep network architectures. We drop various percentages of instance annotations per image in training. Results show that an asymmetric loss function is effective across different models and datasets, even in very extreme cases where limited annotations are provided (e.g. 90% of the original annotations removed). Whilst tuning of the key parameters is required, we find that setting conservative parameter values can help in more realistic situations, where only small amounts of data have been missed by annotators.
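The abstract does not give the exact formulation of the loss, but the core idea of an asymmetric loss for heatmap regression with incomplete annotations can be sketched as follows: errors where the network predicts a response *above* the (possibly incomplete) target heatmap are down-weighted, so that detecting an unannotated object is penalized less than missing an annotated one. The function name and the weighting parameter `beta` below are illustrative assumptions, not the paper's notation:

```python
import numpy as np

def asymmetric_mse(pred, target, beta=0.1):
    """Illustrative asymmetric squared-error loss for heatmap regression.

    beta < 1 down-weights positions where the prediction exceeds the
    target heatmap (pred - target > 0), i.e. where the network "sees"
    an object the incomplete annotation does not contain. beta is a
    hypothetical tuning parameter, not taken from the paper.
    """
    diff = pred - target
    # full weight for under-prediction, reduced weight for over-prediction
    weights = np.where(diff > 0, beta, 1.0)
    return float(np.mean(weights * diff ** 2))
```

With this weighting, over-predicting by 1.0 costs only `beta` times as much as under-predicting by the same amount, which matches the abstract's motivation: a conservative (small) `beta` tolerates responses at unlabeled object locations without destroying the supervision signal at labeled ones.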
Keywords
Deep Learning; Reducing Training Annotations per Image; Object Detection; Object Counting; Asymmetric Loss Function
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.