Preprint Article · Version 1 · Preserved in Portico · This version is not peer-reviewed

Similarity Based Framework for Unsupervised Domain Adaptation: Peer Reviewing Policy for Pseudo-Labeling

Version 1 : Received: 25 July 2023 / Approved: 26 July 2023 / Online: 27 July 2023 (07:57:13 CEST)

A peer-reviewed article of this Preprint also exists.

Arweiler, J.; Ates, C.; Cerquides, J.; Koch, R.; Bauer, H.-J. Similarity-Based Framework for Unsupervised Domain Adaptation: Peer Reviewing Policy for Pseudo-Labeling. Mach. Learn. Knowl. Extr. 2023, 5, 1474-1492.

Abstract

The inherent dependency of deep learning models on labeled data is a well-known problem and one of the barriers slowing the adoption of such methods in applied sciences and engineering, where experimental and numerical methods can easily generate colossal amounts of unlabeled data. This paper proposes an unsupervised domain adaptation methodology that mimics the peer review process to label new observations from a domain different from that of the training set. The approach evaluates the validity of a labeling hypothesis using domain knowledge acquired from the training set through a similarity analysis, exploring the projected feature space to examine the class centroid shifts. The methodology is tested on a binary classification problem in which synthetic images of cubes and cylinders in different orientations are generated. Under a domain shift in physical feature space, the methodology improves the accuracy of the object classifier from 60% to around 90% without human labeling.
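The centroid-based similarity analysis described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation; function names, the cosine-similarity choice, and the `margin` acceptance threshold are illustrative assumptions standing in for the paper's "peer reviewing" acceptance policy.

```python
import numpy as np

def centroid_pseudo_label(src_feats, src_labels, tgt_feats, margin=0.1):
    """Pseudo-label target samples by cosine similarity to source-class
    centroids; reject ambiguous samples (returned as -1).

    Illustrative sketch only: the margin-based rejection stands in for
    the paper's peer-review-style acceptance policy.
    """
    classes = np.unique(src_labels)
    # Class centroids in the projected feature space
    centroids = np.stack(
        [src_feats[src_labels == c].mean(axis=0) for c in classes]
    )
    # Cosine similarity of every target sample to every centroid
    c_norm = centroids / np.linalg.norm(centroids, axis=1, keepdims=True)
    t_norm = tgt_feats / np.linalg.norm(tgt_feats, axis=1, keepdims=True)
    sims = t_norm @ c_norm.T
    order = np.argsort(sims, axis=1)
    rows = np.arange(len(sims))
    best = sims[rows, order[:, -1]]
    second = sims[rows, order[:, -2]]
    labels = classes[order[:, -1]]
    # "Peer review" step: accept a label only when the best match
    # clearly beats the runner-up
    labels[best - second < margin] = -1
    return labels
```

In the binary cube/cylinder setting, `classes` would hold the two object labels; samples whose similarity gap falls below the threshold remain unlabeled rather than risk propagating a wrong pseudo-label.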

Keywords

unsupervised domain adaptation; pseudo-labeling; transfer learning

Subject

Computer Science and Mathematics, Artificial Intelligence and Machine Learning

