Article
Preserved in Portico. This version is not peer-reviewed.
Federated Distillation Methodology for Label-based Group Structures
Version 1: Received: 11 October 2023 / Approved: 11 October 2023 / Online: 12 October 2023 (09:32:31 CEST)
A peer-reviewed article of this Preprint also exists.
Yang, G.; Tae, H. Federated Distillation Methodology for Label-Based Group Structures. Appl. Sci. 2024, 14, 277.
Abstract
In federated learning (FL), clients train models locally without sharing raw data, preserving data privacy. In particular, federated distillation transfers knowledge to clients regardless of model architecture. However, when groups of clients with different data distributions exist, sharing the same knowledge among all clients becomes impractical. To address this issue, this paper presents an approach that clusters clients based on the outputs of models trained locally on each client's own data. Clients are clustered according to their models' predictions for each label on a public dataset, and prior knowledge of the number of groups is not required. Evaluations on the MNIST dataset showed that our method accurately identified these group structures and improved accuracy by 15–75% compared with traditional federated distillation algorithms when distinct group structures were present. Additionally, we observed significant performance improvements in smaller client groups, bringing us closer to fair FL.
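The clustering step described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration, not the paper's implementation: it assumes each client is summarized by its model's average per-label prediction vector on a shared public dataset, and uses agglomerative clustering with a distance cutoff (rather than a fixed number of clusters) so that the number of groups is discovered from the data. All names, thresholds, and the simulated data are illustrative.

```python
# Hypothetical sketch: cluster clients by their models' label-wise
# predictions on a shared public dataset, without fixing the number
# of groups in advance. Simulated profiles stand in for real model outputs.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Simulated per-label average prediction profiles on a 10-class public set:
# clients 0-2 behave like one group, clients 3-4 like another.
group_a = rng.normal(loc=0.8, scale=0.02, size=(3, 10))
group_b = rng.normal(loc=0.2, scale=0.02, size=(2, 10))
client_profiles = np.vstack([group_a, group_b])

# Agglomerative clustering with a distance threshold instead of a fixed k,
# so the group structure emerges from the prediction profiles themselves.
Z = linkage(client_profiles, method="average", metric="euclidean")
labels = fcluster(Z, t=1.0, criterion="distance")
print(labels)  # clients with similar prediction profiles share a cluster label
```

With a suitable threshold, clients whose models make similar label-wise predictions end up in the same cluster, after which knowledge distillation can be performed within each group rather than globally.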
Keywords
federated learning; distillation; federated distillation; clustering
Subject
Computer Science and Mathematics, Artificial Intelligence and Machine Learning
Copyright: This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.