Preprint Article, Version 1 (not peer-reviewed; preserved in Portico)

DaFKD: Domain-aware Federated Knowledge Distillation

Version 1: Received: 22 March 2023 / Approved: 24 March 2023 / Online: 24 March 2023 (13:30:08 CET)

How to cite: Wang, H.; Li, Y.; Xu, W.; Li, R.; Zhan, Y.; Zeng, Z. DaFKD: Domain-aware Federated Knowledge Distillation. Preprints 2023, 2023030432. https://doi.org/10.20944/preprints202303.0432.v1

Abstract

Federated Distillation (FD) has recently attracted increasing attention for its efficiency in aggregating multiple diverse local models trained on the statistically heterogeneous data of distributed clients. Existing FD methods generally treat these models equally, merely averaging their output soft predictions for a given distillation sample; this ignores the diversity across local models and degrades the performance of the aggregated model, especially when some local models have learned little about the sample. In this paper, we propose a new perspective that treats the local data of each client as a specific domain, and we design a novel domain-aware federated knowledge distillation method, dubbed DaFKD, that discerns the importance of each model for a given distillation sample and thereby optimizes the ensemble of soft predictions from diverse models. Specifically, we employ a domain discriminator for each client, trained to identify the correlation factor between a sample and the corresponding domain. Then, to facilitate training of the domain discriminator while saving communication cost, we propose sharing part of its parameters with the classification model. Extensive experiments on various datasets and settings show that the proposed method improves model accuracy by up to $6.02\%$ compared with state-of-the-art baselines.
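
The authors' implementation is not reproduced here, but the following minimal sketch illustrates the aggregation idea the abstract describes: each client's softened prediction for a distillation sample is weighted by the correlation factor produced by that client's domain discriminator before the predictions are combined into a distillation target. The function and variable names (domain_aware_ensemble, client_logits, domain_scores) and the softmax temperature are illustrative assumptions, not the paper's code.

```python
# Minimal sketch (assumed names, not the authors' implementation):
# domain-aware aggregation of client soft predictions for one batch
# of distillation samples.
import numpy as np

def softmax(x, axis=-1, temperature=1.0):
    z = x / temperature
    z = z - z.max(axis=axis, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def domain_aware_ensemble(client_logits, domain_scores, temperature=2.0):
    """Weight each client's softened prediction by how strongly its domain
    discriminator associates the distillation sample with that client's domain.

    client_logits: (K, B, C) logits from K clients on B samples with C classes.
    domain_scores: (K, B) correlation factors in [0, 1] from the K domain
                   discriminators on the same B samples.
    Returns: (B, C) aggregated soft targets used for distillation.
    """
    soft_preds = softmax(client_logits, axis=-1, temperature=temperature)   # (K, B, C)
    weights = domain_scores / (domain_scores.sum(axis=0, keepdims=True) + 1e-12)  # (K, B)
    # Uniform weights would recover the plain averaging used by standard FD methods.
    return (weights[..., None] * soft_preds).sum(axis=0)                    # (B, C)

# Toy usage: 3 clients, 4 distillation samples, 10 classes.
rng = np.random.default_rng(0)
logits = rng.normal(size=(3, 4, 10))
scores = rng.uniform(size=(3, 4))
targets = domain_aware_ensemble(logits, scores)
print(targets.shape, targets.sum(axis=-1))  # (4, 10); each row sums to 1
```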

Keywords

Federated Learning; Knowledge Distillation; Domain-aware

Subject

Computer Science and Mathematics, Computer Networks and Communications

Comments

Comment 1 (received 25 March 2023): Accepted by CVPR 2023. The commenter has declared there is no conflict of interest.

