Preprint Article, Version 1 (not peer-reviewed)

Averaging is Probably not the Optimum Way of Aggregating Parameters in Federated Learning

Version 1 : Received: 17 January 2020 / Approved: 19 January 2020 / Online: 19 January 2020 (04:37:27 CET)

How to cite: Xiao, P.; Cheng, S.; Stankovic, V.; Vukobratovic, D. Averaging is Probably not the Optimum Way of Aggregating Parameters in Federated Learning. Preprints 2020, 2020010207 (doi: 10.20944/preprints202001.0207.v1).

Abstract

Federated learning is a decentralized approach to deep learning that trains a shared model on data distributed across clients (such as mobile phones and wearable devices), preserving data privacy by never exposing raw data to the central server. After each client computes new model parameters via stochastic gradient descent (SGD) on its own local data, the locally computed parameters are aggregated at the server to produce an updated global model. Almost all current studies average the client-computed parameters by default, but none explains why averaging is a good approach. In this paper, we treat each client's computed parameters as a random vector, owing to the stochastic nature of SGD, and estimate the mutual information between the parameters computed by two clients at different training phases, using two estimation methods on two learning tasks. The results confirm that different clients' parameters are correlated and show that their mutual information increases over training iterations. However, when we further compute the distance between client-computed parameters, we find that the parameters become more correlated without becoming closer. This phenomenon suggests that averaging may not be the optimum way of aggregating trained parameters.
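The aggregation step examined in the abstract can be illustrated with a minimal sketch of federated averaging (FedAvg). All names here are illustrative: each client runs a local gradient step on a simple least-squares objective, and the server averages the resulting parameter vectors. This is the baseline scheme whose optimality the paper questions, not the authors' proposed method.

```python
import numpy as np

def local_sgd_update(params, data, lr=0.1):
    # Hypothetical local update: one gradient step on a
    # least-squares objective over this client's data shard.
    X, y = data
    grad = X.T @ (X @ params - y) / len(y)
    return params - lr * grad

def federated_average(client_params):
    # The server-side aggregation the paper scrutinizes:
    # a plain coordinate-wise average of client parameters.
    return np.mean(client_params, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([1.0, -2.0])   # synthetic ground-truth weights
global_w = np.zeros(2)

# Each client holds its own private data shard.
clients = []
for _ in range(3):
    X = rng.normal(size=(20, 2))
    y = X @ true_w + 0.01 * rng.normal(size=20)
    clients.append((X, y))

# Repeated rounds: broadcast global model, local update, average.
for _ in range(50):
    local = [local_sgd_update(global_w.copy(), d) for d in clients]
    global_w = federated_average(local)

print(global_w)
```

Under these assumptions the averaged model converges near the true weights; the paper's point is that client parameters can grow more correlated without converging toward one another, so this simple average need not be the best aggregator in general.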

Subject Areas

federated learning; federated averaging; mutual information; correlation

