ARTICLE | doi:10.20944/preprints202209.0176.v1
Subject: Mathematics & Computer Science, Applied Mathematics
Keywords: Differential privacy; Federated learning; Vertically partitioned data
Online: 13 September 2022 (10:57:53 CEST)
We present a differentially private extension of block coordinate descent based on objective perturbation. The algorithm iteratively performs linear regression in a federated setting on vertically partitioned data. In addition to a privacy guarantee, the algorithm offers a utility guarantee: a tolerance parameter indicates how much the differentially private regression may deviate from an analysis without differential privacy. We compare the algorithm's performance with that of standard block coordinate descent and study the trade-off between utility and privacy, using both artificial test data and the forest fires data set. We find that the algorithm is fast and generates practical predictions with single-digit privacy budgets, albeit with some loss of accuracy.
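The core idea can be sketched as follows: each party holds one vertical block of features and repeatedly solves a local least-squares subproblem on the current residual, with objective perturbation adding a random linear term to each subproblem. This is a minimal illustration only; the function name, the two-block setup, and the Laplace noise scale are assumptions for the sketch, not the authors' calibrated mechanism or sensitivity analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def dp_block_coordinate_descent(blocks, y, epsilon, n_iters=20):
    """Sketch of DP block coordinate descent for linear regression on
    vertically partitioned data. Each entry of `blocks` is one party's
    feature block; objective perturbation adds a random linear term
    b^T theta_k to each local least-squares objective. The noise scale
    below is illustrative, not a calibrated privacy accounting."""
    thetas = [np.zeros(X.shape[1]) for X in blocks]
    for _ in range(n_iters):
        for k, X in enumerate(blocks):
            # Residual after removing every other block's current fit.
            r = y - sum(B @ t for j, (B, t) in enumerate(zip(blocks, thetas))
                        if j != k)
            # Objective perturbation: minimise ||r - X theta||^2 + b^T theta,
            # whose closed-form minimiser shifts the normal equations by b/2.
            b = rng.laplace(scale=1.0 / epsilon, size=X.shape[1])
            thetas[k] = np.linalg.solve(X.T @ X, X.T @ r - b / 2)
    return thetas
```

A larger privacy budget epsilon shrinks the perturbation, so the fit approaches the non-private block coordinate descent solution, mirroring the utility/privacy trade-off studied in the paper.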