
FedHB: Hierarchical Bayesian Federated Learning

About

We propose a novel hierarchical Bayesian approach to Federated Learning (FL), in which the generative process of clients' local data is described by a hierarchical Bayesian model: the clients' local model parameters are random variables governed by a higher-level global variable. Interestingly, variational inference in this model yields an optimisation problem whose block-coordinate descent solution is a distributed algorithm that is separable over clients and never requires them to reveal their private data, making it fully compatible with FL. We also show that particular forms of our block-coordinate algorithm subsume well-known FL algorithms, including Fed-Avg and Fed-Prox, as special cases. Beyond the novel modelling and derivations, we offer a convergence analysis showing that our block-coordinate FL algorithm converges to a (local) optimum of the objective at the rate $O(1/\sqrt{t})$, the same rate as standard (centralised) SGD, as well as a generalisation error analysis proving that the test error of our model on unseen data vanishes as the training data size grows, so the model is asymptotically optimal.
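To illustrate the block-coordinate structure described above, here is a minimal sketch (not the paper's actual algorithm) of alternating client/server updates with a proximal coupling between local models and a global variable. The quadratic local losses, the coupling strength `lam`, and all variable names are illustrative assumptions; with this proximal term the client step resembles a Fed-Prox-style update, and its fixed point matches a Fed-Avg-style average.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: client i holds a quadratic stand-in for its private
# training loss, L_i(theta) = 0.5 * ||theta - c_i||^2.
client_centers = [rng.normal(size=3) for _ in range(5)]
lam = 1.0           # assumed coupling strength between local and global models
beta = np.zeros(3)  # higher-level global variable

for _ in range(100):
    # Client block: each client minimises
    #   L_i(theta) + (lam / 2) * ||theta - beta||^2
    # using only its own data (closed form for the quadratic stand-in).
    thetas = [(c + lam * beta) / (1.0 + lam) for c in client_centers]
    # Server block: update the global variable from the local models alone,
    # so no raw client data is ever communicated.
    beta = np.mean(thetas, axis=0)

# For these quadratic losses the iteration contracts to the average of the
# client optima, i.e. the Fed-Avg-style aggregate.
print(np.allclose(beta, np.mean(client_centers, axis=0), atol=1e-6))  # True
```

Taking `lam` large forces local models toward the global variable (pure averaging behaviour), while a finite `lam` lets each client retain a personalised deviation from it.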

Minyoung Kim, Timothy Hospedales · 2023

Related benchmarks

Task                           Dataset                                          Accuracy   Rank
Image Classification           EMNIST                                           88.97      82
Personalization                CIFAR-100 (test)                                 82.71      80
Classification                 FMNIST                                           84.28      53
Global Prediction              CIFAR-100 original uncorrupted (train)           41.64      48
Image Classification           CIFAR-C-100 10 training corruption types (test)  91.26      48
Personalization Performance    CIFAR-C-100 (9 held-out corruption types)        84.88      48
Global Prediction              CIFAR-100 initial (test)                         57.8       24
Global Prediction Performance  MNIST                                            97.94      20
Image Classification           FMNIST                                           92.54      20