FedCL: Federated Multi-Phase Curriculum Learning to Synchronously Correlate User Heterogeneity
Federated Learning (FL) is a decentralized learning paradigm for training machine learning models in which a global model iteratively gathers the parameters of local models without accessing their local data. A key challenge in FL is handling the heterogeneity of local data distributions, which causes the global model to drift and hinders convergence. To cope with this challenge, current methods adopt different strategies, such as knowledge distillation, weighted model aggregation, and multi-task learning, as regularization. We refer to these approaches as asynchronous FL, since they align user models in either a local or a post-hoc manner, where model drift has already occurred or has been underestimated. In this paper, we propose an active and synchronous correlation approach to address user heterogeneity in FL. Specifically, we aim to approximate FL to standard deep learning by actively and synchronously scheduling each user's learning pace in every round with a dynamic multi-phase curriculum. A global curriculum is formed on the server by ensembling all user curricula with an auto-regressive auto-encoder. The global curriculum is then divided into multiple phases and broadcast to users to measure and align their domain-agnostic learning pace. Empirical studies demonstrate that our approach equips FL with state-of-the-art generalization performance compared with existing asynchronous approaches, even under severe user heterogeneity.
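To make the scheduling idea concrete, the following is a minimal, hypothetical sketch of synchronous multi-phase curriculum pacing in a federated round. It is not the paper's implementation: the per-sample difficulty scores, phase thresholds, and function names are illustrative assumptions, and the auto-regressive auto-encoder ensembling is replaced by simple pooling with quantile-based phase splits for brevity.

```python
# Hypothetical sketch: server pools client difficulty scores into a global
# curriculum, splits it into phases, and broadcasts one phase threshold per
# round so all clients advance at the same pace. Names are illustrative;
# quantile pooling stands in for the paper's auto-regressive auto-encoder.
import numpy as np

rng = np.random.default_rng(0)
NUM_CLIENTS, SAMPLES_PER_CLIENT, NUM_PHASES, NUM_ROUNDS = 4, 100, 5, 5

# Each client holds per-sample difficulty scores (e.g., current local loss).
client_difficulty = [rng.gamma(2.0, 1.0, SAMPLES_PER_CLIENT)
                     for _ in range(NUM_CLIENTS)]

def build_global_curriculum(all_scores, num_phases):
    """Server: ensemble client curricula into one global curriculum and
    split it into phases via difficulty quantiles (a stand-in for the
    auto-regressive auto-encoder described in the abstract)."""
    pooled = np.concatenate(all_scores)
    # Phase k admits samples whose difficulty is below the k-th quantile.
    return np.quantile(pooled, np.linspace(1.0 / num_phases, 1.0, num_phases))

def local_update(scores, threshold):
    """Client: train only on samples the current phase admits, so every
    client follows the same (domain-agnostic) pace this round."""
    admitted = scores <= threshold
    return admitted.mean()  # fraction of local data used this round

for rnd in range(NUM_ROUNDS):
    thresholds = build_global_curriculum(client_difficulty, NUM_PHASES)
    phase = min(rnd, NUM_PHASES - 1)  # advance one phase per round
    usage = [local_update(s, thresholds[phase]) for s in client_difficulty]
    print(f"round {rnd}: phase {phase}, "
          f"mean data admitted = {np.mean(usage):.2f}")
```

Because the phase threshold is computed centrally and broadcast before local training, every client admits data at the same difficulty cutoff in the same round, which is the synchronous-pacing property the abstract contrasts with post-hoc alignment.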