Communication-Efficient Federated Learning over MIMO Multiple Access Channels
Communication efficiency is crucial for wireless federated learning systems. In this paper, we propose a communication-efficient strategy for federated learning over multiple-input multiple-output (MIMO) multiple access channels (MACs). The proposed strategy comprises two components. When sending a locally computed gradient, each device compresses its high-dimensional local gradient into multiple lower-dimensional gradient vectors using block sparsification. When receiving a superposition of the compressed local gradients via a MIMO-MAC, a parameter server (PS) performs joint MIMO detection and sparse local-gradient recovery. Inspired by the turbo decoding principle, our joint detection-and-recovery algorithm accurately reconstructs the high-dimensional local gradients by iteratively exchanging beliefs between the MIMO detection and sparse local-gradient recovery modules. We then analyze the reconstruction error of the proposed algorithm and its impact on the convergence rate of federated learning. Simulations show that our gradient compression and joint detection-and-recovery methods reduce the communication cost significantly while achieving classification accuracy identical to that of the case without any compression.
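To illustrate the block-sparsification step described above, the following sketch splits a flat local gradient into blocks and keeps only the largest-magnitude entries within each block, producing multiple lower-dimensional sparse vectors. The function name, block count, and per-block sparsity level are illustrative assumptions for this sketch, not the paper's exact scheme or notation.

```python
import numpy as np

def block_sparsify(grad, num_blocks, k):
    """Sketch of block sparsification (assumed parameters, not the paper's exact scheme).

    Splits a flat gradient into `num_blocks` blocks and keeps only the top-k
    entries (by magnitude) in each block, zeroing out the rest.
    """
    blocks = np.array_split(grad, num_blocks)
    sparse_blocks = []
    for b in blocks:
        out = np.zeros_like(b)
        keep = min(k, b.size)
        if keep > 0:
            # Indices of the `keep` largest-magnitude entries in this block.
            idx = np.argpartition(np.abs(b), -keep)[-keep:]
            out[idx] = b[idx]
        sparse_blocks.append(out)
    return sparse_blocks  # one sparse lower-dimensional vector per block

# Example: compress a 10,000-dimensional local gradient into 4 sparse blocks,
# keeping 50 coefficients per block before transmission over the MIMO-MAC.
g = np.random.randn(10_000)
compressed = block_sparsify(g, num_blocks=4, k=50)
```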