CatFedAvg: Optimising Communication-Efficiency and Classification Accuracy in Federated Learning

Abstract
Federated Learning (FL) enables distributed model training across multiple devices while preserving data privacy, but communication efficiency remains a significant bottleneck in FL deployments. This paper introduces CatFedAvg, a novel federated learning algorithm that optimises the trade-off between communication cost and model performance. By leveraging categorical aggregation strategies, our approach reduces communication overhead while maintaining competitive classification accuracy. We evaluate CatFedAvg on a range of datasets and show that it reduces the number of communication rounds while matching or exceeding the performance of baseline FL methods.
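The abstract leaves the mechanics of the categorical strategy unspecified. As a rough, non-authoritative sketch of how a category-aware variant of FedAvg could be structured, the Python below pairs standard sample-weighted parameter averaging with a hypothetical greedy client-selection step that maximises label-category coverage under a per-round communication budget. The function names, the greedy selection rule, and the budget parameter are all illustrative assumptions, not the paper's specification.

```python
import numpy as np

def fedavg_aggregate(client_params, client_sizes):
    """Standard FedAvg step: average client parameter vectors,
    weighted by each client's local dataset size."""
    coeffs = np.array(client_sizes, dtype=float)
    coeffs /= coeffs.sum()
    return coeffs @ np.stack(client_params)  # (n_clients,) @ (n_clients, n_params)

def select_by_category_coverage(client_labels, budget):
    """Hypothetical greedy selection (an assumption, not the paper's rule):
    repeatedly pick the client whose local data adds the most label
    categories not yet covered this round, stopping at the budget."""
    covered, chosen = set(), []
    remaining = {cid: set(labels) for cid, labels in client_labels.items()}
    while remaining and len(chosen) < budget:
        cid = max(remaining, key=lambda c: len(remaining[c] - covered))
        gain = remaining.pop(cid) - covered
        if not gain:
            break  # no remaining client adds coverage; stop to save bandwidth
        chosen.append(cid)
        covered |= gain
    return chosen

# Toy round: 5 clients, an 8-parameter model, skewed label distributions.
rng = np.random.default_rng(0)
params = [rng.normal(size=8) for _ in range(5)]
sizes = [120, 300, 80, 200, 150]
labels = {0: [0, 1], 1: [1, 2, 3], 2: [4], 3: [0, 4, 5], 4: [2]}

picked = select_by_category_coverage(labels, budget=3)  # [1, 3]: covers all six categories
new_global = fedavg_aggregate([params[c] for c in picked],
                              [sizes[c] for c in picked])
```

Under these assumptions, selecting clients for category coverage before averaging is one plausible route to fewer communication rounds: if every label category is represented in each round's update, the global model may need fewer rounds than uniform random client sampling would to see the full label space.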