Federated learning has emerged as a promising paradigm for training machine learning models while preserving data privacy. However, handling class imbalance in federated settings remains challenging. This work introduces Fed-Focal Loss, a novel approach that adapts focal loss to federated learning scenarios to address data imbalance across distributed clients.
Nov 1, 2020
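The abstract does not detail the paper's adaptation, but the underlying focal loss (Lin et al.) down-weights well-classified examples so training focuses on hard, often minority-class samples. A minimal sketch of the standard binary focal loss, with the usual `gamma`/`alpha` hyperparameters (the federated-specific weighting is the paper's contribution and is not reproduced here):

```python
import math

def focal_loss(p, y, gamma=2.0, alpha=0.25):
    """Standard binary focal loss for a single prediction.

    p: predicted probability of the positive class
    y: true label (0 or 1)
    gamma: focusing parameter; gamma=0 recovers weighted cross-entropy
    alpha: class-balance weight for the positive class
    """
    p_t = p if y == 1 else 1.0 - p          # probability assigned to the true class
    alpha_t = alpha if y == 1 else 1.0 - alpha
    # The (1 - p_t)^gamma factor shrinks the loss on confident predictions
    return -alpha_t * (1.0 - p_t) ** gamma * math.log(p_t)
```

With `gamma > 0`, a confidently correct prediction (e.g. `p_t = 0.9`) contributes far less loss than a marginal one, which is what lets the minority class dominate the gradient under imbalance.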
We propose CatFedAvg, a novel approach to federated learning that optimizes both communication efficiency and classification accuracy. Our method introduces categorical federated averaging, which strategically aggregates model updates while reducing communication overhead.
Nov 1, 2020
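CatFedAvg builds on federated averaging (FedAvg), in which a server aggregates client model updates weighted by local dataset size; the categorical selection strategy is the paper's contribution and is not shown here. A minimal sketch of the baseline FedAvg aggregation step, with parameters represented as flat lists of floats for simplicity:

```python
def fed_avg(client_weights, client_sizes):
    """Baseline FedAvg aggregation: size-weighted mean of client parameters.

    client_weights: list of parameter vectors (one list of floats per client)
    client_sizes: number of local training samples per client
    """
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    # Each global parameter is the average of client parameters,
    # weighted by each client's share of the total data
    return [
        sum(w[j] * n / total for w, n in zip(client_weights, client_sizes))
        for j in range(n_params)
    ]
```

Communication-efficient variants such as the one proposed here reduce how much of this update traffic each round requires, e.g. by restricting which updates are transmitted.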