User-Centric Federated Learning: Trading off Wireless Resources for Personalization

Bibliographic Details
Main Authors: Mohamad Mestoukirdi, Matteo Zecchin, David Gesbert, Qianrui Li
Format: Article
Language: English
Published: IEEE 2023-01-01
Series: IEEE Transactions on Machine Learning in Communications and Networking
Subjects:
Online Access: https://ieeexplore.ieee.org/document/10286560/
Description
Summary: Statistical heterogeneity across clients in a Federated Learning (FL) system increases the algorithm convergence time and reduces the generalization performance, resulting in a large communication overhead in return for a poor model. To tackle these problems without violating the privacy constraints that FL imposes, personalized FL methods must couple statistically similar clients without directly accessing their data, in order to guarantee a privacy-preserving transfer. In this work, we design user-centric aggregation rules at the parameter server (PS) that are based on readily available gradient information and are capable of producing personalized models for each FL client. The proposed aggregation rules are inspired by an upper bound of the weighted aggregate empirical risk minimizer. We then derive a communication-efficient variant based on user clustering, which greatly enhances its applicability to communication-constrained systems. Our algorithm outperforms popular personalized FL baselines in terms of average accuracy, worst node performance, and training communication overhead.
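The summary describes per-user aggregation weights derived from readily available gradient information. The sketch below is not the paper's actual rule, only a minimal illustration of the idea under stated assumptions: cosine similarity between client gradients is used as a proxy for statistical closeness, and a per-row softmax turns similarities into mixing weights, so each client receives its own weighted average of all clients' model parameters.

```python
import numpy as np

def user_centric_weights(gradients):
    """Collaboration weights from pairwise gradient similarity (illustrative).

    gradients: list of 1-D numpy arrays, one per client.
    Returns a row-stochastic matrix W where W[i, j] is the weight
    client i assigns to client j's contribution.
    """
    n = len(gradients)
    sim = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            gi, gj = gradients[i], gradients[j]
            # Cosine similarity as an assumed proxy for client similarity.
            sim[i, j] = gi @ gj / (np.linalg.norm(gi) * np.linalg.norm(gj) + 1e-12)
    # Softmax per row so each client's weights sum to one.
    exp = np.exp(sim - sim.max(axis=1, keepdims=True))
    return exp / exp.sum(axis=1, keepdims=True)

def personalized_aggregate(models, weights):
    """One personalized model per client: weighted average of all parameters."""
    stacked = np.stack(models)  # shape (n_clients, dim)
    return weights @ stacked    # shape (n_clients, dim)
```

A clustering-based variant, as described in the summary, would group clients with similar weight rows and broadcast one model per cluster instead of one per user, trading personalization granularity for downlink communication.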
ISSN: 2831-316X