Knowledge distillation in federated learning: a comprehensive survey

Abstract: Federated Learning (FL) has recently emerged as a promising approach for training machine learning models in a distributed manner, without requiring central data storage. However, the inherent variety and discrepancies in the data contributed by the many FL participants can be a substantial obstacle when aggregating information. To address this problem, researchers have proposed various solutions, one of which is knowledge distillation (KD), which transfers knowledge from a larger, more accurate model to a smaller model, thereby enhancing its performance. This study provides a detailed examination of the effectiveness of KD in responding to the challenges posed by FL. We comprehensively review existing research, emphasizing the benefits and limitations of KD techniques in FL and discussing the open challenges and research questions in this field.
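
For context on the technique the survey examines: in knowledge distillation, a compact student model is trained against the temperature-softened predictions of a larger teacher model, in addition to the ground-truth labels. The sketch below is a minimal, generic PyTorch illustration of that classic loss (in the style of Hinton et al., 2015); the toy models, temperature, and weighting factor are assumptions chosen for illustration, not details drawn from this paper.

    # Minimal sketch of knowledge distillation (KD): a small "student" is
    # trained to match the softened output distribution of a larger
    # "teacher". Models and hyperparameters below are illustrative only.
    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def distillation_loss(student_logits, teacher_logits, labels,
                          temperature=2.0, alpha=0.5):
        """Blend cross-entropy on hard labels with a KL term pulling the
        student toward the teacher's temperature-softened predictions."""
        hard = F.cross_entropy(student_logits, labels)
        soft = F.kl_div(
            F.log_softmax(student_logits / temperature, dim=1),
            F.softmax(teacher_logits / temperature, dim=1),
            reduction="batchmean",
        ) * (temperature ** 2)  # rescales gradients, as in Hinton et al.
        return alpha * hard + (1.0 - alpha) * soft

    # Toy usage: distill a larger teacher into a smaller student.
    teacher = nn.Sequential(nn.Linear(32, 256), nn.ReLU(), nn.Linear(256, 10))
    student = nn.Sequential(nn.Linear(32, 16), nn.ReLU(), nn.Linear(16, 10))
    x, y = torch.randn(8, 32), torch.randint(0, 10, (8,))
    with torch.no_grad():
        t_logits = teacher(x)  # teacher is frozen during distillation
    loss = distillation_loss(student(x), t_logits, y)
    loss.backward()

In FL settings, variants of this idea exchange such softened predictions (rather than raw data or full model weights) between clients and server, which is what makes KD attractive for the heterogeneity problems the abstract describes.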

Bibliographic Details
Main Authors: Hassan Salman, Chamseddine Zaki, Nour Charara, Sonia Guehis, Jean-François Pradat-Peyre, Abbass Nasser
Format: Article
Language: English
Published: Springer 2025-07-01
Series: Discover Computing
ISSN: 2948-2992
Subjects: Federated Learning; Knowledge distillation; Transfer Learning; Data Heterogeneity; Model Heterogeneity; Non-independent-identical Distribution
Online Access: https://doi.org/10.1007/s10791-025-09657-4
Author affiliations:
Hassan Salman: LIP6 UMR 7606 Sorbonne Université – CNRS
Chamseddine Zaki: College of Engineering and Technology, American University of the Middle East
Nour Charara: ICCS-Lab Computer Science Department, Faculty of Arts and Science, American University of Culture and Education
Sonia Guehis: LAMSADE, Paris-Dauphine University, PSL Research University, CNRS UMR 7243
Jean-François Pradat-Peyre: LIP6 UMR 7606 Sorbonne Université – CNRS
Abbass Nasser: Business School, Holy-Spirit University of Kaslik (USEK)