A Framework Integrating Federated Learning and Fog Computing Based on Client Sampling and Dynamic Thresholding Techniques


Bibliographic Details
Main Authors: Dang van Thang, Artem Volkov, Ammar Muthanna, Ibrahim A. Elgendy, Reem Alkanhel, Dushantha Nalin K. Jayakody, Andrey Koucheryavy
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Subjects: Federated learning; fog computing; client sampling; dynamic thresholding
Online Access: https://ieeexplore.ieee.org/document/11007538/
Collection: DOAJ
Description: The exponential growth in the number of Internet of Things (IoT) devices and the vast quantity of data they generate present a significant challenge to the efficacy of traditional centralized training models. Federated Learning (FL) is a machine learning framework that effectively addresses this issue, along with concerns about data privacy. Furthermore, fog computing (FC) is a robust distributed computing paradigm with the potential to bolster and advance FL. An integrated distributed architecture combining FL and FC can overcome the limitations of traditional centralized architectures, offering a promising solution for the future. One objective of this novel architecture is to relieve the communication links of the core network by training a model on data distributed across many clients. Various techniques and frameworks have been developed and implemented, including approaches to model compression and to data and device heterogeneity, and these have proven effective in specific contexts. In this paper, we introduce a novel gradient-driven client-sampling framework that tightly couples FL with FC. By dynamically adjusting per-round thresholds based on local gradient change rates, our method selects only the most informative clients and leverages fog nodes for partial aggregation, thereby minimizing redundant transmissions, accelerating convergence under heterogeneous data, and reducing the load on the central server. Extensive simulations on MNIST and CIFAR-10 demonstrate that our approach reduces cumulative communication by 39% and 31%, respectively, without sacrificing convergence speed or final accuracy.
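The sampling scheme summarized in the description (a per-round threshold driven by local gradient change rates, with partial aggregation at fog nodes) can be sketched in a few lines. This is an illustrative toy, not the paper's implementation: the median-based threshold rule, the client names, and the plain list-averaging are all assumptions made for the example.

```python
# Illustrative sketch of gradient-driven client sampling with a dynamic
# per-round threshold, plus partial aggregation at a fog node.
# The median-based threshold rule and all names here are assumptions.
from statistics import median

def dynamic_threshold(grad_norms, scale=1.0):
    """One plausible rule: scale the median gradient-change rate this round."""
    return scale * median(grad_norms.values())

def select_clients(grad_norms, threshold):
    """Keep only clients whose local gradient change exceeds the threshold."""
    return [cid for cid, g in grad_norms.items() if g > threshold]

def fog_partial_aggregate(updates, selected):
    """Fog node averages only the selected clients' updates (per coordinate)."""
    chosen = [updates[cid] for cid in selected]
    dim = len(chosen[0])
    return [sum(u[k] for u in chosen) / len(chosen) for k in range(dim)]

# Toy round: 4 clients reporting gradient-change rates and 2-D updates.
grad_norms = {"c1": 0.9, "c2": 0.1, "c3": 0.7, "c4": 0.2}
updates = {"c1": [1.0, 2.0], "c2": [0.0, 0.0],
           "c3": [3.0, 4.0], "c4": [0.5, 0.5]}

tau = dynamic_threshold(grad_norms)          # median of the change rates
selected = select_clients(grad_norms, tau)   # informative clients only
partial = fog_partial_aggregate(updates, selected)  # one message upstream
```

In this sketch the fog node forwards a single averaged update for the selected clients instead of every client's update, which is the mechanism behind the communication savings the description reports.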
Record ID: doaj-art-495d4fa9e1464be2aa4c60ed6b9cf8b7
Institution: OA Journals
ISSN: 2169-3536
DOI: 10.1109/ACCESS.2025.3571979 (IEEE article number 11007538)
Citation: IEEE Access, vol. 13, pp. 95019-95033, 2025
Author Affiliations:
Dang van Thang (ORCID 0009-0009-2219-3767): Department of Telecommunication Networks and Data Transmission, The Bonch-Bruevich Saint-Petersburg State University of Telecommunications, Saint Petersburg, Russia
Artem Volkov (ORCID 0009-0002-4296-1822): Department of Telecommunication Networks and Data Transmission, The Bonch-Bruevich Saint-Petersburg State University of Telecommunications, Saint Petersburg, Russia
Ammar Muthanna: Department of Telecommunication Networks and Data Transmission, The Bonch-Bruevich Saint-Petersburg State University of Telecommunications, Saint Petersburg, Russia
Ibrahim A. Elgendy (ORCID 0000-0001-7154-2307): IRC for Finance and Digital Economy, KFUPM Business School, King Fahd University of Petroleum and Minerals, Dhahran, Saudi Arabia
Reem Alkanhel (ORCID 0000-0001-6395-4723): Department of Information Technology, College of Computer and Information Sciences, Princess Nourah bint Abdulrahman University, P.O. Box 84428, Riyadh, Saudi Arabia
Dushantha Nalin K. Jayakody: COPELABS, Universidade Lusófona, Lisbon, Portugal
Andrey Koucheryavy: Department of Telecommunication Networks and Data Transmission, The Bonch-Bruevich Saint-Petersburg State University of Telecommunications, Saint Petersburg, Russia