Review of communication optimization methods in federated learning


Bibliographic Details
Main Authors: YANG Zhikai, LIU Yaping, ZHANG Shuo, SUN Zhe, YAN Dingyu
Format: Article
Language: English
Published: POSTS&TELECOM PRESS Co., LTD 2024-12-01
Series: 网络与信息安全学报 (Chinese Journal of Network and Information Security)
Subjects: federated learning; edge computing; communication optimization; model compression
Online Access:http://www.cjnis.com.cn/thesisDetails#10.11959/j.issn.2096-109x.2024077
collection DOAJ
description With the development and popularization of artificial intelligence technologies represented by deep learning, the security issues they expose have become a major challenge to cyberspace security. Traditional cloud-centric distributed machine learning, which trains or optimizes models by collecting data from participating parties, is susceptible to security and privacy attacks during data exchange, leading to consequences such as degraded overall system efficiency or leakage of private data. Federated learning, a distributed machine learning paradigm with privacy protection capabilities, exchanges model parameters through frequent communication between clients and a parameter server, training a joint model without the raw data ever leaving local devices. This greatly reduces the risk of private data leakage and ensures data security to a certain extent. However, as deep learning models grow larger and federated learning tasks more complex, communication overhead also increases, eventually becoming a barrier to the practical application of federated learning. The exploration of communication optimization methods for federated learning has therefore become a research hotspot. The technical background and workflow of federated learning were introduced, and the sources and impacts of its communication bottlenecks were analyzed. Then, based on the factors affecting communication efficiency, existing federated learning communication optimization methods were comprehensively surveyed and analyzed along optimization dimensions such as model parameter compression, model update strategies, system architecture, and communication protocols, and the development trend of this research field was presented. Finally, the problems faced by existing federated learning communication optimization methods were summarized, and future development trends and research directions were outlined.
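The abstract above describes the federated learning communication pattern (clients exchange model updates with a parameter server rather than raw data) and lists model parameter compression among the optimization objectives the review surveys. As a rough, self-contained illustration of both ideas — not any specific method from the review — here is a toy sketch of federated averaging with optional top-k sparsification of client uploads. All names (`fedavg_round`, `top_k_sparsify`) and the least-squares setup are invented for this example.

```python
import numpy as np

def top_k_sparsify(update, k):
    """Zero all but the k largest-magnitude entries of a model update
    (a common upload-compression technique in federated learning)."""
    flat = update.ravel().copy()
    keep = np.argpartition(np.abs(flat), -k)[-k:]
    sparse = np.zeros_like(flat)
    sparse[keep] = flat[keep]
    return sparse.reshape(update.shape)

def fedavg_round(global_model, client_data, lr=0.1, k=None):
    """One communication round: each client takes a local gradient step
    on its private data, optionally sparsifies its model delta before
    'uploading', and the server averages the received deltas."""
    deltas = []
    for X, y in client_data:
        # local training: one gradient step on a least-squares objective
        grad = X.T @ (X @ global_model - y) / len(y)
        delta = -lr * grad
        if k is not None:
            delta = top_k_sparsify(delta, k)  # compress before upload
        deltas.append(delta)
    # server-side aggregation: raw data never leaves the clients
    return global_model + np.mean(deltas, axis=0)

# toy setup: four clients, each holding private linear-regression data
rng = np.random.default_rng(0)
w_true = np.array([1.0, -2.0, 0.5])
clients = [(X, X @ w_true) for X in (rng.normal(size=(32, 3)) for _ in range(4))]

w_full, w_sparse = np.zeros(3), np.zeros(3)
for _ in range(100):
    w_full = fedavg_round(w_full, clients)           # uncompressed uploads
    w_sparse = fedavg_round(w_sparse, clients, k=2)  # top-2 sparsified uploads
```

Sparsifying each upload to 2 of 3 coordinates shrinks the per-round payload while still converging on this toy problem; trading a little convergence speed for much less traffic is the trade-off the compression-oriented methods surveyed in the review exploit at scale.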
institution Kabale University
issn 2096-109X
topic federated learning
edge computing
communication optimization
model compression