Large Language Model and Digital Twins Empowered Asynchronous Federated Learning for Secure Data Sharing in Intelligent Labeling
With the advancement of large language models (LLMs), the demand for data labeling services has increased dramatically. Large models depend on high-quality, domain-specific scene data at every stage, from training through application iteration to deployment. However, how to achieve intelligent...
| Main Authors: | Xuanzhu Sheng, Chao Yu, Xiaolong Cui, Yang Zhou |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | MDPI AG, 2024-11-01 |
| Series: | Mathematics |
| Subjects: | |
| Online Access: | https://www.mdpi.com/2227-7390/12/22/3550 |
Similar Items
- Asynchronous Federated Learning Through Online Linear Regressions
  by: Taiga Kashima, et al.
  Published: (2024-01-01)
- Blockchain and signcryption enabled asynchronous federated learning framework in fog computing
  by: Zhou Zhou, et al.
  Published: (2025-04-01)
- Improved Asynchronous Federated Learning for Data Injection Pollution
  by: Aiyou Li, et al.
  Published: (2025-05-01)
- Research on asynchronous robust federated learning method in vehicle computing power network
  by: YIN Hongbo, et al.
  Published: (2024-12-01)
- Asynchronous Quantum-Resistant Blockchain for Secure Intelligence Sharing
  by: Yun-Yi Fan, et al.
  Published: (2025-05-01)