Time Complexity of Training DNNs With Parallel Computing for Wireless Communications
Deep neural networks (DNNs) have been widely used for learning various wireless communication policies. While DNNs have demonstrated the ability to reduce the time complexity of inference, their training often incurs a high computational cost. Since practical wireless systems require retraining due...
| Main Authors: | Pengyu Cong, Chenyang Yang, Shengqian Han, Shuangfeng Han, Xiaoyun Wang |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | IEEE, 2025-01-01 |
| Series: | IEEE Open Journal of Vehicular Technology |
| Online Access: | https://ieeexplore.ieee.org/document/10830510/ |
Similar Items
- Learning End-to-End Hybrid Precoding for Multi-User mmWave Mobile System With GNNs
  by: Ruiming Wang, et al.
  Published: (2024-01-01)
- A Hierarchical Dispatcher for Scheduling Multiple Deep Neural Networks (DNNs) on Edge Devices
  by: Hyung Kook Jun, et al.
  Published: (2025-04-01)
- Parallel-mode EPR spectra of the hexaaqua manganese(II) Ion in tetrahedral symmetry
  by: Juel Henrichsen, Margrete, et al.
  Published: (2024-02-01)
- SpanTrain: a cross-domain distributed model training system for cloud-edge-end heterogeneous devices
  by: WANG Jinquan, et al.
  Published: (2025-05-01)
- The social conditions of possibility of "flop": production, intermediation and reception
  by: Isabelle Mayaud, et al.
  Published: (2022-06-01)