Strategy-Switch: From All-Reduce to Parameter Server for Faster Efficient Training
Deep learning plays a pivotal role in numerous big data applications by enhancing the accuracy of models. However, the abundance of available data presents a challenge when training neural networks on a single node. Consequently, various distributed training methods have emerged. Among these, two pr...
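The title contrasts the two paradigms the abstract alludes to, all-reduce and parameter-server training. The record does not say which framework the paper builds on, so the minimal sketch below assumes TensorFlow's `tf.distribute` API purely to illustrate how a training job could select one paradigm or the other; it is not the authors' Strategy-Switch implementation.

```python
import tensorflow as tf

def make_strategy(mode: str) -> tf.distribute.Strategy:
    """Return a distribution strategy for the requested training paradigm."""
    if mode == "all_reduce":
        # All-reduce: every worker keeps a full model replica and gradients
        # are averaged collectively at the end of each step.
        return tf.distribute.MultiWorkerMirroredStrategy()
    # Parameter server: dedicated servers hold the variables while workers
    # compute gradients and push updates to them asynchronously.
    resolver = tf.distribute.cluster_resolver.TFConfigClusterResolver()
    return tf.distribute.ParameterServerStrategy(resolver)

strategy = make_strategy("all_reduce")
with strategy.scope():
    model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
    model.compile(optimizer="sgd", loss="mse")
```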
Main Authors: Nikodimos Provatas, Iasonas Chalas, Ioannis Konstantinou, Nectarios Koziris
Format: Article
Language: English
Published: IEEE, 2025-01-01
Series: IEEE Access
Online Access: https://ieeexplore.ieee.org/document/10836684/
Similar Items
- Windows Server 2019 administration fundamentals : a beginner's guide to managing and administering windows server environments /
  by: Dauti, Bekim
  Published: (2019)
- Exploring scholarly perceptions of preprint servers
  by: Shir Aviv-Reuven, et al.
  Published: (2024-06-01)
- Microsoft SQL server 2008 administration for Oracle DBAs /
  by: Anderson, Mark
  Published: (2011)
- Improving reducibility of iron ore pellets by optimization of physical parameters
  by: Pal J., et al.
  Published: (2017-01-01)
- A pertinent approach: Two class of uncertain priority queuing models under the steady state condition
  by: Pamučar Dragan, et al.
  Published: (2024-01-01)