A snapshot of parallelism in distributed deep learning training
The accelerated development of artificial-intelligence applications has driven the creation of increasingly complex neural network models with enormous numbers of parameters, currently reaching into the trillions. This makes their training almost impossible without...
| Main Authors: | Hairol Romero-Sandí, Gabriel Núñez, Elvis Rojas |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Universidad Autónoma de Bucaramanga, 2024-06-01 |
| Series: | Revista Colombiana de Computación |
| Online Access: | https://revistasunabeduco.biteca.online/index.php/rcc/article/view/5054 |
Similar Items
- A study of pipeline parallelism in deep neural networks
  by: Gabriel Núñez, et al.
  Published: (2024-06-01)
- A Snapshot of Bayesianism
  by: Mark A. Gannon
  Published: (2025-04-01)
- A Case Study of Snapshot Replication and Transfer of Data in Distributed Databases
  by: Saadi Hamad Thalij, et al.
  Published: (2019-01-01)
- Snapshot of an intermediate structure
  by: Akira Shinohara
  Published: (2025-08-01)
- A family practice snapshot
  by: G.A. Ogunbanjo, et al.
  Published: (2006-10-01)