A snapshot of parallelism in distributed deep learning training
The accelerated development of artificial intelligence applications has driven the creation of increasingly complex neural network models with enormous numbers of parameters, currently reaching into the trillions. This makes their training almost impossible without...
| Main Authors: | , , |
|---|---|
| Format: | Article |
| Language: | English |
| Published: | Universidad Autónoma de Bucaramanga, 2024-06-01 |
| Series: | Revista Colombiana de Computación |
| Online Access: | https://revistasunabeduco.biteca.online/index.php/rcc/article/view/5054 |