data-parallelism topic
data-parallelism repositories
keras_multi_gpu (44 stars, 22 forks)
Multi-GPU training for Keras
veloce (18 stars, 0 forks)
WIP. Veloce is a low-code Ray-based parallelization library that makes machine learning computation novel, efficient, and heterogeneous.
pipegoose (77 stars, 17 forks)
Large-scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts *(still work in progress)*
pytorch-transformer-distributed (24 stars, 10 forks)
Distributed training (multi-node) of a Transformer model; a minimal sketch of this data-parallel pattern follows the list.
sc23-dl-tutorial (36 stars, 7 forks)
SC23 Deep Learning at Scale Tutorial Material
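
The repositories above mostly build on the same core data-parallel pattern: replicate the model on every worker, give each worker a different shard of each batch, and average gradients before the optimizer step. As a rough illustration only, here is a minimal sketch using PyTorch's DistributedDataParallel; the toy model, dataset, and hyperparameters are placeholders and are not drawn from any of the repositories listed.

```python
# Minimal data-parallel training sketch with PyTorch DistributedDataParallel.
# Each process (rank) holds a full model replica and trains on its own data
# shard; DDP averages gradients across ranks during backward().
import torch
import torch.distributed as dist
import torch.nn as nn
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, DistributedSampler, TensorDataset


def main():
    # torchrun sets RANK, LOCAL_RANK, WORLD_SIZE, MASTER_ADDR, MASTER_PORT.
    dist.init_process_group(backend="gloo")  # use "nccl" when each rank owns a GPU
    rank = dist.get_rank()

    # Toy dataset; DistributedSampler gives each rank a disjoint shard.
    data = TensorDataset(torch.randn(1024, 32), torch.randint(0, 10, (1024,)))
    sampler = DistributedSampler(data)
    loader = DataLoader(data, batch_size=64, sampler=sampler)

    # Placeholder model wrapped in DDP (model replica per process).
    model = DDP(nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 10)))
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
    loss_fn = nn.CrossEntropyLoss()

    for epoch in range(2):
        sampler.set_epoch(epoch)  # reshuffle shards each epoch
        for x, y in loader:
            optimizer.zero_grad()
            loss = loss_fn(model(x), y)
            loss.backward()  # gradients are all-reduced (averaged) across ranks here
            optimizer.step()
        if rank == 0:
            print(f"epoch {epoch} done, last loss {loss.item():.3f}")

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```

Launched with, for example, `torchrun --nproc_per_node=4 train_ddp.py`; the listed repositories layer multi-node launch scripts, real models, and (in pipegoose's case) additional parallelism dimensions on top of this basic pattern.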