
Distributed Training Error

Open CYYJL opened this issue 3 months ago • 0 comments

Hi, when I try to train with multiple GPUs, I encounter a rank timeout error. However, if I set `--nproc_per_node=1`, the code runs normally without any issue. My environment: torch 2.3.1, CUDA 11.8, Python 3.10, NCCL 2.18.3.
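For reference, a minimal debugging sketch for this kind of NCCL rank timeout (the script name `train.py` and the GPU count are placeholders, not from the original report): surface NCCL's internal logs and make hanging collectives fail with a traceback instead of timing out silently.

```shell
# Print NCCL's internal logs so the stalling rank/collective is visible
export NCCL_DEBUG=INFO
export TORCH_DISTRIBUTED_DEBUG=DETAIL

# Make blocked collectives raise an error instead of hanging until timeout
export TORCH_NCCL_BLOCKING_WAIT=1

# Placeholder launch command: adjust --nproc_per_node and script name
torchrun --nproc_per_node=2 train.py
```

The timeout itself can also be raised in code, e.g. `torch.distributed.init_process_group(backend="nccl", timeout=datetime.timedelta(minutes=30))`, which helps distinguish a genuine hang from a slow first collective.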

[Screenshot of the rank timeout error attached]

CYYJL · Sep 25 '25 12:09