Comments of Mr FARHYN
Just initialize with `torch.distributed.init_process_group("gloo")`. Go to the `generation.py` file and find the following lines:

```python
if not torch.distributed.is_initialized():
    if device == "cuda":
        torch.distributed.init_process_group("nccl")
    else:
        torch.distributed.init_process_group("gloo")
```

and change them so the `"gloo"` backend is used unconditionally:

```python
if not torch.distributed.is_initialized():
    torch.distributed.init_process_group("gloo")
```
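As a sanity check, here is a minimal, self-contained sketch of bringing up the `gloo` backend in a single process. The explicit `init_method`, `rank`, and `world_size` values are assumptions for illustration; in `generation.py` these would normally come from the launcher's environment variables rather than being hard-coded.

```python
import torch.distributed as dist

# Hypothetical single-process setup (the real script gets rank and
# world size from the torchrun/launcher environment instead).
dist.init_process_group(
    backend="gloo",
    init_method="tcp://127.0.0.1:29500",
    rank=0,
    world_size=1,
)

# Confirm the process group came up with the CPU-friendly backend.
backend = dist.get_backend()

# Clean up the process group when done.
dist.destroy_process_group()
```

Unlike `nccl`, the `gloo` backend does not require a GPU, which is why it is the right choice when running on CPU.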