gpt-neo-fine-tuning-example
DeepSpeed stuck
When replicating the code, DeepSpeed hangs after printing the following log line:
[2021-06-29 14:29:44,757] [INFO] [utils.py:13:_initialize_parameter_parallel_groups] data_parallel_size: 1, parameter_parallel_size: 1
Any ideas on how to fix this?