
Fine-Tune EleutherAI GPT-Neo And GPT-J-6B To Generate Netflix Movie Descriptions Using Hugging Face And DeepSpeed

2 issues

When replicating the code, DeepSpeed gets stuck at `[2021-06-29 14:29:44,757] [INFO] [utils.py:13:_initialize_parameter_parallel_groups] data_parallel_size: 1, parameter_parallel_size: 1`. Any ideas on how to fix this?
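For reference, below is a minimal sketch of how DeepSpeed is typically wired into a Hugging Face `Trainer` run. The config values, batch sizes, and script name are assumptions for illustration, not this repo's actual settings; a hang at group initialization sometimes stems from the launch setup, so the launch command is noted in a comment.

```python
# Illustrative sketch (not the repo's actual script): enabling DeepSpeed
# through the Hugging Face Trainer. All names and values here are assumptions.
#
# Launch with the DeepSpeed launcher rather than plain `python`, e.g.:
#   deepspeed --num_gpus=1 train.py
from transformers import TrainingArguments

# A minimal DeepSpeed config passed as a dict (TrainingArguments also
# accepts a path to a JSON file). Micro batch size must match
# per_device_train_batch_size below.
ds_config = {
    "train_micro_batch_size_per_gpu": 1,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},
}

training_args = TrainingArguments(
    output_dir="./results",
    per_device_train_batch_size=1,
    num_train_epochs=1,
    fp16=True,                # keep in sync with the fp16 section of ds_config
    deepspeed=ds_config,      # hands the config to the DeepSpeed integration
)
```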

If I want to save the model and run generation on it later, I assume I do something like this. After training: `tokenizer.save_pretrained('./results/')`. Later, for generation:

```
weights = "./results/"
tokenizer...
```
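As a hedged sketch of that approach (assuming the standard Transformers API; the path, prompt, and generation settings are illustrative, not the repo's actual values), saving the model alongside the tokenizer and reloading both for generation could look like:

```python
# Illustrative sketch: save both model weights and tokenizer after training,
# then reload them later for text generation. Paths are example values.
from transformers import AutoModelForCausalLM, AutoTokenizer

# After training (inside the training script):
#   trainer.save_model('./results/')          # writes weights + config
#   tokenizer.save_pretrained('./results/')   # writes tokenizer files

# Later generation (in a separate script):
weights = "./results/"
tokenizer = AutoTokenizer.from_pretrained(weights)
model = AutoModelForCausalLM.from_pretrained(weights)

prompt = "Netflix movie description:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_length=100, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Saving the model with `trainer.save_model` (or `model.save_pretrained`) matters here: `tokenizer.save_pretrained` alone only writes the tokenizer files, so reloading from `./results/` would otherwise fail to find the weights.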