OpenRLHF
Configuration for Llama-7b on 4 RTX 4090s
Hello, I want to run train_ppo_llama_ray.sh on 4 RTX 4090s. Should I modify actor_num_gpus_per_node/critic_num_gpus_per_node in train_ppo_llama_ray.sh? Since the default script is written for 8 GPUs, what else should I pay attention to or modify?
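
For reference, here is a minimal sketch of how the GPU-allocation flags might be scaled down from 8 to 4 GPUs, giving each of the four models (actor, critic, reference, reward) one GPU instead of two. The flag names follow the default train_ppo_llama_ray.sh; the one-GPU-per-role split and the memory-saving flags at the end are assumptions for 24 GB cards, not a tested configuration, and the entry-point module path may differ between OpenRLHF versions:

```bash
# Sketch: 4x RTX 4090 on a single node, one GPU per model role.
# Flag names come from the default train_ppo_llama_ray.sh; append the
# remaining model/data flags from that script after these lines.
ray job submit --address="http://127.0.0.1:8265" \
   -- python3 -m openrlhf.cli.train_ppo_ray \
   --ref_num_nodes 1 \
   --ref_num_gpus_per_node 1 \
   --reward_num_nodes 1 \
   --reward_num_gpus_per_node 1 \
   --critic_num_nodes 1 \
   --critic_num_gpus_per_node 1 \
   --actor_num_nodes 1 \
   --actor_num_gpus_per_node 1 \
   --zero_stage 3 \
   --adam_offload \
   --gradient_checkpointing
```

On 24 GB cards a 7B actor and critic are tight, which is why the sketch assumes ZeRO-3 with Adam offload and gradient checkpointing; lowering --micro_train_batch_size and --micro_rollout_batch_size may also be necessary to avoid OOM.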