
Configuration files and hyperparameter tuning

Open · indweller opened this issue 1 year ago · 1 comment

I see that you have used Python classes for config files. Is there any reason you chose Python classes over YAML files?

Also, given that you used Python classes, how did you perform the grid search over the parameters? I found that the nested class structure makes it messy to iterate over and access the attributes of the parameters I want to search over. If you have code for the grid search, could you please share it?
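For concreteness, here is the kind of nesting I mean, plus the sort of dotted-path setter I have been hacking together (a sketch; the class and helper names below are mine, not from the repo):

```python
# Illustrative nested config in the legged_gym style (names are examples only).
class EnvCfg:
    class rewards:
        class scales:
            tracking_lin_vel = 1.0
            torques = -0.0001

# Hypothetical helper: set one parameter given a dotted path like "rewards.scales.torques".
def set_nested_attr(cfg, path, value):
    *parents, leaf = path.split(".")
    node = cfg
    for name in parents:
        node = getattr(node, name)  # walk down the nested classes
    setattr(node, leaf, value)

set_nested_attr(EnvCfg, "rewards.scales.tracking_lin_vel", 1.5)
print(EnvCfg.rewards.scales.tracking_lin_vel)  # 1.5
```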

indweller avatar Oct 25 '23 17:10 indweller

You can do a grid search by running the training in a for loop and changing the configs for each iteration. The only caveat is that you need to run each training in a separate process to ensure proper closing/reset of the simulator. You will need something like the following:

```python
import argparse

from torch.multiprocessing import Process, set_start_method

try:
    set_start_method("spawn")
except RuntimeError as e:
    print(e)

...

def train(args: argparse.Namespace, env_cfg: BaseConfig, train_cfg: BaseConfig):
    # env_cfg, train_cfg, ppo_runner come from the usual legged_gym setup (elided)
    ...
    ppo_runner.learn()

def train_batch(args: argparse.Namespace):
    for i in range(5):  # hyperparams to run over
        seed = 23 * i + 17
        train_cfg.seed = seed
        env_cfg.seed = seed
        # launch each run in its own process
        p = Process(target=train, args=(args, env_cfg, train_cfg))
        p.start()
        p.join()
        p.kill()
        print(f">>> Run {i} done!")
```
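(For context on why the separate process is needed: the simulator cannot be reliably closed and re-created within one Python process, so each run gets a fresh process, and `spawn` is also the start method PyTorch requires when child processes use CUDA. The `p.kill()` after `p.join()` is just extra cleanup for a process that has already exited.)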

Alternatively, you can use an external sweep tool that works with YAML files and convert between YAML and the config classes using the `update_class_from_dict` and `class_to_dict` functions.
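A minimal sketch of that route, assuming PyYAML, the helpers in `legged_gym/utils/helpers.py`, and that `LeggedRobotCfg` is importable from `legged_gym.envs.base.legged_robot_config` (the YAML file names here are hypothetical):

```python
import yaml

from legged_gym.envs.base.legged_robot_config import LeggedRobotCfg
from legged_gym.utils.helpers import class_to_dict, update_class_from_dict

env_cfg = LeggedRobotCfg()

# Dump the current config tree to YAML so an external sweep tool can generate variants.
with open("base_cfg.yaml", "w") as f:
    yaml.safe_dump(class_to_dict(env_cfg), f)

# Apply one generated variant back onto the nested config classes before training.
with open("variant_0.yaml") as f:
    update_class_from_dict(env_cfg, yaml.safe_load(f))
```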

nikitardn avatar Nov 01 '23 07:11 nikitardn