Antonin RAFFIN

880 comments of Antonin RAFFIN

Here when optimizing hyperparameters: https://github.com/DLR-RM/rl-baselines3-zoo/blob/75afd65fa4a1f66814777d43bd14e4bba18d96db/utils/exp_manager.py#L632, and there otherwise: https://github.com/DLR-RM/rl-baselines3-zoo/blob/75afd65fa4a1f66814777d43bd14e4bba18d96db/utils/exp_manager.py#L423

Hello, thanks for the proposal =) > and in this case sb3 would be available only within the active env. Yes, that's what I meant by "everywhere" (more in the sense...

> So in the same directory as import_envs.py, I have created a directory src in which I have put my code and an Environment.py file. You should create a python...
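As a rough sketch of what such a package could look like (the directory name `src`, the file `Environment.py`, the class name `MyEnv` and the id `MyEnv-v0` below are only illustrative assumptions, not the original setup):

```python
# src/__init__.py -- makes `src` a proper Python package so it can be imported
# (hypothetical layout; adapt module and class names to your own project)
from gym.envs.registration import register

# Register the custom environment under an id that the zoo can look up,
# e.g. in a hyperparameter file entry named `MyEnv-v0:`.
register(
    id="MyEnv-v0",                        # assumed id, pick your own
    entry_point="src.Environment:MyEnv",  # "module.path:ClassName" inside the package
    max_episode_steps=500,                # illustrative episode limit
)
```

The package then only needs to be imported before training (for instance with the zoo's `--gym-packages` argument) so that the `register` call runs before the environment id is resolved.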

> Are they fixed hyperparameters that won't be optimized? Are you talking about hyperparameter optimization with Optuna? The hyperparameters in the file are usually the best setting found so...

I would recommend taking a look at our ICRA tutorial; we had a presentation and a notebook about hyperparameter optimization with Optuna:
- video: https://www.youtube.com/watch?v=AidFTOdGNFQ&list=PL42jkf1t1F7etDiYXWC5Q77yIuVYhXNoy&index=6
- slides: https://araffin.github.io/slides/icra22-hyperparam-opt/
- ...
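As a very rough illustration of the workflow the tutorial covers, here is a minimal Optuna objective for an SB3 agent; the environment id, the sampled ranges and the training budget are placeholder assumptions, not the tutorial's exact values:

```python
import optuna
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy


def objective(trial: optuna.Trial) -> float:
    # Sample a few hyperparameters (ranges are illustrative only)
    learning_rate = trial.suggest_float("learning_rate", 1e-5, 1e-2, log=True)
    gamma = trial.suggest_float("gamma", 0.9, 0.9999)

    model = PPO("MlpPolicy", "CartPole-v1", learning_rate=learning_rate, gamma=gamma, verbose=0)
    model.learn(total_timesteps=20_000)

    # Report the mean episodic reward as the value to maximize
    mean_reward, _ = evaluate_policy(model, model.get_env(), n_eval_episodes=10)
    return mean_reward


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=20)
print(study.best_params)
```

A real setup would typically also evaluate during training and prune unpromising trials; the sketch above leaves that out for brevity.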

> But I have a problem. As you mentioned "#Sometimes, random hyperparams can generate NaN" under the "Exercise (10 minutes): Define the objective function" section, I got NaN values. Do...
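Not the original reply, but one common way to deal with this in an Optuna objective (a sketch with placeholder names, not the notebook's exact code) is to catch the failure and return NaN, which Optuna records as a failed trial instead of aborting the whole study:

```python
import optuna
from stable_baselines3 import PPO
from stable_baselines3.common.evaluation import evaluate_policy


def objective(trial: optuna.Trial) -> float:
    # Deliberately wide range: very large learning rates can make training diverge
    learning_rate = trial.suggest_float("learning_rate", 1e-5, 1.0, log=True)
    model = PPO("MlpPolicy", "Pendulum-v1", learning_rate=learning_rate, verbose=0)

    try:
        model.learn(total_timesteps=10_000)
    except (AssertionError, ValueError) as error:
        # Sometimes, random hyperparams can generate NaN during training
        print(error)
        # A NaN return value makes Optuna mark this trial as failed
        # and move on to the next one instead of crashing the study.
        return float("nan")

    mean_reward, _ = evaluate_policy(model, model.get_env(), n_eval_episodes=5)
    return mean_reward


study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=10)
```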

Hello, please fill in the issue template completely.

> Because I use an environment with multiple agents inside, so there are multiple rewards and actions returned by the step call every time. Then you must use a `VecEnv`...
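The following is not SB3's actual `VecEnv` base class, only a sketch of the shape contract such a wrapper has to respect, where every agent is exposed as one environment of the batch; in practice you would subclass `stable_baselines3.common.vec_env.VecEnv` and implement its abstract methods, and the multi-agent simulation interface assumed below is made up for illustration:

```python
import numpy as np


class MultiAgentAsVecEnv:
    """Illustrative adapter: expose N agents living in one simulation
    as a batch of N single-agent environments (VecEnv-style API)."""

    def __init__(self, multi_agent_env, num_agents: int, obs_dim: int):
        self.env = multi_agent_env   # your multi-agent simulation (assumed interface)
        self.num_envs = num_agents   # each agent counts as one "env" in the batch
        self.obs_dim = obs_dim

    def reset(self) -> np.ndarray:
        # One observation per agent, stacked: shape (num_envs, obs_dim)
        per_agent_obs = self.env.reset()
        return np.stack(per_agent_obs)

    def step(self, actions: np.ndarray):
        # `actions` has shape (num_envs, ...): one action per agent
        per_agent_obs, per_agent_rewards, done, info = self.env.step(actions)
        obs = np.stack(per_agent_obs)                               # (num_envs, obs_dim)
        rewards = np.asarray(per_agent_rewards, dtype=np.float32)   # (num_envs,)
        dones = np.full((self.num_envs,), done, dtype=bool)         # episode ends for all agents
        infos = [dict(info) for _ in range(self.num_envs)]          # one info dict per agent
        return obs, rewards, dones, infos
```

The important point is that each call to `step` yields `num_envs = num_agents` transitions at once, which is exactly what the `VecEnv` interface (and therefore SB3's algorithms) expects.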

> I didn't know how to train a VecEnv in the zoo, and if it is possible. That's possible, but you need a fork of the zoo. We create the env here:...