rl-baselines3-zoo
[Bug]: Custom environment not found in gym registry, you maybe meant... error message
🐛 Bug
I am encountering an issue when trying to train my donkeycar simulator agent using the train.py script from rl-baselines3-zoo. While I can successfully import and call the environment using gym.make('donkey-mountain-track-v0') in an IPython session (confirming the environment's availability), attempting to use the same environment within rl-baselines3-zoo results in an error indicating that the environment is not found in the gym registry.
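For reference, the check that succeeds looks roughly like this (a sketch of what I run in IPython; the donkey simulator has to be running, since the env connects to it on creation):

```python
# Works in a plain IPython session, outside of rl-baselines3-zoo.
import gym
import gym_donkeycar  # noqa: F401  # registers the donkey envs on import

env = gym.make("donkey-mountain-track-v0")  # succeeds here
obs = env.reset()
env.close()
```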
This situation is puzzling for several reasons, given that I have taken the following steps to ensure compatibility and correct setup:
- Confirmed gym_donkeycar import: I verified that gym_donkeycar is imported correctly in import_envs.py within the rl-baselines3-zoo framework (see the sketch after this list), which should ensure that rl-baselines3-zoo recognizes the custom environment.
- Updated hyperparameters file: I updated the hyperparameters file within rl-baselines3-zoo to use the correct environment name, donkey-mountain-track-v0, so that the training script looks up the correct environment ID when initializing training.
- I also followed this tutorial step by step and reproduced everything identically: https://www.youtube.com/watch?v=ngK33h00iBE&t=2038s
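For context, the relevant part of import_envs.py is just a guarded import along these lines (paraphrased, not the exact file contents):

```python
# rl_zoo3/import_envs.py (paraphrased): optional third-party envs are imported
# so that their registration side effects run before train.py looks up the env ID.
try:
    import gym_donkeycar  # noqa: F401
except ImportError:
    gym_donkeycar = None
```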
Despite these measures, the issue persists. Further debugging showed that although train.py successfully imports gym_donkeycar, when I print the list of environments registered in the gym registry immediately before the error occurs, the donkeycar environments are not listed.
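A sketch of the kind of check I ran, extended here to also inspect the Gymnasium registry for comparison (assumes gym, gymnasium, and gym_donkeycar are all importable in the same environment):

```python
# Diagnostic sketch: compare what the two registries actually contain.
import gym
import gymnasium
import gym_donkeycar  # noqa: F401  # registers the donkey envs on import

# Legacy gym (<= 0.25) keeps specs under registry.env_specs; newer gym and
# gymnasium expose the registry as a plain dict.
gym_ids = getattr(gym.envs.registry, "env_specs", gym.envs.registry).keys()
gymnasium_ids = gymnasium.envs.registry.keys()

env_id = "donkey-mountain-track-v0"
print(f"{env_id} in gym registry:      ", env_id in gym_ids)
print(f"{env_id} in gymnasium registry:", env_id in gymnasium_ids)
```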
I am looking for insights into why the environment registration for donkey-mountain-track-v0 might not be persisting or being recognized in the context of rl-baselines3-zoo's train.py execution, despite seemingly successful import and registration steps. Your help would be invaluable.
To Reproduce
- Install gym-donkeycar and configure as per documentation.
- Install rl_zoo3, make sure the gym_donkeycar custom env is imported in import_envs.py, and update the TQC hyperparameters file to use the correct environment ID (donkey-mountain-track-v0).
- Run train.py
- Encounter the registry error.
Relevant log output / Error message
```
Traceback (most recent call last):
  File "/mnt/c/rl/donkey/rl-baselines3-zoo/train.py", line 4, in <module>
    train()
  File "/mnt/c/rl/donkey/rl-baselines3-zoo/rl_zoo3/train.py", line 176, in train
    raise ValueError(f"{env_id} not found in gym registry, you maybe meant {closest_match}?")
ValueError: donkey-mountain-track-v0 not found in gym registry, you maybe meant DemonAttack-v0?
```
System Info
- Python: 3.9.16
- Stable-Baselines3: 2.3.0
- PyTorch: 2.2.2
- GPU Enabled: False
- Numpy: 1.26.4
- Cloudpickle: 3.0.0
- Gymnasium: 0.29.1
- OpenAI Gym: 0.22.0
Checklist
- [X] I have checked that there is no similar issue in the repo
- [X] I have read the SB3 documentation
- [X] I have read the RL Zoo documentation
- [X] I have provided a minimal and working example to reproduce the bug
- [X] I've used the markdown code blocks for both code and stack traces.
gym is no longer supported since https://github.com/DLR-RM/rl-baselines3-zoo/pull/403,
so I would recommend updating the env to gymnasium and opening a pull request on the gym donkey car repo so it would benefit everyone.
Another alternative is to use an old version of everything, but that will only bring you more trouble.
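To make the first suggestion concrete, here is a minimal sketch of what Gymnasium-side registration could look like after such a port (the entry_point path and class name are illustrative, not the actual gym_donkeycar layout):

```python
# Illustrative only: registering the env against Gymnasium instead of legacy gym.
# The entry_point below is a hypothetical module path, not gym_donkeycar's real one.
import gymnasium as gym

gym.register(
    id="donkey-mountain-track-v0",
    entry_point="gym_donkeycar.envs.donkey_env:MountainTrackEnv",  # hypothetical
)

# Once registered this way, the zoo's lookup against the Gymnasium registry finds the ID:
env = gym.make("donkey-mountain-track-v0")
```

Note that a real port also means updating the env class itself to the Gymnasium API, i.e. reset() returning (obs, info) and step() returning a five-tuple with separate terminated/truncated flags.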