PPO
A PPO implementation for OpenAI Gym environments, based on Unity ML-Agents
I am trying to run a trained model from the working_model directory. I set load_model = True and model_path = ./working_model/models, but it doesn't read the working model. Am...
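Note: the snippet below is only a sketch of how a load_model / model_path pair is usually wired up in the TensorFlow 1.x checkpoint format that the ML-Agents-style PPO code uses; the variable names and the dummy graph variable are assumptions, not this repo's exact code. It illustrates the most common reason the trained model "doesn't read": tf.train.get_checkpoint_state finds no checkpoint index under model_path.

import tensorflow as tf

# Build (or rebuild) the network graph first; a dummy variable stands in
# for the real model here so the sketch runs on its own.
dummy = tf.Variable(0.0, name='dummy')

model_path = './working_model/models'  # assumed to be the directory containing the 'checkpoint' index file
saver = tf.train.Saver()

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    ckpt = tf.train.get_checkpoint_state(model_path)
    if ckpt is None or ckpt.model_checkpoint_path is None:
        # The usual failure mode: no checkpoint index exists under
        # model_path, or the path itself is wrong.
        raise FileNotFoundError('No checkpoint found under {}'.format(model_path))
    saver.restore(sess, ckpt.model_checkpoint_path)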
Traceback (most recent call last):
  File "/home/sajan/gym/gym/envs/registration.py", line 159, in spec
    return self.env_specs[id]
KeyError: 'RocketLander-v0'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File...
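Note: the KeyError above is gym reporting that 'RocketLander-v0' has never been registered in the running Python process. A minimal sketch of the usual fix is shown below; the entry_point and max_episode_steps values are placeholders for wherever the RocketLander class actually lives, not something taken from this repo.

import gym
from gym.envs.registration import register

# Register the custom environment before gym.make() is called; the
# entry_point is a placeholder and must point at the real module and
# class that implement RocketLander.
register(
    id='RocketLander-v0',
    entry_point='rocket_lander.envs:RocketLander',  # assumed location
    max_episode_steps=1000,
)

env = gym.make('RocketLander-v0')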
I am looking into your code (which is pretty clean and clear, by the way) and have a question about one line. In the file PPO/ppo/model.py, line 185...