SafetyRL_HighwayEnv
RuntimeError
Hi, I am a computer engineer. I ran your code for my term project, but I got an error. Could you help me, please?
(CPO) baykarai@baykarai:~/Documents/Emre/SafetyRL_HighwayEnv/scripts$ python3 ./experiments.py evaluate ./configs/HighwayEnv/env_test.json ./configs/HighwayEnv/agents/DQNAgent/ego_attention.json --test --recover-from ./out/HighwayEnv/DQNAgent/saved_models/latest.tar
INFO: Making new env: highway-v0
/home/baykarai/.local/lib/python3.6/site-packages/torch/cuda/__init__.py:52: UserWarning: CUDA initialization: The NVIDIA driver on your system is too old (found version 9010). Please update your GPU driver by downloading and installing a new version from the URL: http://www.nvidia.com/Download/index.aspx Alternatively, go to: https://pytorch.org to install a PyTorch version that has been compiled with your version of the CUDA driver. (Triggered internally at /pytorch/c10/cuda/CUDAFunctions.cpp:100.)
return torch._C._cuda_getDeviceCount() > 0
[WARNING] Preferred device cuda:best unavailable, switching to default cpu
INFO: Creating monitor directory out/HighwayEnv/DQNAgent/run_20210616-205350_9532
mat1 and mat2 shapes cannot be multiplied (1x5 and 7x64)
Error occurs, No graph saved
Traceback (most recent call last):
  File "./experiments.py", line 154, in <module>
    main()
  File "./experiments.py", line 45, in main
    evaluate(opts['<environment>'], opts['<agent>'], opts)
  File "./experiments.py", line 78, in evaluate
    display_rewards=not options['--no-display'])
  File "/home/baykarai/Documents/Emre/SafetyRL_HighwayEnv/rl_agents/trainer/evaluation.py", line 79, in __init__
    self.agent.set_writer(self.writer)
  File "/home/baykarai/Documents/Emre/SafetyRL_HighwayEnv/rl_agents/agents/deep_q_network/pytorch.py", line 103, in set_writer
    self.writer.add_graph(self.value_net, input_to_model=(model_input,)),
  File "/home/baykarai/miniconda3/envs/CPO/lib/python3.6/site-packages/tensorboardX/writer.py", line 901, in add_graph
    self._get_file_writer().add_graph(graph(model, input_to_model, verbose))
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/utils/tensorboard/_pytorch_graph.py", line 292, in graph
    raise e
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/utils/tensorboard/_pytorch_graph.py", line 286, in graph
    trace = torch.jit.trace(model, args)
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/jit/_trace.py", line 742, in trace
    _module_class,
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/jit/_trace.py", line 940, in trace_module
    _force_outplace,
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 725, in _call_impl
    result = self._slow_forward(*input, **kwargs)
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 709, in _slow_forward
    result = self.forward(*input, **kwargs)
  File "/home/baykarai/Documents/Emre/SafetyRL_HighwayEnv/rl_agents/agents/common/models.py", line 291, in forward
    ego_embedded_att, _ = self.forward_attention(x)
  File "/home/baykarai/Documents/Emre/SafetyRL_HighwayEnv/rl_agents/agents/common/models.py", line 304, in forward_attention
    ego, others = self.ego_embedding(ego), self.others_embedding(others)
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 725, in _call_impl
    result = self._slow_forward(*input, **kwargs)
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 709, in _slow_forward
    result = self.forward(*input, **kwargs)
  File "/home/baykarai/Documents/Emre/SafetyRL_HighwayEnv/rl_agents/agents/common/models.py", line 73, in forward
    x = self.activation(layer(x))
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 725, in _call_impl
    result = self._slow_forward(*input, **kwargs)
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 709, in _slow_forward
    result = self.forward(*input, **kwargs)
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/nn/modules/linear.py", line 93, in forward
    return F.linear(input, self.weight, self.bias)
  File "/home/baykarai/.local/lib/python3.6/site-packages/torch/nn/functional.py", line 1692, in linear
    output = input.matmul(weight.t())
RuntimeError: mat1 and mat2 shapes cannot be multiplied (1x5 and 7x64)
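This kind of mismatch usually means the environment's observation provides fewer features per vehicle than the model's first linear layer expects: here the observation row has 5 features while the embedding layer was built for 7 (both numbers are taken from the error message, not from the repo's config files). As a minimal sketch of the rule being violated:

```python
# Sketch of the matmul shape rule behind the error. An (m x k) @ (k2 x n)
# product is only defined when the inner dimensions agree, i.e. k == k2.
def can_matmul(a_shape, b_shape):
    """Return True if a matrix with shape a_shape can be multiplied by b_shape."""
    return a_shape[1] == b_shape[0]

obs_shape = (1, 5)        # one observation row with 5 features (from the traceback)
weight_t_shape = (7, 64)  # transposed weight of a Linear layer with 7 inputs, 64 outputs
print(can_matmul(obs_shape, weight_t_shape))  # False: 5 != 7, so matmul raises
```

Assuming the project follows standard highway-env conventions, a likely fix is to make the observation's per-vehicle feature list in `env_test.json` match the 7 inputs the saved ego_attention model was trained with (for a Kinematics-style observation, that is the length of its feature list); the exact config keys depend on files not shown here, so treat this as a starting point for debugging rather than a confirmed fix.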
I met the same error.