carla-driving-rl-agent
How much GPU memory does the neural network need?
Hello Luca,
I am trying to run the program on my own computer, which has a GTX 2060 Super GPU. However, it seems my 8 GB of memory is not enough, because I always get an OOM error when allocating a tensor with shape [64, 464, 3, 12]. How much GPU memory is needed?
Best regards Xibo
Hi, I don't know exactly how much GPU memory is needed but, unfortunately, it's a lot.
What you can try is to edit main.py. In particular, you could set timesteps=256 and/or batch_size=32 in the learning.stage_sX(...) function calls. You can also reduce the input size (here), or even the number of parameters of the agent's neural net: see here and here too.
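As a back-of-the-envelope check of why halving batch_size helps, here is the footprint of the tensor reported in the OOM message above (assuming float32, i.e. 4 bytes per element):

```python
import math

def tensor_bytes(shape, bytes_per_element=4):
    """Memory footprint of a dense tensor, assuming float32 elements."""
    return math.prod(shape) * bytes_per_element

# The shape reported in the OOM error, with batch_size=64 vs batch_size=32.
full = tensor_bytes([64, 464, 3, 12])
half = tensor_bytes([32, 464, 3, 12])
print(f"batch 64: {full / 2**20:.1f} MiB, batch 32: {half / 2**20:.1f} MiB")
```

This single tensor is small, but essentially all activation memory scales linearly with the batch dimension, so batch_size=32 roughly halves peak usage during training.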
Hope it helps
Thank you so much, I will try that. With nvidia-smi I have seen that CARLA uses 4 GB, so the network must need more than 4 GB.
@xiboli hi
My CARLA can use 4 GB. I tried
os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID"
os.environ["CUDA_VISIBLE_DEVICES"] = "0"
but the network is not using the GPU. Please help me.
@nguyenvantruong1905, check the following:
- Do you set the environment variables before loading TensorFlow, i.e. before import tensorflow as tf?
- Do you have a CUDA-capable GPU? If so, have you correctly installed CUDA, cuDNN, etc.? Check the console for error messages.
- Does the issue happen only with CARLA, or also with TensorFlow alone? (To check the latter, run a script with only TensorFlow and make sure a tensor gets allocated on the GPU.)
- Have you tried forcing GPU execution? You can do that by wrapping the model code in a with tf.device('gpu'): scope.
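The first and third checks can be combined in one small diagnostic script. This is just a sketch using standard TensorFlow 2.x APIs; the key point is that the environment variables are set before the import:

```python
import os

# The CUDA environment variables must be set BEFORE TensorFlow is
# imported: TF initialises the CUDA runtime on first import, so
# changing them afterwards has no effect.
os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID"
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

try:
    import tensorflow as tf

    # List the GPUs TensorFlow can actually see.
    gpus = tf.config.list_physical_devices("GPU")
    print("Visible GPUs:", gpus)

    if gpus:
        # Force a small allocation on the GPU to confirm it works.
        with tf.device("/GPU:0"):
            x = tf.random.normal([1024, 1024])
        print("Allocated on:", x.device)
except ImportError:
    print("TensorFlow is not installed in this environment.")
```

If "Visible GPUs" prints an empty list, the problem is in the CUDA/cuDNN installation (check the console for missing-library warnings during the import), not in CARLA or in this repo's code.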