
How much GPU memory does the neural network need?

Open xiboli opened this issue 3 years ago • 4 comments

Hello Luca,

I'm trying to run the program on my own computer, which has an RTX 2060 Super GPU. However, its 8 GB of memory seems not to be enough, because I always get an OOM error when allocating a tensor with shape [64, 464, 3, 12]. How much GPU memory is needed?

Best regards Xibo

xiboli avatar Feb 13 '22 23:02 xiboli

Hi, I don't know exactly how much GPU memory is needed but, unfortunately, it's a lot.

What you can try is editing main.py. In particular, you could set timesteps=256 and/or batch_size=32 in the learning.stage_sX(...) function calls. You can also reduce the input size, here, or even the number of parameters of the agent's neural network: see here and here too.
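To illustrate the idea (the actual signatures of the stage functions in main.py may differ; the stand-in below is hypothetical), lowering timesteps and batch_size shrinks both the rollout buffer and the per-update activation memory:

```python
# Hypothetical stand-in for learning.stage_sX(...) in main.py,
# only to show which arguments control the memory footprint.
def stage_s1(timesteps=512, batch_size=64, **kwargs):
    """Collects `timesteps` steps per rollout and trains on
    mini-batches of `batch_size` samples."""
    return dict(timesteps=timesteps, batch_size=batch_size, **kwargs)

# Halving both values roughly halves the memory used per update:
config = stage_s1(timesteps=256, batch_size=32)
print(config)  # {'timesteps': 256, 'batch_size': 32}
```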

Hope it helps

Luca96 avatar Feb 14 '22 09:02 Luca96

Thank you so much. I will try that. With nvidia-smi I saw that CARLA uses 4 GB, so the network should need more than 4 GB.

xiboli avatar Feb 15 '22 10:02 xiboli

@xiboli hi

My CARLA can use 4 GB. I tried

os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID"
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

but the network is not using the GPU. Please help me.

nguyenvantruong1905 avatar Jun 09 '23 03:06 nguyenvantruong1905

@nguyenvantruong1905, check the following:

  • Did you set the environment variables before loading TensorFlow, i.e. before import tensorflow as tf?
  • Do you have a CUDA-capable GPU? If so, have you correctly installed CUDA, cuDNN, etc.? Check the console for error messages.
  • Does the issue happen only with CARLA, or also with TensorFlow alone? (To check the latter, run a script that uses only TensorFlow and make sure a tensor is allocated on the GPU.)
  • Have you tried forcing GPU execution? You can do that by wrapping the model-related code in a with tf.device('gpu'): scope.
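A minimal diagnostic sketch combining these checks (it assumes a working TensorFlow 2.x install; note that with eager mode's default soft device placement, tf.device('/GPU:0') falls back to the CPU instead of raising, so you must inspect the tensor's device):

```python
import os

# These must be set BEFORE TensorFlow is imported, or they are ignored.
os.environ["CUDA_DEVICE_ORDER"] = "PCI_BUS_ID"
os.environ["CUDA_VISIBLE_DEVICES"] = "0"

import tensorflow as tf

# 1) Is any GPU visible to TensorFlow at all?
gpus = tf.config.list_physical_devices('GPU')
print('Visible GPUs:', gpus)

# 2) Place a small op on the GPU. With soft device placement (the
#    default) this silently falls back to the CPU, so check y.device
#    to see where it actually ran.
with tf.device('/GPU:0'):
    x = tf.random.normal((64, 64))
    y = tf.matmul(x, x)

print('Op ran on:', y.device)
```

If the list of visible GPUs is empty, the problem is the CUDA/cuDNN setup (or driver), not CARLA or this repository's code.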

Luca96 avatar Jun 10 '23 07:06 Luca96