dynibar
Question: how much GPU memory is needed to run the code?
Thanks for your great work! I am unable to run the code on two Tesla V100 GPUs with 32 GB of memory each. May I ask how much GPU memory is required?
```
RuntimeError: CUDA out of memory. Tried to allocate 672.00 MiB (GPU 1; 31.75 GiB total capacity; 27.65 GiB already allocated; 90.00 MiB free; 30.08 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
```
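In case it is useful, here is a minimal sketch of the workaround the error message itself suggests, i.e. setting `max_split_size_mb` through `PYTORCH_CUDA_ALLOC_CONF`. The value 128 is only an illustrative choice (not something from the dynibar docs), and this only helps with fragmentation, not with a model that genuinely exceeds 32 GB:

```python
import os

# Must be set before the first CUDA allocation, ideally before importing torch.
# 128 MiB is an example split size, not a recommended value from the authors.
os.environ.setdefault("PYTORCH_CUDA_ALLOC_CONF", "max_split_size_mb:128")

import torch

# Optional: inspect the allocator state to see how close usage is to the 32 GB limit.
if torch.cuda.is_available():
    print(torch.cuda.memory_summary(device=0, abbreviated=True))
```

The same thing can be done from the shell when launching the training script, e.g. `PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128 python train.py ...` (script name here is just a placeholder).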