
PyTorch still raises "out of memory" even though PYTORCH_CUDA_ALLOC_CONF = "max_split_size_mb:128"

uranity opened this issue 2 years ago · 1 comment

Am I reading correctly that my NVIDIA free memory is 451 MiB? If so, why does PyTorch still raise an "out of memory" exception?

torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 1.39 GiB (GPU 0; 6.00 GiB total capacity; 4.04 GiB already allocated; 478.00 MiB free; 4.15 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF

(venv) PS C:\projects\imageai\venv\stable-diffusion> nvidia-smi
Mon Dec 19 13:05:20 2022

+-----------------------------------------------------------------------------+
| NVIDIA-SMI 527.41       Driver Version: 527.41       CUDA Version: 12.0     |
|-------------------------------+----------------------+----------------------+
| GPU  Name            TCC/WDDM | Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  NVIDIA GeForce ... WDDM  | 00000000:01:00.0  On |                  N/A |
| N/A   53C    P8     8W /  N/A |    451MiB /  6144MiB |      6%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+
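For reference, here is a minimal sketch of how the allocator setting is typically applied. The environment variable must be set before CUDA is initialized, so safest is before importing torch; the guard around the import is just so the snippet runs anywhere (the specific values are the ones from this issue):

```python
import os

# PYTORCH_CUDA_ALLOC_CONF is read when the CUDA caching allocator starts,
# so set it before the first CUDA call (safest: before importing torch).
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

try:
    import torch
    if torch.cuda.is_available():
        # Free/total device memory in bytes, as the driver reports it.
        free, total = torch.cuda.mem_get_info()
        print(f"free: {free / 2**20:.0f} MiB / total: {total / 2**20:.0f} MiB")
except ImportError:
    # torch not installed in this environment; the env var is still set.
    pass

print(os.environ["PYTORCH_CUDA_ALLOC_CONF"])
```

Note that max_split_size_mb only limits how the caching allocator splits blocks, to reduce fragmentation; it cannot create memory the GPU does not have.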

uranity · Dec 19 '22 12:12

I don't know where you see 451 MiB; the traceback says 478 MiB free. The 451 MiB in your nvidia-smi output is memory in use (451MiB / 6144MiB), not free memory, and it was measured at a different moment than the failed allocation. Either way, you had less than 500 MiB free and you were trying to allocate 1.39 GiB on the device. The GPU doesn't have enough free memory to handle it.
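A quick sanity check on the figures from the traceback (pure arithmetic, no GPU needed) shows why no allocator setting can help here:

```python
# Figures copied from the traceback above.
total_gib = 6.00        # GPU 0 total capacity
allocated_gib = 4.04    # already allocated by live tensors
reserved_gib = 4.15     # reserved by PyTorch's caching allocator
free_mib = 478.00       # free at the moment of the failed allocation
request_gib = 1.39      # size of the failed allocation

request_mib = request_gib * 1024
print(f"requested {request_mib:.0f} MiB vs {free_mib:.0f} MiB free")
print(request_mib <= free_mib)  # False: the request is ~3x the free memory

# reserved (4.15 GiB) is close to allocated (4.04 GiB), so fragmentation is
# not the main problem here -- max_split_size_mb has little to recover.
print(f"reserved - allocated = {reserved_gib - allocated_gib:.2f} GiB")
```

Since reserved and allocated are nearly equal, the usual fixes are reducing the working set (smaller batch size, half precision, `torch.cuda.empty_cache()` between runs) rather than tuning the allocator.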

bryant0918 · Jun 05 '23 22:06