CUDA out of memory
Hey,
I'm having trouble training my model. I always get the same error saying that CUDA has used all of my GPU's memory. I'm trying to use this program for a project with very little programming knowledge, so I can tell how basic my question is going to be, but: my GPU has 4 GB of dedicated memory (I don't even know if that's close to what's required here) and 7.7 GB of shared memory. When the dedicated memory is full, why can't RVC use this shared memory to keep running?
Thank you in advance :)
Same issue:
torch.cuda.OutOfMemoryError: CUDA out of memory. Tried to allocate 20.00 MiB (GPU 0; 4.00 GiB total capacity; 3.44 GiB already allocated; 0 bytes free; 3.48 GiB reserved in total by PyTorch) If reserved memory is >> allocated memory try setting max_split_size_mb to avoid fragmentation. See documentation for Memory Management and PYTORCH_CUDA_ALLOC_CONF
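The error message itself suggests setting max_split_size_mb. A minimal sketch of how that is usually done, assuming the value 128 as an illustrative choice (it is not an RVC setting, just the PyTorch allocator option named in the error):

```python
import os

# Hypothetical example: the allocator reads PYTORCH_CUDA_ALLOC_CONF at the
# first CUDA allocation, so set it before any CUDA work happens, e.g. before
# launching training (shell: export PYTORCH_CUDA_ALLOC_CONF=max_split_size_mb:128)
# or at the very top of the training script:
os.environ["PYTORCH_CUDA_ALLOC_CONF"] = "max_split_size_mb:128"

import torch  # imported after the variable is set, before any tensors go to the GPU
# ... build the model and start training as usual ...
```

Note that this only helps when the memory is fragmented (reserved >> allocated); it cannot create memory you don't have.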
If your GPU memory is 4–6 GB, you should decrease the batch size. If your GPU memory is less than 4 GB, you may not be able to use the GPU to train the model at all. As for shared memory: that is just system RAM, and PyTorch's CUDA allocator only allocates from the card's dedicated VRAM, so it raises an out-of-memory error instead of spilling over into shared memory.
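A minimal sketch of that rule of thumb, not RVC's own code; the batch-size numbers (4 and 8) are purely illustrative assumptions:

```python
import torch

batch_size = 8      # hypothetical default; in RVC you set this in the training tab
device = "cpu"      # fall back to CPU if there is no usable GPU

if torch.cuda.is_available():
    # Dedicated VRAM of GPU 0 in GB (shared memory is not counted here)
    total_gb = torch.cuda.get_device_properties(0).total_memory / 1024**3
    if total_gb < 4:
        device = "cpu"                  # too little VRAM to train on the GPU
    elif total_gb <= 6:
        device, batch_size = "cuda", 4  # 4-6 GB: train on GPU with a smaller batch
    else:
        device = "cuda"
    print(f"{total_gb:.1f} GB dedicated VRAM -> device={device}, batch_size={batch_size}")
```

On a 4 GB card, lowering the batch size in the training settings is the first thing to try before anything else.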