SimSwapHD

[Beginner] CUDA out of memory

realisticdreamer114514 opened this issue Dec 30 '21

I'm a beginner who's dissatisfied with the official SimSwap_512 beta model, and I'm trying to finetune the official "people" model for better results at --image_size 512 --display_winsize 512. When I start finetuning I get this:

RuntimeError: CUDA out of memory. Tried to allocate 64.00 MiB (GPU 0; 4.00 GiB total capacity; 2.28 GiB already allocated; 54.45 MiB free; 2.42 GiB reserved in total by PyTorch)

This happens even at --image_size 224 --display_winsize 224, and I've found it's a relatively common problem. I'm not sure if it's a hardware limitation (GTX 1650 with Max-Q Design) or some underlying issue with the code or my configuration, since no other graphical processes were running when I ran the command.
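In case it helps with diagnosis, here is a minimal sketch (assuming a standard PyTorch + CUDA install; device index 0 is an assumption) that reports what PyTorch actually sees on the card:

```python
# Minimal sketch: report what PyTorch actually sees on the GPU.
# Assumes a standard PyTorch + CUDA install; device index 0 is an assumption.
import torch

if torch.cuda.is_available():
    props = torch.cuda.get_device_properties(0)
    gib = 1024 ** 3
    print(f"{props.name}: {props.total_memory / gib:.2f} GiB total")
    print(f"allocated: {torch.cuda.memory_allocated(0) / gib:.2f} GiB")
    print(f"reserved:  {torch.cuda.memory_reserved(0) / gib:.2f} GiB")
else:
    print("No CUDA device visible to PyTorch")
```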

Since this is my first major experience with training models and machine learning in general I'll be asking some basic questions as they come up, and I hope anyone following this project will be patient with me. Thanks!

realisticdreamer114514, Dec 30 '21

Add the --batchSize option to your command. It happens because your GTX 1650 doesn't have enough memory to push a batch that large through the network. The default value of batchSize is 8; try a lower value, for example: CUDA_VISIBLE_DEVICES=0 python train.py --batchSize 4 --name CelebA_512_finetune --which_epoch latest --dataroot ./dataset/CelebA --image_size 512 --display_winsize 512 --continue_train
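For reference, a rough sketch of that idea, probing which batch size fits on a given card before launching a long run (the tiny convolutional model and the 512x512 input below are placeholders, not SimSwap's actual network, so a real run will need more memory per sample):

```python
# Rough sketch: probe which batch size fits on the card before a long run.
# The tiny model and 512x512 input are placeholders, not SimSwap's network.
import torch
import torch.nn as nn

model = nn.Sequential(
    nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 3, 3, padding=1),
).cuda()

for batch in (8, 4, 2, 1):
    try:
        x = torch.randn(batch, 3, 512, 512, device="cuda")
        model(x).mean().backward()   # forward + backward, like a train step
        print(f"--batchSize {batch} fits")
        break
    except RuntimeError as e:
        if "out of memory" not in str(e):
            raise
        torch.cuda.empty_cache()     # release the failed allocation
        print(f"--batchSize {batch} is too large")
    finally:
        model.zero_grad(set_to_none=True)
```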

netrunner-exe, Dec 30 '21

I don't have an NVIDIA card, so I'm trying this on Colab and hitting the same error when finetuning, though it would be tedious work anyway given Colab's time limits. I wonder if it's possible for someone to just train on the dataset once and release the resulting files. Would there be any copyright issues?

Lyntai, Dec 31 '21

I kept reducing the batch size and the same error keeps happening, just with different numbers for the memory allocated and free. Is it something to do with my configuration, or is my GPU simply underpowered?

realisticdreamer114514, Jan 14 '22
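If even --batchSize 1 runs out of memory, one generic option is mixed precision, which roughly halves activation memory; lowering --image_size is the other big lever, since activation memory grows with the square of the resolution. Below is a minimal sketch of the standard PyTorch AMP pattern; the one-layer model is a stand-in for the real network, and SimSwap's train.py may need code changes to adopt it:

```python
# Minimal sketch of PyTorch automatic mixed precision (AMP).
# The one-layer model is a stand-in for the real network; SimSwap's
# train.py may need code changes to adopt this pattern.
import torch
import torch.nn as nn

model = nn.Conv2d(3, 3, 3, padding=1).cuda()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler()

x = torch.randn(1, 3, 512, 512, device="cuda")   # one 512x512 image
optimizer.zero_grad(set_to_none=True)
with torch.cuda.amp.autocast():                  # fp16 forward pass
    loss = model(x).abs().mean()
scaler.scale(loss).backward()                    # scale to avoid fp16 underflow
scaler.step(optimizer)
scaler.update()
```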