cog-consistent-character
12 GB of VRAM, not enough?
I am trying to run the model in Docker (Docker Desktop on Windows via WSL2) on an RTX 4070 with 12 GB of VRAM, but I always hit the error "torch.cuda.OutOfMemoryError: Allocation on device". The predictions report "succeeded", yet no output files are produced.
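One thing worth checking before concluding it is a hard VRAM floor: PyTorch's caching allocator can raise `torch.cuda.OutOfMemoryError` from fragmentation even when total VRAM is close to sufficient. A hedged sketch of passing the allocator setting into the container (the image name is a placeholder, not the actual cog image; this is a possible mitigation, not a confirmed fix for this model):

```shell
# Sketch only: <your-image> is a placeholder for the actual cog image name.
# expandable_segments:True asks PyTorch's allocator to grow segments instead
# of pre-carving fixed blocks, which can reduce fragmentation-related OOMs.
docker run --rm --gpus all \
  -e PYTORCH_CUDA_ALLOC_CONF=expandable_segments:True \
  <your-image>
```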
I am guessing that the minimum for this model is 16 GB of VRAM?