
Colab can't use GPU

Open · zxc1844 opened this issue 3 years ago · 1 comment

Describe the bug: sd_models was changed. On Colab we can't use the GPU unless we remove the map_location='cpu' parameter from the torch.load call that loads the big model; otherwise the RAM fills up.
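A minimal sketch of the behavior being described (not the actual sd_models code; the dummy checkpoint is created here just so the snippet runs):

```python
import tempfile

import torch

# Stand-in for a real multi-GB checkpoint, created only for illustration.
ckpt = tempfile.NamedTemporaryFile(suffix=".ckpt", delete=False).name
torch.save({"weight": torch.zeros(4)}, ckpt)

# map_location="cpu" pulls every checkpoint tensor into CPU RAM; with a
# multi-GB model this is what fills Colab's ~12 GB of RAM.
state = torch.load(ckpt, map_location="cpu")
print(state["weight"].device)  # prints: cpu

# Loading straight onto the GPU avoids the CPU-RAM spike; guarded so the
# sketch still runs on a machine without a GPU.
if torch.cuda.is_available():
    state = torch.load(ckpt, map_location="cuda")
```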

zxc1844 avatar Oct 15 '22 02:10 zxc1844

I may have misunderstood the problem, but try setting the --lowram command-line option. #2407

AtaraxiaSjel avatar Oct 15 '22 22:10 AtaraxiaSjel

I'm experiencing the same issue. If you load a large model on Colab, the RAM usage will continue to increase until it reaches 12GB, at which point Colab will terminate any running code.

dotRelith avatar Jan 04 '23 07:01 dotRelith

> I'm experiencing the same issue. If you load a large model on Colab, the RAM usage will continue to increase until it reaches 12GB, at which point Colab will terminate any running code.

Try setting the --lowram command-line option.

Ljzd-PRO avatar Jan 04 '23 11:01 Ljzd-PRO

> I'm experiencing the same issue. If you load a large model on Colab, the RAM usage will continue to increase until it reaches 12GB, at which point Colab will terminate any running code.
>
> Try setting the --lowram command-line option.

I did that already; forgot to mention it, sorry.

dotRelith avatar Jan 04 '23 12:01 dotRelith

I got it to load by adding these lines in sd_models: [image]

dotRelith avatar Jan 04 '23 14:01 dotRelith

If needed, launch with --lowram; it loads the full model using more VRAM but less CPU RAM.
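A hedged sketch of the pattern that flag implies (not the actual webui implementation; `load_checkpoint` and its `lowram` parameter are illustrative names): pick the torch.load map_location based on the option, trading CPU RAM for VRAM.

```python
import tempfile

import torch


def load_checkpoint(path, lowram=False):
    """Load a checkpoint, mapping tensors to GPU when lowram is set.

    With lowram=True the full checkpoint never has to fit in CPU RAM,
    at the cost of needing enough VRAM to hold it instead.
    """
    device = "cuda" if lowram and torch.cuda.is_available() else "cpu"
    return torch.load(path, map_location=device)


# Demo with a tiny dummy checkpoint so the sketch is runnable.
path = tempfile.NamedTemporaryFile(suffix=".ckpt", delete=False).name
torch.save({"w": torch.ones(2)}, path)
state = load_checkpoint(path, lowram=True)
```

Without a GPU available the sketch falls back to CPU loading, which mirrors why --lowram only helps on machines where the GPU actually has spare VRAM.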

ClashSAN avatar May 02 '23 17:05 ClashSAN