stable-diffusion-webui
Loading large models will consume too much RAM, causing low RAM devices to crash
Card 2060 doesn't work after update (f894dd5), with `--precision full --no-half --medvram`
RTX 2060 with 6GB VRAM here; try: `--medvram --opt-split-attention --precision autocast`
Note: I'm not sure whether OP means system RAM or VRAM (in my case I have 48GB of RAM).
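For anyone landing here: these flags normally go into `COMMANDLINE_ARGS` in `webui-user.sh` (or `webui-user.bat` on Windows) rather than being typed each launch. A sketch, assuming the default repo layout:

```shell
# webui-user.sh (Linux/macOS); use webui-user.bat with "set COMMANDLINE_ARGS=..." on Windows
# --medvram              : moves model parts between RAM and VRAM to cut peak VRAM use
# --opt-split-attention  : slices the attention computation to lower VRAM during sampling
# --precision autocast   : lets PyTorch pick fp16/fp32 per op instead of forcing full fp32
export COMMANDLINE_ARGS="--medvram --opt-split-attention --precision autocast"
./webui.sh
```

`--precision full --no-half` forces everything to fp32, which roughly doubles memory use compared to half precision; that is usually why 6GB cards fail with those flags.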
> Card 2060 doesn't work after update (f894dd5), with
> `--precision full --no-half --medvram`

> RTX 2060 with 6GB VRAM here; try: `--medvram --opt-split-attention --precision autocast`
> Note: I'm not sure whether OP means system RAM or VRAM (in my case I have 48GB of RAM).
With `--medvram --opt-split-attention --precision autocast` I get incorrect results.
@LIGHT-Mus try installing xformers
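For reference, the webui can install and enable xformers itself via a launch flag (assuming a supported NVIDIA GPU and CUDA setup); a minimal sketch:

```shell
# --xformers enables xformers' memory-efficient attention; on first launch
# the webui tries to install a matching wheel automatically.
# It replaces --opt-split-attention and usually uses less VRAM and runs faster.
export COMMANDLINE_ARGS="--xformers --medvram"
./webui.sh
```

Note that xformers kernels are non-deterministic on some cards, so identical seeds may produce slightly different images.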