StableCascade
v100 cannot calculate bf16
Hello, how do I modify the configuration if I want to run it on a 32 GB V100?
Do you mean the error at run.py line 39? I checked how automatic1111 structures its devices.py and tried:
`device = "cuda"` and `dtype = torch.float16` . . . then wrapped the call in `with torch.cuda.amp.autocast(dtype=torch.float16):`
I also tried float32, but the results are bad either way.
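For context, the reason bf16 fails here is that bfloat16 needs Ampere (sm_80) or newer, while the V100 is sm_70, so it has to fall back to fp16. A minimal sketch of picking the autocast dtype from the compute capability (the helper name and logic are my own, not from the repo):

```python
def pick_amp_dtype(compute_capability):
    """Choose an autocast dtype name for a given (major, minor)
    CUDA compute capability. bf16 requires Ampere (sm_80) or newer;
    the V100 is sm_70, so it falls back to fp16."""
    major, _minor = compute_capability
    return "bfloat16" if major >= 8 else "float16"

print(pick_amp_dtype((7, 0)))  # V100 -> float16
print(pick_amp_dtype((8, 0)))  # A100 -> bfloat16

# With PyTorch available, you would use it roughly like this (untested sketch):
#   dtype = getattr(torch, pick_amp_dtype(torch.cuda.get_device_capability()))
#   with torch.cuda.amp.autocast(dtype=dtype):
#       ...  # run the model
```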
We added fully working fp16 support: https://github.com/Stability-AI/StableCascade/issues/125