StableCascade
Quality: small vs. large models
Has anyone already compared the quality of the small models vs the big models? I'm quite interested in the difference. If other people would also like to have more info on this, I can test it and report back here.
I can't even get the small models to load currently. I'm using the image-to-image notebook and have never made it past the "Load Extras & Models" cell. I finally got it to download the big models, which the script will not run without; otherwise I constantly get this error:

--> 178 for param_name, param in load_or_fail(self.config.generator_checkpoint_path).items():
    179     set_module_tensor_to_device(generator, param_name, "cpu", value=param)
    180     generator = generator.to(dtype).to(self.device)

AttributeError: 'NoneType' object has no attribute 'items'

That happens if I don't use the big-big option in model_download.sh, and when I do use that option I go straight to OOM on a T4 Colab.
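A quick way to confirm what the traceback suggests (a sketch, not code from the repo; it assumes pyyaml is installed and that the config path below matches your checkout): load the inference config and check whether generator_checkpoint_path is set and points at a file on disk.

```python
# Sketch: verify that the config the notebook loads has a usable
# generator_checkpoint_path before the "Load Extras & Models" cell runs.
# The config path below is an assumption -- adjust it to your checkout.
import os
import yaml

config_path = "configs/inference/stage_c_3b.yaml"
with open(config_path) as f:
    cfg = yaml.safe_load(f)

ckpt = cfg.get("generator_checkpoint_path")
print("generator_checkpoint_path:", ckpt)
print("file exists:", bool(ckpt) and os.path.exists(ckpt))
# If this prints None / False, load_or_fail() has nothing to load and the
# .items() call in the traceback raises the AttributeError shown above.
```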
RTX 3080 (10 GB VRAM). Caption: "A green parrot with a knife in its paw threatens a fox"
Big-big models:
47 minutes to create
Full memory usage, 100% GPU load
Output:
Small-small models:
~40 seconds to create
Full memory usage, 100% GPU load
Output:
It looks a little silly compared to the larger models, but it saves a lot of time.
I also tried a different caption, for example: "Cinematic photo of a green parrot in the city wearing sunglasses and a black suit"
Small-small models, 40 seconds
Output:
Bottom line: the larger models give more detail and a better understanding of context, but generation takes far too long, with full memory usage and 100% GPU load.
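For anyone who wants to reproduce this kind of comparison, here is a rough way to measure it (my sketch, not the code used for the numbers above): wrap whatever sampling call you use and report wall-clock time plus peak VRAM via PyTorch. `sample_fn` is a placeholder, not a function from the repo.

```python
# Sketch: time a generation call and report peak GPU memory.
# `sample_fn` stands in for whatever sampling function the notebook calls.
import time
import torch

def benchmark(sample_fn, *args, **kwargs):
    torch.cuda.reset_peak_memory_stats()
    start = time.perf_counter()
    result = sample_fn(*args, **kwargs)
    torch.cuda.synchronize()  # make sure GPU work is finished before timing
    elapsed = time.perf_counter() - start
    peak_gb = torch.cuda.max_memory_allocated() / 1024 ** 3
    print(f"{elapsed:.1f} s, peak VRAM {peak_gb:.2f} GB")
    return result

# usage (caption from the post above):
# images = benchmark(sample_fn, caption="A green parrot with a knife in its paw threatens a fox")
```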
To fix the AttributeError above: you need to replace model_version and generator_checkpoint_path in the config when using the small models.

For the small Stage B model (stage_b_3b.yaml):
    model_version: 700M
    generator_checkpoint_path: models/stage_b_lite_bf16.safetensors

For the small Stage C model (stage_c_3b.yaml):
    model_version: 1B
    generator_checkpoint_path: models/stage_c_lite_bf16.safetensors
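If you'd rather not edit the YAML by hand, something like this works (a sketch assuming pyyaml and the default config locations in a repo checkout; the lite checkpoint filenames are the ones from the post above):

```python
# Sketch: point the inference configs at the small (lite) checkpoints.
# Paths are assumptions based on the default repo layout -- adjust as needed.
import yaml

overrides = {
    "configs/inference/stage_b_3b.yaml": {
        "model_version": "700M",
        "generator_checkpoint_path": "models/stage_b_lite_bf16.safetensors",
    },
    "configs/inference/stage_c_3b.yaml": {
        "model_version": "1B",
        "generator_checkpoint_path": "models/stage_c_lite_bf16.safetensors",
    },
}

for path, values in overrides.items():
    with open(path) as f:
        cfg = yaml.safe_load(f)
    cfg.update(values)
    with open(path, "w") as f:
        yaml.safe_dump(cfg, f, sort_keys=False)
    print(f"updated {path}: {values}")
```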
Also look at #4