
Quality: small vs. large models

Open · CyberTimon opened this issue · 3 comments

Has anyone already compared the quality of the small models vs the big models? I'm quite interested in the difference. If other people would also like to have more info on this, I can test it and report back here.

CyberTimon · Feb 14 '24

I can't even get the small models to load currently. I'm using the image-to-image notebook and have never made it past the "Load Extras & Models" cell. I finally got it to download the big models, which the script will not run without, but I constantly get this error:

```
--> 178 for param_name, param in load_or_fail(self.config.generator_checkpoint_path).items():
    179     set_module_tensor_to_device(generator, param_name, "cpu", value=param)
    180 generator = generator.to(dtype).to(self.device)

AttributeError: 'NoneType' object has no attribute 'items'
```

That's if I don't use the big-big option in model_download.sh; and of course when I do use it, I go straight to OOM on a T4 Colab.

TheOneTrueGuy · Feb 17 '24

RTX 3080 (10 GB VRAM). Caption: "A green parrot with a knife in its paw threatens a fox"

Big-big models:

- 47 minutes to generate
- Full memory usage, 100% GPU load
- Output: Parrot_fox_big

Small-small models:

- ~40 **seconds** to generate
- Full memory usage, 100% GPU load
- Output: Parrot_fox_small

The output looks a little crude compared to the larger models, but it saves a lot of time.

With a different caption, e.g. "Cinematic photo of an green parrot in the city wearing sunglasses and a black suit", the small-small models again took ~40 seconds. Output: Parrot_sunglasses

Bottom line: the larger models produce more detail and a better understanding of context, but generation takes far too long, with full memory and GPU load either way.

Deeps358 · Feb 25 '24

> I can't even get the small models to load currently. [...] AttributeError: 'NoneType' object has no attribute 'items' [...] I go straight to OOM on a T4 colab.
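That AttributeError means `load_or_fail` returned `None` instead of a state dict, which happens when `generator_checkpoint_path` in the config points at a checkpoint that was never downloaded. A minimal stand-in reproduces the shape of the failure (this `load_or_fail` is a hypothetical sketch for illustration, not the repo's actual implementation):

```python
import os

def load_or_fail(path):
    # Hypothetical sketch of the loader's behavior: return None (rather
    # than raising) when the checkpoint file is absent, so the caller's
    # later .items() call blows up with the AttributeError above.
    if path is None or not os.path.exists(path):
        return None
    return {}  # the real code would return the loaded state dict here

state = load_or_fail("models/stage_c_bf16.safetensors")
try:
    for name, param in state.items():
        pass
except AttributeError as e:
    print(e)  # prints: 'NoneType' object has no attribute 'items'
```

So the fix is not in that loop; it is making the checkpoint path in the config agree with a file that actually exists on disk.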

You need to change `model_version` and `generator_checkpoint_path` in the config files to point at the small models.

For the small Stage B model (`stage_b_3b.yaml`):

```yaml
model_version: 700M
generator_checkpoint_path: models/stage_b_lite_bf16.safetensors
```

For the small Stage C model (`stage_c_3b.yaml`):

```yaml
model_version: 1B
generator_checkpoint_path: models/stage_c_lite_bf16.safetensors
```
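If you switch between model sizes often, the two config edits can be scripted. This is a minimal sketch that rewrites flat top-level `key: value` lines in place; the `configs/inference/...` paths are an assumption about the repo layout, and a real YAML parser would be more robust:

```python
import os

# Small-model overrides described above. The config paths are an
# assumption about the repo layout; adjust them to where the yaml
# files actually live in your checkout.
OVERRIDES = {
    "configs/inference/stage_b_3b.yaml": {
        "model_version": "700M",
        "generator_checkpoint_path": "models/stage_b_lite_bf16.safetensors",
    },
    "configs/inference/stage_c_3b.yaml": {
        "model_version": "1B",
        "generator_checkpoint_path": "models/stage_c_lite_bf16.safetensors",
    },
}

def patch_config_text(text: str, patch: dict) -> str:
    """Rewrite top-level `key: value` lines whose key appears in `patch`.

    Naive line-level rewrite: assumes flat, unindented YAML keys.
    """
    out = []
    for line in text.splitlines():
        key = line.split(":", 1)[0].strip()
        if ":" in line and not line.startswith((" ", "\t")) and key in patch:
            out.append(f"{key}: {patch[key]}")
        else:
            out.append(line)
    return "\n".join(out) + "\n"

for path, patch in OVERRIDES.items():
    if not os.path.exists(path):
        print(f"not found, skipping: {path}")
        continue
    with open(path) as f:
        text = f.read()
    with open(path, "w") as f:
        f.write(patch_config_text(text, patch))
```

Run it from the repo root; any other keys in the config files are left untouched.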

Also look at #4

Deeps358 · Feb 26 '24