Results: 3 comments by 挨踢小茶

> @Narsil would something like this work now?
>
> ```
> model = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16)
> model.scheduler = DPMSolverMultistepScheduler.from_config(model.scheduler.config)
> model.unet.load_attn_procs(model_path, use_safetensors=True)  # model_path = 'xxx.safetensors'
> ```

I tried...
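For reference, here is a minimal, self-contained sketch of the quoted approach, assuming a diffusers release where `UNet2DConditionModel` exposes `load_attn_procs`; the checkpoint path and prompt are placeholders, not values from the original comment:

```
import torch
from diffusers import StableDiffusionPipeline, DPMSolverMultistepScheduler

# Load the base Stable Diffusion pipeline in half precision
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
)

# Swap in the DPM-Solver++ multistep scheduler, reusing the existing scheduler config
pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config)

# Load LoRA attention processors from a local .safetensors file (placeholder path)
pipe.unet.load_attn_procs("xxx.safetensors", use_safetensors=True)

pipe.to("cuda")
image = pipe("a photo of an astronaut riding a horse", num_inference_steps=25).images[0]
image.save("out.png")
```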

I used 8x A100 GPUs with the same settings and the message went away.