
revision=fp16 doesn't work (RuntimeError: expected scalar type Half but found Float)

Open nguyenmeteorops opened this issue 2 years ago • 2 comments

Hello all,

I tried to set up stable-diffusion-2 with diffusers using revision="fp16", torch_dtype=torch.float16 via the following code, but I get RuntimeError: expected scalar type Half but found Float:

import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "stabilityai/stable-diffusion-2",
    revision="fp16",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")
prompt = "a photo of an astronaut riding a horse on mars"
image = pipe(prompt, height=768, width=768).images[0]
image.save("astronaut_rides_horse.png")

It works fine without revision="fp16" and torch_dtype=torch.float16. Does anyone know how to solve this?

Thank you,

nguyenmeteorops avatar Nov 30 '22 18:11 nguyenmeteorops

Full error traceback attached:

Traceback (most recent call last):
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/flask/app.py", line 2525, in wsgi_app
    response = self.full_dispatch_request()
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/flask/app.py", line 1822, in full_dispatch_request
    rv = self.handle_user_exception(e)
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/flask/app.py", line 1820, in full_dispatch_request
    rv = self.dispatch_request()
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/flask/app.py", line 1796, in dispatch_request
    return self.ensure_sync(self.view_functions[rule.endpoint])(**view_args)
  File "main.py", line 144, in index
    id, image_list, total_generation_time, seed = generate()
  File "main.py", line 75, in generate
    image_list = generate_image(prompt, negative_prompt, width, height, samples, num_inference_steps, guidance_scale, seed, id)
  File "main.py", line 120, in generate_image
    image = pipe(
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/torch/autograd/grad_mode.py", line 27, in decorate_context
    return func(*args, **kwargs)
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py", line 518, in __call__
    text_embeddings = self._encode_prompt(
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py", line 299, in _encode_prompt
    text_embeddings = self.text_encoder(
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/transformers/models/clip/modeling_clip.py", line 722, in forward
    return self.text_model(
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/transformers/models/clip/modeling_clip.py", line 643, in forward
    encoder_outputs = self.encoder(
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/transformers/models/clip/modeling_clip.py", line 574, in forward
    layer_outputs = encoder_layer(
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/transformers/models/clip/modeling_clip.py", line 317, in forward
    hidden_states, attn_weights = self.self_attn(
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl
    return forward_call(*input, **kwargs)
  File "/home/ubuntu/anaconda3/envs/ldm/lib/python3.8/site-packages/transformers/models/clip/modeling_clip.py", line 257, in forward
    attn_output = torch.bmm(attn_probs, value_states)
RuntimeError: expected scalar type Half but found Float

nguyenmeteorops avatar Nov 30 '22 18:11 nguyenmeteorops

Same error here.

lizhiustc avatar Dec 03 '22 16:12 lizhiustc

I had the same error, but after I updated the transformers library to the latest version, the error went away. Please try installing transformers 4.25.1.

shnhrtkyk avatar Dec 05 '22 05:12 shnhrtkyk

I had the same error, but after I updated the transformers library to the latest version, the error went away. Please try installing transformers 4.25.1. This works for me.

nguyenmeteorops avatar Dec 05 '22 05:12 nguyenmeteorops
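For later readers: since the fix above depends on running transformers 4.25.1 or newer, it can help to guard against an older install before loading the fp16 pipeline. A minimal sketch of a numeric dotted-version comparison, assuming plain "X.Y.Z" version strings (the helper name version_at_least is illustrative, not part of any library):

```python
def version_at_least(installed: str, required: str) -> bool:
    """Compare dotted version strings numerically, e.g. "4.25.1" vs "4.21.0"."""
    parse = lambda v: tuple(int(part) for part in v.split("."))
    return parse(installed) >= parse(required)

# Example check: warn if the installed version predates the release that
# resolved this error for the reporter (4.25.1).
if not version_at_least("4.21.0", "4.25.1"):
    print("transformers is too old; try: pip install -U 'transformers>=4.25.1'")
```

In practice you would pass transformers.__version__ as the installed value; the literal "4.21.0" above is just a placeholder for an outdated install.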