
TypeError: getattr(): attribute name must be string from "null_text_w_ptp.ipynb" file

jakeyahn opened this issue 1 year ago · 6 comments

I am trying to run the Jupyter notebook, and the third cell gives me the following error.

    scheduler = DDIMScheduler(beta_start=0.00085, beta_end=0.012, beta_schedule="scaled_linear", clip_sample=False, set_alpha_to_one=False)
    MY_TOKEN = ''
    LOW_RESOURCE = False
    NUM_DDIM_STEPS = 50
    GUIDANCE_SCALE = 7.5
    MAX_NUM_WORDS = 77
    device = torch.device('cuda:0') if torch.cuda.is_available() else torch.device('cpu')
    ldm_stable = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", use_auth_token=MY_TOKEN, scheduler=scheduler).to(device)
    try:
        ldm_stable.disable_xformers_memory_efficient_attention()
    except AttributeError:
        print("Attribute disable_xformers_memory_efficient_attention() is missing")
    tokenizer = ldm_stable.tokenizer


    TypeError                                 Traceback (most recent call last)
    Cell In[3], line 8
          6 MAX_NUM_WORDS = 77
          7 device = torch.device('cuda:0') if torch.cuda.is_available() else torch.device('cpu')
    ----> 8 ldm_stable = StableDiffusionPipeline.from_pretrained("CompVis/stable-diffusion-v1-4", use_auth_token=MY_TOKEN, scheduler=scheduler).to(device)
          9 try:
         10     ldm_stable.disable_xformers_memory_efficient_attention()

    File ~/anaconda3/envs/p2p/lib/python3.8/site-packages/diffusers/pipeline_utils.py:373, in DiffusionPipeline.from_pretrained(cls, pretrained_model_name_or_path, **kwargs)
        370 if issubclass(class_obj, class_candidate):
        371     load_method_name = importable_classes[class_name][1]
    --> 373 load_method = getattr(class_obj, load_method_name)
        375 loading_kwargs = {}
        376 if issubclass(class_obj, torch.nn.Module):

    TypeError: getattr(): attribute name must be string

Any comments?

All the other Jupyter notebooks work fine.

  • I also tried Stable Diffusion v2.1, and it didn't work either :(

jakeyahn avatar Mar 06 '23 09:03 jakeyahn

You can try pip install diffusers==0.10.0 rather than 0.3.0
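
For reference, here is a minimal sketch of the same setup against diffusers 0.10.x (the version pin and the omission of use_auth_token are my assumptions; adjust to whatever release you actually have installed):

    # Minimal sketch, assuming diffusers==0.10.0 (pip install diffusers==0.10.0)
    # and that the model is accessible with your Hugging Face credentials.
    import diffusers
    import torch
    from diffusers import DDIMScheduler, StableDiffusionPipeline

    print(diffusers.__version__)  # expect 0.10.0

    scheduler = DDIMScheduler(
        beta_start=0.00085,
        beta_end=0.012,
        beta_schedule="scaled_linear",
        clip_sample=False,
        set_alpha_to_one=False,
    )
    device = torch.device("cuda:0") if torch.cuda.is_available() else torch.device("cpu")
    ldm_stable = StableDiffusionPipeline.from_pretrained(
        "CompVis/stable-diffusion-v1-4",
        scheduler=scheduler,
    ).to(device)
    tokenizer = ldm_stable.tokenizer

If the getattr() error persists after upgrading, double check that the notebook kernel was restarted so the new diffusers version is actually the one being imported.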

somuchtome avatar Mar 09 '23 05:03 somuchtome

That solved the problem, but it raised a new one:

TypeError: PNDMScheduler.set_timesteps() got an unexpected keyword argument 'offset'

How should I fix it?

moonnnpie avatar Mar 14 '23 12:03 moonnnpie

It was working fine with diffusers==0.3.0 until recently.

It's not working now. Any potential fix?

zed1025 avatar May 05 '23 04:05 zed1025

That solved the problem, but it raised a new one:

TypeError: PNDMScheduler.set_timesteps() got an unexpected keyword argument 'offset'

How should I fix it?

Yes, I got the same issue. Neither diffusers 0.3.0 nor 0.10.0 worked for me.

lindapu-1 avatar Jul 04 '23 06:07 lindapu-1

That solved the problem, but it raised a new one:

TypeError: PNDMScheduler.set_timesteps() got an unexpected keyword argument 'offset'

How should I fix it?

Just comment out the offset kwargs around line 163 of ptp_utils.py and call model.scheduler.set_timesteps without them:

    # set timesteps
    # extra_set_kwargs = {"offset": 1}
    # model.scheduler.set_timesteps(num_inference_steps, **extra_set_kwargs)
    model.scheduler.set_timesteps(num_inference_steps)
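
If you would rather stay compatible with older diffusers releases too, here is a small sketch (my own suggestion, not from the repo) that only forwards offset when the installed scheduler still accepts it; model and num_inference_steps come from the enclosing function in ptp_utils.py:

    # Sketch: forward `offset` only if the installed scheduler's
    # set_timesteps() still declares that parameter.
    import inspect

    extra_set_kwargs = {}
    if "offset" in inspect.signature(model.scheduler.set_timesteps).parameters:
        extra_set_kwargs["offset"] = 1
    model.scheduler.set_timesteps(num_inference_steps, **extra_set_kwargs)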

XuejiFang avatar Jul 13 '23 10:07 XuejiFang

Just try this: https://github.com/google/prompt-to-prompt/issues/29#issuecomment-1398825757

yuanzhi-zhu avatar Aug 06 '23 02:08 yuanzhi-zhu