prompt-to-prompt
Hi, I'm creating a comparison between Prompt-to-Prompt, Null-text Inversion, and other editing approaches using images from their respective papers. Could you please share them (haykpoghos[at]gmail[dot]com), or make them...
Hi, thanks for this wonderful work! I have a question about the equation for deterministic DDIM sampling in the Null-text Inversion paper. Based on my understanding, deterministic DDIM sampling...
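For context on the equation being asked about, here is a minimal numpy sketch of the deterministic DDIM update (η = 0). The function and variable names are illustrative, and the noise prediction `eps` is a fixed toy array rather than a network output:

```python
import numpy as np

def ddim_step(x_t, eps, alpha_bar_t, alpha_bar_prev):
    """One deterministic DDIM step (eta = 0):
    x_{t-1} = sqrt(a_prev) * x0_pred + sqrt(1 - a_prev) * eps,
    where x0_pred = (x_t - sqrt(1 - a_t) * eps) / sqrt(a_t)."""
    x0_pred = (x_t - np.sqrt(1.0 - alpha_bar_t) * eps) / np.sqrt(alpha_bar_t)
    return np.sqrt(alpha_bar_prev) * x0_pred + np.sqrt(1.0 - alpha_bar_prev) * eps

# Sanity check: when alpha_bar does not change, the step leaves x_t unchanged.
x_t = np.array([1.0, -0.5])
eps = np.array([0.1, 0.2])
out = ddim_step(x_t, eps, 0.5, 0.5)
```

Because the update is deterministic, running it with the same `eps` is fully reversible, which is what Null-text Inversion's DDIM inversion relies on.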
Hi, this question is about the linear projections l_Q, l_K, l_V of the attention module in the Prompt-to-Prompt paper. The paper states that the linear projections are learnable. However, in...
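To make the role of l_Q, l_K, l_V concrete, here is a toy numpy sketch of cross-attention with explicit projection matrices. The shapes and random weights are purely illustrative; in a trained diffusion model these matrices are learned during pretraining and are typically frozen at editing time:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

# Illustrative dimensions: d_model=8, d_head=4, 5 image tokens, 3 text tokens.
d_model, d_head = 8, 4
W_Q = rng.normal(size=(d_model, d_head))  # l_Q: learned weights, applied as a matmul
W_K = rng.normal(size=(d_model, d_head))  # l_K
W_V = rng.normal(size=(d_model, d_head))  # l_V

x_img = rng.normal(size=(5, d_model))  # spatial (image) features -> queries
x_txt = rng.normal(size=(3, d_model))  # text embeddings -> keys and values

Q, K, V = x_img @ W_Q, x_txt @ W_K, x_txt @ W_V
A = softmax(Q @ K.T / np.sqrt(d_head))  # cross-attention map, one row per query
out = A @ V
```

Each row of `A` is a probability distribution over text tokens; this is the map that Prompt-to-Prompt manipulates, while the projections themselves stay fixed.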
Hi, I am running your notebook `prompt-to-prompt_stable.ipynb`. I would like to load the weights of a fine-tuned diffusion model into the pipeline. A reproducible section of the...
Thanks for your excellent work! While digging into the code of Null-Text Inversion, I found something confusing. First, according to the formula in your paper, DDIM inversion is written as...
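For reference when comparing formulas, here is a toy numpy sketch of one DDIM inversion step next to the forward sampling step. Names are illustrative, and `eps` is a fixed array; the common approximation in inversion is reusing the noise predicted at x_t for the step t → t+1:

```python
import numpy as np

def ddim_step(x_t, eps, alpha_bar_t, alpha_bar_prev):
    """Deterministic DDIM sampling step (t -> t-1)."""
    x0_pred = (x_t - np.sqrt(1.0 - alpha_bar_t) * eps) / np.sqrt(alpha_bar_t)
    return np.sqrt(alpha_bar_prev) * x0_pred + np.sqrt(1.0 - alpha_bar_prev) * eps

def ddim_invert_step(x_t, eps, alpha_bar_t, alpha_bar_next):
    """DDIM inversion step (t -> t+1): the same update run 'backwards',
    reusing eps predicted at x_t as an approximation."""
    x0_pred = (x_t - np.sqrt(1.0 - alpha_bar_t) * eps) / np.sqrt(alpha_bar_t)
    return np.sqrt(alpha_bar_next) * x0_pred + np.sqrt(1.0 - alpha_bar_next) * eps

# With a *fixed* eps, invert-then-sample round-trips exactly; with a network,
# eps differs between the two directions, which is the usual source of drift.
x = np.array([0.3, -1.2])
eps = np.array([0.05, 0.4])
x_next = ddim_invert_step(x, eps, 0.9, 0.6)
x_back = ddim_step(x_next, eps, 0.6, 0.9)
```

The round-trip check makes the structure of the two formulas easy to compare against the paper's notation.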
I am trying to run the Jupyter notebook, and the third block gives me the following error.
scheduler = DDIMScheduler(beta_start=0.00085, beta_end=0.012, beta_schedule="scaled_linear", clip_sample=False, set_alpha_to_one=False)
MY_TOKEN = ''
LOW_RESOURCE = False
NUM_DDIM_STEPS...
This is really great work, thanks for open-sourcing it. Currently I am trying to change the pipeline to support the img2img task and then edit the resulting image, but it failed by...
The model scheduler looks like:
PNDMScheduler {
  "_class_name": "PNDMScheduler",
  "_diffusers_version": "0.8.0",
  "beta_end": 0.012,
  "beta_schedule": "scaled_linear",
  "beta_start": 0.00085,
  "clip_sample": false,
  "num_train_timesteps": 1000,
  "set_alpha_to_one": false,
  "skip_prk_steps": true,
  "steps_offset": 1,...
Hi, I tried to make the standing cat sit down, but nothing changed. I hope to receive help. Thank you very much.
Hi, I found that attention map swapping is performed after the softmax operation. In that case, the swapped similarities may no longer sum to 1. I wonder if...
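The concern can be checked with a toy numpy example (all shapes and values illustrative): replacing whole per-query rows of a post-softmax map preserves normalization, whereas mixing individual per-token columns across two maps generally does not:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
A_src = softmax(rng.normal(size=(4, 3)))  # attention map from the source prompt
A_tgt = softmax(rng.normal(size=(4, 3)))  # attention map from the edited prompt

# Swapping entire rows keeps each row a valid distribution (sums to 1)...
A_swap = A_src.copy()
A_swap[:2] = A_tgt[:2]

# ...but splicing a single token's column from the other map breaks it.
A_mix = A_src.copy()
A_mix[:, 0] = A_tgt[:, 0]
```

This is a sketch of the normalization question only, not of the repository's actual swapping code.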