
A latent text-to-image diffusion model

Results: 362 stable-diffusion issues

```
 56
 57 # encode the init image into latents and scale the latents
---> 58 init_latents = self.vae.encode(init_image.to(self.device)).sample()
 59 init_latents = 0.18215 * init_latents
 60
AttributeError: 'AutoencoderKLOutput' object has...
```
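A likely fix, assuming a recent diffusers version: `vae.encode()` returns an `AutoencoderKLOutput` that wraps the latent distribution in `.latent_dist` rather than exposing `sample()` directly.

```python
# Sketch of the usual fix for recent diffusers versions: encode() returns an
# AutoencoderKLOutput whose DiagonalGaussianDistribution sits in .latent_dist,
# so sample from that instead of calling .sample() on the output object.
init_latents = self.vae.encode(init_image.to(self.device)).latent_dist.sample()
init_latents = 0.18215 * init_latents  # Stable Diffusion's latent scaling factor
```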

Hello, I hope someone can help me understand why the KL is calculated as `0.5 * torch.sum(torch.pow(self.mean, 2) + self.var - 1.0 - self.logvar, dim=[1, 2, 3])` in the DiagonalGaussianDistribution...
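For reference, the closed-form KL between a diagonal Gaussian N(μ, σ²) and the standard normal is 0.5 · (μ² + σ² − 1 − log σ²) per dimension, which is exactly the quoted expression with `self.var = exp(self.logvar)`. A quick sanity check against `torch.distributions` (a sketch; the tensor shapes are illustrative):

```python
import torch
from torch.distributions import Normal, kl_divergence

# KL( N(mu, sigma^2) || N(0, 1) ) = 0.5 * (mu^2 + sigma^2 - 1 - log sigma^2),
# summed over the non-batch dimensions, as in DiagonalGaussianDistribution.kl().
mean = torch.randn(2, 4, 8, 8)
logvar = torch.randn(2, 4, 8, 8)
var = logvar.exp()

manual = 0.5 * torch.sum(mean.pow(2) + var - 1.0 - logvar, dim=[1, 2, 3])
reference = kl_divergence(
    Normal(mean, var.sqrt()), Normal(torch.zeros_like(mean), torch.ones_like(mean))
).sum(dim=[1, 2, 3])

assert torch.allclose(manual, reference, atol=1e-5)
```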

```
Loading pipeline components...: 100%|██████████| 7/7 [00:02
```

1. First move `self.logvar` to `self.device`, and then index using `t` (line 1030, ddpm.py). The current version of the code gives the following error:
```
RuntimeError: indices should be either...
```
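A sketch of the suggested reordering (the original line is roughly `logvar_t = self.logvar[t].to(self.device)`, which indexes a CPU tensor with a CUDA tensor `t`):

```python
# Move the tensor to the model's device first, then index with the CUDA
# tensor t; indexing a CPU tensor with CUDA indices raises the RuntimeError
# above on recent PyTorch versions.
logvar_t = self.logvar.to(self.device)[t]
```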

This error appeared when I ran the code following the instructions provided in the README. I'm hoping to get some help with this problem, thanks. ![case](https://user-images.githubusercontent.com/48081128/197160133-93c91080-a37e-4bb9-88ad-c9aa911d0228.jpg)

What are the default config files corresponding to the released checkpoints? For example, I want the Stable Diffusion config (or hyperparameter settings) with which the text-to-image model (txt2img-256) was trained. Where can I...
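For inference, the repo pairs the released checkpoints with YAML configs under `configs/`; a minimal sketch of inspecting one with OmegaConf (the path is assumed from this repo's layout, and the training-time config for txt2img-256 may live elsewhere):

```python
from omegaconf import OmegaConf

# Sketch: dump the model hyperparameters bundled with the repo's v1
# inference config. Path assumed from the CompVis/stable-diffusion layout.
config = OmegaConf.load("configs/stable-diffusion/v1-inference.yaml")
print(OmegaConf.to_yaml(config.model))
```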

Cat for a djembe

When I use the tokenizer, this error occurs:
```
File "/home/z/anaconda3/envs/t2v/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 3737, in _pad
    encoded_inputs["attention_mask"] = encoded_inputs["attention_mask"] + [0] * difference
OverflowError: cannot fit 'int' into an index-sized integer
```
I...
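A common workaround, assuming the overflow comes from padding to an unset `model_max_length` (some tokenizers use a huge sentinel value around 1e30 when no real limit is configured): pass an explicit `max_length`.

```python
from transformers import CLIPTokenizer

# Sketch: with padding="max_length" and no explicit max_length, the tokenizer
# may try to pad to a sentinel model_max_length (~1e30), producing the
# OverflowError above. Bounding it with an explicit max_length avoids that.
tokenizer = CLIPTokenizer.from_pretrained("openai/clip-vit-large-patch14")
batch = tokenizer(
    ["a photograph of an astronaut riding a horse"],
    padding="max_length",
    max_length=77,  # CLIP's context length, used by Stable Diffusion v1
    truncation=True,
    return_tensors="pt",
)
print(batch["input_ids"].shape)  # torch.Size([1, 77])
```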

https://github.com/CompVis/stable-diffusion/blob/21f890f9da3cfbeaba8e2ac3c425ee9e998d5229/ldm/modules/attention.py#L99 As I understand it, every attention implementation in this module except `SpatialSelfAttention` is set up with bias=False. Why is this one different? Any explanation would be greatly appreciated.
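For context, `SpatialSelfAttention` builds its q/k/v projections from 1×1 convolutions, which include a bias by default, whereas `CrossAttention` in the same file uses `nn.Linear(..., bias=False)`. A minimal sketch of the contrast (the dimensions are made up):

```python
import torch.nn as nn

# Illustrative contrast, not the full modules: a 1x1 conv projection keeps
# its default bias, while the linear projection disables it explicitly.
in_channels = 320

conv_q = nn.Conv2d(in_channels, in_channels, kernel_size=1, stride=1, padding=0)
linear_q = nn.Linear(in_channels, in_channels, bias=False)

print(conv_q.bias is not None)  # True: Conv2d includes a bias unless disabled
print(linear_q.bias)            # None
```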