
[Bug]: Grey Output image, thanks to update

Open · Blutkvlt opened this issue 1 year ago · 1 comment

Is there an existing issue for this?

  • [X] I have searched the existing issues and checked the recent builds/commits

What happened?

I wanted to test the ControlNet extension, so I updated my Automatic1111 via git pull and also installed ControlNet.

After this, every picture comes out grey. It doesn't matter which model I use. I also changed the VAE from "Automatic" to the downloaded vae-ft-mse-840000-ema-pruned.ckpt. Nope.

Steps to reproduce the problem

  1. Go to ....
  2. Press ....
  3. ...

What should have happened?

I think the update broke something in my setup. I also tried every kind of command line argument... nope.

Version or Commit where the problem happens

python: 3.10.6  •  torch: 1.13.1+rocm5.2  •  xformers: N/A  •  gradio: 3.16.2  •  commit: ea9bd9fc  •  checkpoint: e1441589a6

What Python version are you running on ?

Python 3.10.x

What platforms do you use to access the UI ?

Linux

What device are you running WebUI on?

AMD GPUs (RX 5000 below)

Cross attention optimization

Automatic

What browsers do you use to access the UI ?

Mozilla Firefox

Command Line Arguments

No

List of extensions

ControlNet sd-webui-controlnet stable-diffusion-webui-inspiration LDSR Lora ScuNET SwinIR prompt-backet-checker

Console logs

################################################################
Running on tom user
################################################################

################################################################
Repo already cloned, using it as install directory
################################################################

################################################################
Create and activate python venv
################################################################

################################################################
Launching launch.py...
################################################################
Python 3.10.6 (main, Nov 14 2022, 16:10:14) [GCC 11.3.0]
Commit hash: ea9bd9fc7409109adcd61b897abc2c8881161256
Installing requirements for Web UI

Launching Web UI with arguments: 
No module 'xformers'. Proceeding without it.
2023-06-28 10:53:37,146 - ControlNet - INFO - ControlNet v1.1.227
ControlNet preprocessor location: /home/tom/Desktop/Stable Diffusion/stable-diffusion-webui/extensions/sd-webui-controlnet/annotator/downloads
2023-06-28 10:53:37,188 - ControlNet - INFO - ControlNet v1.1.227
Loading weights [8634d80dec] from /home/tom/Desktop/Stable Diffusion/stable-diffusion-webui/models/Stable-diffusion/liberty_main.safetensors
Creating model from config: /home/tom/Desktop/Stable Diffusion/stable-diffusion-webui/configs/v1-inference.yaml
LatentDiffusion: Running in eps-prediction mode
DiffusionWrapper has 859.52 M params.
Applying cross attention optimization (Doggettx).
Textual inversion embeddings loaded(0): 
Model loaded in 1.8s (load config: 0.1s, create model: 0.3s, apply weights to model: 0.5s, apply half(): 0.4s, load VAE: 0.2s, move model to device: 0.2s).

Thanks for being a Gradio user! If you have questions or feedback, please join our Discord server and chat with us: https://discord.gg/feTf9x3ZSB
Running on local URL:  http://127.0.0.1:7860

To create a public link, set `share=True` in `launch()`.
Loading weights [2182245415] from /home/tom/Desktop/Stable Diffusion/stable-diffusion-webui/models/Stable-diffusion/inkpunkDiffusion_v2.ckpt
Applying cross attention optimization (Doggettx).
Weights loaded in 1.5s (load weights from disk: 1.0s, apply weights to model: 0.3s, move model to device: 0.2s).
  0%|                                                    | 0/20 [00:00<?, ?it/s]MIOpen(HIP): Warning [SQLiteBase] Missing system database file: gfx1030_20.kdb Performance may degrade. Please follow instructions to install: https://github.com/ROCmSoftwarePlatform/MIOpen#installing-miopen-kernels-package
100%|███████████████████████████████████████████| 20/20 [00:25<00:00,  1.28s/it]
Total progress: 100%|███████████████████████████| 20/20 [00:03<00:00,  5.37it/s]

Additional information

No response

Blutkvlt avatar Jun 28 '23 09:06 Blutkvlt

Currently webui is on torch==2.0.1+rocm5.4.2 for AMD systems. Try a fresh install, but I recommend keeping the other webui folder around in case you want to roll back. AMD card performance is undocumented, so it's too hard to keep track of which versions of things work.

ClashSAN avatar Jul 01 '23 05:07 ClashSAN

5700 XT here. All the images are a solid brownish-grey colour, but the GPU clocks rise appropriately during generation.

HeadstrongCatgirl avatar Aug 22 '23 03:08 HeadstrongCatgirl

Sorry, I worked on this some time ago but I never saw this issue before. First of all...

> Currently webui is on torch==2.0.1+rocm5.4.2 for AMD systems. Try a fresh install, but I recommend keeping the other webui folder around in case you want to roll back. AMD card performance is undocumented, so it's too hard to keep track of which versions of things work.

For RX 5000 cards the situation is pretty weird: they need PyTorch compiled against ROCm 5.2 or lower. I made a workaround some time ago that forces the old 1.13.1 version on those cards, so that part is intended, and it's probably not the issue.
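That workaround amounts to detecting a Navi 1 (RX 5000-series) card at launch and pinning an older PyTorch build. A minimal sketch of that kind of check — the function name and match patterns here are illustrative, not the actual webui.sh code:

```shell
# Hedged sketch: detect an RX 5000-series (Navi 10/14) card so the launcher
# can pin PyTorch to a ROCm 5.2 build. Patterns are illustrative only.
is_navi1() {
    # $1: GPU description, e.g. the output of `lspci | grep -i vga`
    case "$1" in
        *"Navi 1"*|*"RX 5700"*|*"RX 5600"*|*"RX 5500"*) return 0 ;;
        *) return 1 ;;
    esac
}

if is_navi1 "$(lspci 2>/dev/null | grep -i -E 'vga|display')"; then
    echo "Navi 1 card detected: forcing a ROCm 5.2 PyTorch build"
fi
```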

Also, there's an old nightly PyTorch 2.0 build which works with those cards. Still old, but better than 1.13.1. Open the webui-user.sh file and change the TORCH_COMMAND line to this: export TORCH_COMMAND="pip install https://download.pytorch.org/whl/nightly/rocm5.2/torch-2.0.0.dev20230209%2Brocm5.2-cp310-cp310-linux_x86_64.whl https://download.pytorch.org/whl/nightly/rocm5.2/torchvision-0.15.0.dev20230209%2Brocm5.2-cp310-cp310-linux_x86_64.whl"
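For readability, that edit in webui-user.sh would look like this (same wheel URLs as given above; this assumes a stock webui-user.sh):

```shell
# webui-user.sh — force the ROCm 5.2 nightly torch/torchvision wheels
# so the webui installer doesn't pull a newer ROCm build Navi 1 can't use
export TORCH_COMMAND="pip install https://download.pytorch.org/whl/nightly/rocm5.2/torch-2.0.0.dev20230209%2Brocm5.2-cp310-cp310-linux_x86_64.whl https://download.pytorch.org/whl/nightly/rocm5.2/torchvision-0.15.0.dev20230209%2Brocm5.2-cp310-cp310-linux_x86_64.whl"
```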

Also, because of https://github.com/ROCm/ROCm/issues/1857, these cards (just like some other older AMD cards) need the "--precision full" and "--no-half" flags. You also most likely need "--medvram" or even "--lowvram".

So, open webui-user.sh again and add export COMMANDLINE_ARGS="--precision full --no-half-vae --medvram"

...or just add those flags to your existing COMMANDLINE_ARGS line.

NOTE: version 1.8 broke my workaround for Navi 1 cards, but I made a PR for it: https://github.com/AUTOMATIC1111/stable-diffusion-webui/pull/15224. If you change the TORCH_COMMAND line in webui-user.sh as I wrote, it should work anyway.

Let me know if it works

DGdev91 avatar Mar 12 '24 00:03 DGdev91