
win10 wsl docker error . Traceback (most recent call last): ....webui.py", line 137, in <module> ..../webui.py", line 105, in webui app, local_url, share_url = demo.launch(

Open pengkaiwei opened this issue 2 years ago • 10 comments


webui-docker-auto-1  | Loaded a total of 0 textual inversion embeddings.
webui-docker-auto-1  | Traceback (most recent call last):
webui-docker-auto-1  |   File "/stable-diffusion-webui/repositories/stable-diffusion/../../webui.py", line 137, in <module>
webui-docker-auto-1  |     webui()
webui-docker-auto-1  |   File "/stable-diffusion-webui/repositories/stable-diffusion/../../webui.py", line 105, in webui
webui-docker-auto-1  |     app, local_url, share_url = demo.launch(
webui-docker-auto-1  |   File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1118, in launch
webui-docker-auto-1  |     raise ValueError(

Tks!

Describe the bug

Which UI

docker compose --profile hlky up --build & docker compose --profile auto up --build

.wslconfig:

[wsl2]
memory=32GB

Hardware / Software

  • OS: Windows 10
  • OS version: 21H2
  • WSL version (if applicable): wsl2
  • Docker Version: 20.10.20
  • Docker compose version: v2.12.0
  • Repo version: from master 42cc17da74ed28636905bdccb4eeddf1037fc84c
  • RAM: 128G
  • GPU/VRAM: RTX A4000

Steps to Reproduce

  1. Go to 'docker compose --profile auto up --build'
  2. Click on '....'
  3. Scroll down to '....'
  4. See error
PS D:\wsl\docker-desktop-data\stable-diffusion-webui-docker> docker compose --profile auto up --build
[+] Building 4.6s (31/32)
 => [internal] load build definition from Dockerfile                                                                0.0s
 => => transferring dockerfile: 32B                                                                                 0.0s
 => [internal] load .dockerignore                                                                                   0.0s
 => => transferring context: 2B                                                                                     0.0s
 => resolve image config for docker.io/docker/dockerfile:1                                                          1.9s
 => CACHED docker-image://docker.io/docker/dockerfile:1@sha256:9ba7531bd80fb0a858632727cf7a112fbfd19b17e94c4e84ced  0.0s
 => [internal] load build definition from Dockerfile                                                                0.0s
 => [internal] load .dockerignore                                                                                   0.0s
 => [internal] load metadata for docker.io/library/python:3.10-slim                                                 2.5s
 => [internal] load metadata for docker.io/alpine/git:2.36.2                                                        2.4s
 => [internal] load build context                                                                                   0.0s
 => => transferring context: 118B                                                                                   0.0s
 => [xformers 1/3] FROM docker.io/library/python:3.10-slim@sha256:685b1c2ef40bd3ded77b3abd0965d5c16d19a20469be0ac0  0.0s
 => [download 1/6] FROM docker.io/alpine/git:2.36.2@sha256:ec491c893597b68c92b88023827faa771772cfd5e106b76c713fa5e  0.0s
 => CACHED [stage-2  2/14] RUN pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 --extra-index-url https:/  0.0s
 => CACHED [stage-2  3/14] RUN apt-get update && apt install fonts-dejavu-core rsync git -y && apt-get clean        0.0s
 => CACHED [stage-2  4/14] RUN <<EOF (git clone https://github.com/AUTOMATIC1111/stable-diffusion-webui.git...)     0.0s
 => CACHED [download 2/6] RUN git clone https://github.com/CompVis/stable-diffusion.git repositories/stable-diffus  0.0s
 => CACHED [download 3/6] RUN git clone https://github.com/sczhou/CodeFormer.git repositories/CodeFormer && cd rep  0.0s
 => CACHED [download 4/6] RUN git clone https://github.com/salesforce/BLIP.git repositories/BLIP && cd repositorie  0.0s
 => CACHED [download 5/6] RUN <<EOF (# because taming-transformers is huge...)                                      0.0s
 => CACHED [download 6/6] RUN git clone https://github.com/crowsonkb/k-diffusion.git repositories/k-diffusion && c  0.0s
 => CACHED [stage-2  5/14] COPY --from=download /git/ /stable-diffusion-webui                                       0.0s
 => CACHED [stage-2  6/14] RUN pip install --prefer-binary --no-cache-dir -r /stable-diffusion-webui/repositories/  0.0s
 => CACHED [stage-2  7/14] RUN apt-get install jq moreutils -y                                                      0.0s
 => CACHED [stage-2  8/14] RUN <<EOF (cd stable-diffusion-webui...)                                                 0.0s
 => CACHED [stage-2  9/14] RUN pip install --prefer-binary --no-cache-dir opencv-python-headless   git+https://git  0.0s
 => CACHED [xformers 2/3] RUN pip install gdown                                                                     0.0s
 => CACHED [xformers 3/3] RUN gdown https://drive.google.com/uc?id=1SqwicrLx1TrG_sbbEoIF_3TUHd4EYSmw -O /wheel.whl  0.0s
 => CACHED [stage-2 10/14] COPY --from=xformers /wheel.whl xformers-0.0.14.dev0-cp310-cp310-linux_x86_64.whl        0.0s
 => CACHED [stage-2 11/14] RUN pip install xformers-0.0.14.dev0-cp310-cp310-linux_x86_64.whl                        0.0s
 => CACHED [stage-2 12/14] COPY . /docker                                                                           0.0s
 => CACHED [stage-2 13/14] RUN <<EOF (chmod +x /docker/mount.sh && python3 /docker/info.py /stable-diffusion-webui  0.0s
 => CACHED [stage-2 14/14] WORKDIR /stable-diffusion-webui/repositories/stable-diffusion                            0.0s
 => exporting to image                                                                                              0.0s
 => => exporting layers                                                                                             0.0s
 => => writing image sha256:cf5907a81e16b2bb487f4b836c44fadb3cec49ac461f99c724e50e5d4a391ee7                        0.0s
 => => naming to docker.io/library/webui-docker-auto                                                                0.0s

Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
[+] Running 1/0
 - Container webui-docker-auto-1  Created                                                                           0.0s
Attaching to webui-docker-auto-1
webui-docker-auto-1  | + /docker/mount.sh
webui-docker-auto-1  | Mounted .cache
webui-docker-auto-1  | Mounted LDSR
webui-docker-auto-1  | Mounted Hypernetworks
webui-docker-auto-1  | Mounted GFPGAN
webui-docker-auto-1  | Mounted RealESRGAN
webui-docker-auto-1  | Mounted ScuNET
webui-docker-auto-1  | Mounted .cache
webui-docker-auto-1  | Mounted StableDiffusion
webui-docker-auto-1  | Mounted embeddings
webui-docker-auto-1  | Mounted ESRGAN
webui-docker-auto-1  | Mounted config.json
webui-docker-auto-1  | Mounted SwinIR
webui-docker-auto-1  | Mounted ui-config.json
webui-docker-auto-1  | Mounted BSRGAN
webui-docker-auto-1  | Mounted Codeformer
webui-docker-auto-1  | + python3 -u ../../webui.py --listen --port 7860 --ckpt-dir /stable-diffusion-webui/models/Stable-diffusion --allow-code --medvram --xformers
webui-docker-auto-1  | LatentDiffusion: Running in eps-prediction mode
webui-docker-auto-1  | DiffusionWrapper has 859.52 M params.
webui-docker-auto-1  | making attention of type 'vanilla' with 512 in_channels
webui-docker-auto-1  | Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
webui-docker-auto-1  | making attention of type 'vanilla' with 512 in_channels
webui-docker-auto-1  | Loading weights [7460a6fa] from /stable-diffusion-webui/models/Stable-diffusion/model.ckpt
webui-docker-auto-1  | Global Step: 470000
webui-docker-auto-1  | Applying xformers cross attention optimization.
webui-docker-auto-1  | Model loaded.
webui-docker-auto-1  | Loaded a total of 0 textual inversion embeddings.
webui-docker-auto-1  | Traceback (most recent call last):
webui-docker-auto-1  |   File "/stable-diffusion-webui/repositories/stable-diffusion/../../webui.py", line 137, in <module>
webui-docker-auto-1  |     webui()
webui-docker-auto-1  |   File "/stable-diffusion-webui/repositories/stable-diffusion/../../webui.py", line 105, in webui
webui-docker-auto-1  |     app, local_url, share_url = demo.launch(
webui-docker-auto-1  |   File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1118, in launch
webui-docker-auto-1  |     raise ValueError(
webui-docker-auto-1  | ValueError: When running in Google Colab or when localhost is not accessible, a shareable link must be created. Please set share=True.
webui-docker-auto-1 exited with code 1
PS D:\wsl\docker-desktop-data\stable-diffusion-webui-docker> docker compose --profile hlky up --build
[+] Building 3.9s (18/18) FINISHED
 => [internal] load build definition from Dockerfile                                                               0.0s
 => => transferring dockerfile: 32B                                                                                0.0s
 => [internal] load .dockerignore                                                                                  0.0s
 => => transferring context: 2B                                                                                    0.0s
 => resolve image config for docker.io/docker/dockerfile:1                                                         2.2s
 => CACHED docker-image://docker.io/docker/dockerfile:1@sha256:9ba7531bd80fb0a858632727cf7a112fbfd19b17e94c4e84ce  0.0s
 => [internal] load .dockerignore                                                                                  0.0s
 => [internal] load build definition from Dockerfile                                                               0.0s
 => [internal] load metadata for docker.io/continuumio/miniconda3:4.12.0                                           1.3s
 => [internal] load build context                                                                                  0.0s
 => => transferring context: 158B                                                                                  0.0s
 => [1/9] FROM docker.io/continuumio/miniconda3:4.12.0@sha256:977263e8d1e476972fddab1c75fe050dd3cd17626390e874448  0.0s
 => CACHED [2/9] RUN conda install python=3.8.5 && conda clean -a -y                                               0.0s
 => CACHED [3/9] RUN conda install pytorch==1.11.0 torchvision==0.12.0 cudatoolkit=11.3 -c pytorch && conda clean  0.0s
 => CACHED [4/9] RUN apt-get update && apt install fonts-dejavu-core rsync gcc -y && apt-get clean                 0.0s
 => CACHED [5/9] RUN <<EOF (git config --global http.postBuffer 1048576000...)                                     0.0s
 => CACHED [6/9] RUN <<EOF (cd stable-diffusion...)                                                                0.0s
 => CACHED [7/9] COPY . /docker/                                                                                   0.0s
 => CACHED [8/9] RUN <<EOF (python /docker/info.py /stable-diffusion/frontend/frontend.py...)                      0.0s
 => CACHED [9/9] WORKDIR /stable-diffusion                                                                         0.0s
 => exporting to image                                                                                             0.0s
 => => exporting layers                                                                                            0.0s
 => => writing image sha256:7079cdcbffbbcffee200782fb56df65f950881a8771cabb4517127a6bb2cbad9                       0.0s
 => => naming to docker.io/library/webui-docker-hlky                                                               0.0s

Use 'docker scan' to run Snyk tests against images to find vulnerabilities and learn how to fix them
[+] Running 1/0
 - Container webui-docker-hlky-1  Created                                                                          0.0s
Attaching to webui-docker-hlky-1
webui-docker-hlky-1  | + /docker/mount.sh
webui-docker-hlky-1  | Mounted .cache
webui-docker-hlky-1  | Mounted LDSR
webui-docker-hlky-1  | Mounted RealESRGAN
webui-docker-hlky-1  | Mounted StableDiffusion
webui-docker-hlky-1  | Mounted GFPGANv1.4.pth
webui-docker-hlky-1  | Mounted GFPGANv1.4.pth
webui-docker-hlky-1  | + /docker/run.sh
webui-docker-hlky-1  | USE_STREAMLIT = 0
webui-docker-hlky-1  | Found GFPGAN
webui-docker-hlky-1  | Found RealESRGAN
webui-docker-hlky-1  | Found LDSR
webui-docker-hlky-1  | Loading model from /data/StableDiffusion/model.ckpt
webui-docker-hlky-1  | Global Step: 470000
webui-docker-hlky-1  | UNet: Running in eps-prediction mode
webui-docker-hlky-1  | CondStage: Running in eps-prediction mode
webui-docker-hlky-1  | FirstStage: Running in eps-prediction mode
webui-docker-hlky-1  | making attention of type 'vanilla' with 512 in_channels
webui-docker-hlky-1  | Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
webui-docker-hlky-1  | making attention of type 'vanilla' with 512 in_channels
webui-docker-hlky-1  | Error: Port: 7860 is not open yet. Please wait, this may take upwards of 60 seconds...
webui-docker-hlky-1  | Rerunning server... use `close()` to stop if you need to change `launch()` parameters.
webui-docker-hlky-1  | ----
webui-docker-hlky-1  | Exception in thread Gradio Server Thread:
webui-docker-hlky-1  | Traceback (most recent call last):
webui-docker-hlky-1  |   File "/opt/conda/lib/python3.8/threading.py", line 932, in _bootstrap_inner
webui-docker-hlky-1  |     self.run()
webui-docker-hlky-1  |   File "scripts/webui.py", line 2613, in run
webui-docker-hlky-1  |     self.demo.launch(**gradio_params)
webui-docker-hlky-1  |   File "/opt/conda/lib/python3.8/site-packages/gradio/blocks.py", line 1107, in launch
webui-docker-hlky-1  |     raise ValueError(
webui-docker-hlky-1  | ValueError: When running in Google Colab or when localhost is not accessible, a shareable link must be created. Please set share=True.
webui-docker-hlky-1 exited with code 0

like this one https://github.com/sd-webui/stable-diffusion-webui/issues/1054

pengkaiwei avatar Oct 21 '22 07:10 pengkaiwei

@pengkaiwei Google Colab was never really a target for this container, but you can try supplying the share parameter anyway and see if it works:

Create a new file called docker-compose.override.yml in the root of the repo and add the following:

services:
  auto:
    environment:
      - CLI_ARGS=--allow-code --xformers --share
  hlky:
    environment:
      - CLI_ARGS=--share

and try running again.

AbdBarho avatar Oct 21 '22 15:10 AbdBarho

@pengkaiwei Google Colab was never really a target for this container, but you can try supplying the share parameter anyway and see if it works:

Create a new file called docker-compose.override.yml in the root of the repo and add the following:

services:
  auto:
    environment:
      - CLI_ARGS=--allow-code --xformers --share
  hlky:
    environment:
      - CLI_ARGS=--share

and try running again.

I'm sorry, I made a mistake. I'm running it on my PC, not Google Colab.


webui-docker-auto-1  | Loaded a total of 0 textual inversion embeddings.
webui-docker-auto-1  | Traceback (most recent call last):
webui-docker-auto-1  |   File "/stable-diffusion-webui/repositories/stable-diffusion/../../webui.py", line 137, in <module>
webui-docker-auto-1  |     webui()
webui-docker-auto-1  |   File "/stable-diffusion-webui/repositories/stable-diffusion/../../webui.py", line 105, in webui
webui-docker-auto-1  |     app, local_url, share_url = demo.launch(
webui-docker-auto-1  |   File "/usr/local/lib/python3.10/site-packages/gradio/blocks.py", line 1118, in launch
webui-docker-auto-1  |     raise ValueError(

pengkaiwei avatar Oct 22 '22 00:10 pengkaiwei

The question here is why the container can't access the network. This is probably related to your Docker config. Do you have any special Docker configurations? Is it a new install? Are you running the commands from within WSL or from PowerShell?

AbdBarho avatar Oct 22 '22 08:10 AbdBarho

The WSL2 install is new, and Docker for Windows is new as well. However, I did set an HTTP proxy for Docker; that is how my Docker can access the network. Could that be the reason?

Tks.

This is my C:\Users\user\.docker\config.json:

{
  "credsStore": "desktop",
  "proxies": {
    "default": {
      "httpProxy": "http://host.docker.internal:10811",
      "httpsProxy": "http://host.docker.internal:10811"
    }
  }
}

pengkaiwei avatar Oct 22 '22 16:10 pengkaiwei

why are you using host.docker.internal for the proxy? is your proxy running in a container? (there is another user who was also using a proxy #154)

Maybe this helps? https://github.com/gradio-app/gradio/issues/1747#issuecomment-1181225687 TL;DR: add "noProxy": "localhost" to your "default" object.
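For reference, the merged file would then look something like this. This is just a sketch based on the config.json you posted above; it assumes you want to keep the existing proxy entries (including port 10811) unchanged:

{
  "credsStore": "desktop",
  "proxies": {
    "default": {
      "httpProxy": "http://host.docker.internal:10811",
      "httpsProxy": "http://host.docker.internal:10811",
      "noProxy": "localhost"
    }
  }
}

With "noProxy": "localhost", requests from inside the container to localhost should bypass the proxy, which is what gradio's port check needs.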

AbdBarho avatar Oct 22 '22 16:10 AbdBarho

The proxy is v2rayN running on the host Windows machine, not in a container. Docker on Windows proxies to the host PC; 127.0.0.1 doesn't reach the host from inside a container, so host.docker.internal points at the host Windows IP.

https://docs.docker.com/desktop/networking/#there-is-no-docker0-bridge-on-the-host

add

                "noProxy": "localhost"

I will try later, tks!

pengkaiwei avatar Oct 22 '22 17:10 pengkaiwei

why are you using host.docker.internal for the proxy? is your proxy running in a container? (there is another user who was also using a proxy #154)

Maybe this helps? gradio-app/gradio#1747 (comment) TL;DR: add "noProxy": "localhost" to your "default" object.

I reset the .docker\config.json and deleted the proxy settings.

After docker compose --profile hlky up --build I can open the webui, but after rendering a picture the container automatically closes.

Tks.

{
  "credsStore": "desktop"
}

Mounted .cache
Mounted LDSR
Mounted RealESRGAN
Mounted StableDiffusion
Mounted GFPGANv1.4.pth
Mounted GFPGANv1.4.pth
USE_STREAMLIT = 0
Found GFPGAN
Found RealESRGAN
Found LDSR
Loading model from /data/StableDiffusion/model.ckpt
+ /docker/mount.sh
+ /docker/run.sh
Traceback (most recent call last):
  File "scripts/webui.py", line 530, in <module>
    model,modelCS,modelFS,device, config = load_SD_model()
  File "scripts/webui.py", line 478, in load_SD_model
    sd = load_sd_from_config(opt.ckpt)
  File "scripts/webui.py", line 229, in load_sd_from_config
    pl_sd = torch.load(ckpt, map_location="cpu")
  File "/opt/conda/lib/python3.8/site-packages/torch/serialization.py", line 699, in load
    with _open_file_like(f, 'rb') as opened_file:
  File "/opt/conda/lib/python3.8/site-packages/torch/serialization.py", line 231, in _open_file_like
    return _open_file(name_or_buffer, mode)
  File "/opt/conda/lib/python3.8/site-packages/torch/serialization.py", line 212, in __init__
    super(_open_file, self).__init__(open(name, mode))
FileNotFoundError: [Errno 2] No such file or directory: '/data/StableDiffusion/model.ckpt'
+ /docker/mount.sh
+ /docker/run.sh
Downloading: "https://github.com/DagnyT/hardnet/raw/master/pretrained/train_liberty_with_aug/checkpoint_liberty_with_aug.pth" to /root/.cache/torch/hub/checkpoints/checkpoint_liberty_with_aug.pth
100%|██████████| 5.10M/5.10M [00:02<00:00, 1.98MB/s]
Downloading: 100%|██████████| 939k/939k [00:01<00:00, 655kB/s]  
Downloading: 100%|██████████| 512k/512k [00:01<00:00, 522kB/s] 
Downloading: 100%|██████████| 389/389 [00:00<00:00, 258kB/s]
Downloading: 100%|██████████| 905/905 [00:00<00:00, 621kB/s]
Downloading: 100%|██████████| 4.41k/4.41k [00:00<00:00, 2.29MB/s]
Downloading:  12%|
Mounted .cache
Mounted LDSR
Mounted RealESRGAN
Mounted StableDiffusion
Mounted GFPGANv1.4.pth
Mounted GFPGANv1.4.pth
USE_STREAMLIT = 0
Found GFPGAN
Found RealESRGAN
Found LDSR
Loading model from /data/StableDiffusion/model.ckpt
Global Step: 470000
UNet: Running in eps-prediction mode
CondStage: Running in eps-prediction mode
Downloading:  93%|█████████▎| 1.48G/1.59G
FirstStage: Running in eps-prediction mode
making attention of type 'vanilla' with 512 in_channels
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla' with 512 in_channels
Mounted .cache
Mounted LDSR
Mounted RealESRGAN
Mounted StableDiffusion
Mounted GFPGANv1.4.pth
Mounted GFPGANv1.4.pth
USE_STREAMLIT = 0
Found GFPGAN
Found RealESRGAN
Found LDSR
Loading model from /data/StableDiffusion/model.ckpt
Global Step: 470000
UNet: Running in eps-prediction mode
CondStage: Running in eps-prediction mode
FirstStage: Running in eps-prediction mode
making attention of type 'vanilla' with 512 in_channels
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla' with 512 in_channels
Iteration: 1/1
Current prompt: cat
[MemMon] Recording max memory usage...

[MemMon] Stopped recording.

Mounted .cache
Mounted LDSR
Mounted RealESRGAN
Mounted StableDiffusion
Mounted GFPGANv1.4.pth
Mounted GFPGANv1.4.pth
USE_STREAMLIT = 0
Downloading: 100%|██████████| 1.59G/1.59G [03:46<00:00, 7.55MB/s]
Exception in thread Gradio Server Thread:
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "scripts/webui.py", line 2613, in run
    self.demo.launch(**gradio_params)
  File "/opt/conda/lib/python3.8/site-packages/gradio/blocks.py", line 1107, in launch
    raise ValueError(
ValueError: When running in Google Colab or when localhost is not accessible, a shareable link must be created. Please set share=True.
+ /docker/mount.sh
+ /docker/run.sh
Exception in thread Gradio Server Thread:
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "scripts/webui.py", line 2613, in run
    self.demo.launch(**gradio_params)
  File "/opt/conda/lib/python3.8/site-packages/gradio/blocks.py", line 1107, in launch
    raise ValueError(
ValueError: When running in Google Colab or when localhost is not accessible, a shareable link must be created. Please set share=True.
100%|██████████| 50/50 [00:14<00:00,  3.52it/s]
+ /docker/mount.sh
+ /docker/run.sh
Found GFPGAN
Found RealESRGAN
Found LDSR
Loading model from /data/StableDiffusion/model.ckpt
Global Step: 470000
UNet: Running in eps-prediction mode
CondStage: Running in eps-prediction mode
FirstStage: Running in eps-prediction mode
making attention of type 'vanilla' with 512 in_channels
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla' with 512 in_channels
Exception in thread Gradio Server Thread:
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "scripts/webui.py", line 2613, in run
    self.demo.launch(**gradio_params)
  File "/opt/conda/lib/python3.8/site-packages/gradio/blocks.py", line 1107, in launch
    raise ValueError(
ValueError: When running in Google Colab or when localhost is not accessible, a shareable link must be created. Please set share=True.
Iteration: 1/1
Current prompt: girl holding a cat
[MemMon] Recording max memory usage...

100%|██████████| 50/50 [00:13<00:00,  3.66it/s]
[MemMon] Stopped recording.

Iteration: 1/1
Current prompt: girl holding a white cat
[MemMon] Recording max memory usage...

 26%|██▌       | 13/50 [00:03<00:10,  3.55it/s]terminate called without an active exception
/docker/run.sh: line 10:    39 Aborted                 python3 -u scripts/webui.py --outdir /output --ckpt /data/StableDiffusion/model.ckpt ${CLI_ARGS}

pengkaiwei avatar Oct 22 '22 17:10 pengkaiwei

This is a very weird problem. My first guess is that the container is killed because it uses too much memory, but you have enough RAM.

What is the output of docker inspect webui-docker-hlky-1? (Feel free to remove all personal information.)

AbdBarho avatar Oct 22 '22 18:10 AbdBarho

This is a very weird problem. My first guess is that the container is killed because it uses too much memory, but you have enough RAM.

What is the output of docker inspect webui-docker-hlky-1? (Feel free to remove all personal information.)

This is my docker inspect webui-docker-hlky-1 output. Tks!

[
    {
        "Id": "70a74822ae4c7fb1720624a04d826fd9becfd36bd704c56a78b07f3cb62ce0ee",
        "Created": "2022-10-21T11:33:39.270329195Z",
        "Path": "/bin/bash",
        "Args": [
            "-ceuxo",
            "pipefail",
            "/docker/mount.sh \u0026\u0026 /docker/run.sh"
        ],
        "State": {
            "Status": "exited",
            "Running": false,
            "Paused": false,
            "Restarting": false,
            "OOMKilled": false,
            "Dead": false,
            "Pid": 0,
            "ExitCode": 134,
            "Error": "",
            "StartedAt": "2022-10-22T17:47:01.405814198Z",
            "FinishedAt": "2022-10-22T17:48:36.803256932Z"
        },
        "Image": "sha256:70ba932d346fc829112237bd3ebf32cf39b5c923708eedff83c1dd2807fee50d",
        "ResolvConfPath": "/var/lib/docker/containers/70a74822ae4c7fb1720624a04d826fd9becfd36bd704c56a78b07f3cb62ce0ee/resolv.conf",
        "HostnamePath": "/var/lib/docker/containers/70a74822ae4c7fb1720624a04d826fd9becfd36bd704c56a78b07f3cb62ce0ee/hostname",
        "HostsPath": "/var/lib/docker/containers/70a74822ae4c7fb1720624a04d826fd9becfd36bd704c56a78b07f3cb62ce0ee/hosts",
        "LogPath": "/var/lib/docker/containers/70a74822ae4c7fb1720624a04d826fd9becfd36bd704c56a78b07f3cb62ce0ee/70a74822ae4c7fb1720624a04d826fd9becfd36bd704c56a78b07f3cb62ce0ee-json.log",
        "Name": "/webui-docker-hlky-1",
        "RestartCount": 0,
        "Driver": "overlay2",
        "Platform": "linux",
        "MountLabel": "",
        "ProcessLabel": "",
        "AppArmorProfile": "",
        "ExecIDs": null,
        "HostConfig": {
            "Binds": [
                "E:\\wsl\\docker-desktop-data\\stable-diffusion-webui-docker\\output:/output:rw",
                "E:\\wsl\\docker-desktop-data\\stable-diffusion-webui-docker\\data:/data:rw"
            ],
            "ContainerIDFile": "",
            "LogConfig": {
                "Type": "json-file",
                "Config": {}
            },
            "NetworkMode": "webui-docker_default",
            "PortBindings": {
                "7860/tcp": [
                    {
                        "HostIp": "",
                        "HostPort": "7860"
                    }
                ]
            },
            "RestartPolicy": {
                "Name": "",
                "MaximumRetryCount": 0
            },
            "AutoRemove": false,
            "VolumeDriver": "",
            "VolumesFrom": null,
            "CapAdd": null,
            "CapDrop": null,
            "CgroupnsMode": "host",
            "Dns": [],
            "DnsOptions": [],
            "DnsSearch": [],
            "ExtraHosts": [],
            "GroupAdd": null,
            "IpcMode": "private",
            "Cgroup": "",
            "Links": null,
            "OomScoreAdj": 0,
            "PidMode": "",
            "Privileged": false,
            "PublishAllPorts": false,
            "ReadonlyRootfs": false,
            "SecurityOpt": null,
            "UTSMode": "",
            "UsernsMode": "",
            "ShmSize": 67108864,
            "Runtime": "runc",
            "ConsoleSize": [
                0,
                0
            ],
            "Isolation": "",
            "CpuShares": 0,
            "Memory": 0,
            "NanoCpus": 0,
            "CgroupParent": "",
            "BlkioWeight": 0,
            "BlkioWeightDevice": null,
            "BlkioDeviceReadBps": null,
            "BlkioDeviceWriteBps": null,
            "BlkioDeviceReadIOps": null,
            "BlkioDeviceWriteIOps": null,
            "CpuPeriod": 0,
            "CpuQuota": 0,
            "CpuRealtimePeriod": 0,
            "CpuRealtimeRuntime": 0,
            "CpusetCpus": "",
            "CpusetMems": "",
            "Devices": null,
            "DeviceCgroupRules": null,
            "DeviceRequests": [
                {
                    "Driver": "nvidia",
                    "Count": 0,
                    "DeviceIDs": [
                        "0"
                    ],
                    "Capabilities": [
                        [
                            "gpu"
                        ]
                    ],
                    "Options": null
                }
            ],
            "KernelMemory": 0,
            "KernelMemoryTCP": 0,
            "MemoryReservation": 0,
            "MemorySwap": 0,
            "MemorySwappiness": null,
            "OomKillDisable": false,
            "PidsLimit": null,
            "Ulimits": null,
            "CpuCount": 0,
            "CpuPercent": 0,
            "IOMaximumIOps": 0,
            "IOMaximumBandwidth": 0,
            "MaskedPaths": [
                "/proc/asound",
                "/proc/acpi",
                "/proc/kcore",
                "/proc/keys",
                "/proc/latency_stats",
                "/proc/timer_list",
                "/proc/timer_stats",
                "/proc/sched_debug",
                "/proc/scsi",
                "/sys/firmware"
            ],
            "ReadonlyPaths": [
                "/proc/bus",
                "/proc/fs",
                "/proc/irq",
                "/proc/sys",
                "/proc/sysrq-trigger"
            ]
        },
        "GraphDriver": {
            "Data": {
                "LowerDir": "/var/lib/docker/overlay2/e41e1ee9072308edfebffd69b36bf546b11a57c355967e81c0db8f7b0a5bcb89-init/diff:/var/lib/docker/overlay2/czb6rgip5z0q3bdh10vzze4nl/diff:/var/lib/docker/overlay2/759mlvv435kq8av1kfyzhtagr/diff:/var/lib/docker/overlay2/zh89zrm07l508st5cly8p8kj1/diff:/var/lib/docker/overlay2/r16vt586colbmrt2ri7tzenoa/diff:/var/lib/docker/overlay2/me4m0phajbl52tbsc6b67hvzb/diff:/var/lib/docker/overlay2/jvcazyxuhxxjva85ordo7t1mw/diff:/var/lib/docker/overlay2/sn54jym8ndy1eon7gxh2wxspn/diff:/var/lib/docker/overlay2/kkafrecfez6gdoi21il82grmd/diff:/var/lib/docker/overlay2/c5998f7dc2a20f31780f0df945b2bb9a931310c03816a5fa8ddccf6ccd3d6444/diff:/var/lib/docker/overlay2/c28c70a26d0915f626f38e9155ebbd53aeb29cad1784d605ca50d1f4292306d3/diff:/var/lib/docker/overlay2/f42fc8e92559d7c7f6127c9a150169df1f0837da5f8a596d64885972695be68c/diff",
                "MergedDir": "/var/lib/docker/overlay2/e41e1ee9072308edfebffd69b36bf546b11a57c355967e81c0db8f7b0a5bcb89/merged",
                "UpperDir": "/var/lib/docker/overlay2/e41e1ee9072308edfebffd69b36bf546b11a57c355967e81c0db8f7b0a5bcb89/diff",
                "WorkDir": "/var/lib/docker/overlay2/e41e1ee9072308edfebffd69b36bf546b11a57c355967e81c0db8f7b0a5bcb89/work"
            },
            "Name": "overlay2"
        },
        "Mounts": [
            {
                "Type": "bind",
                "Source": "E:\\wsl\\docker-desktop-data\\stable-diffusion-webui-docker\\data",
                "Destination": "/data",
                "Mode": "rw",
                "RW": true,
                "Propagation": "rprivate"
            },
            {
                "Type": "bind",
                "Source": "E:\\wsl\\docker-desktop-data\\stable-diffusion-webui-docker\\output",
                "Destination": "/output",
                "Mode": "rw",
                "RW": true,
                "Propagation": "rprivate"
            }
        ],
        "Config": {
            "Hostname": "70a74822ae4c",
            "Domainname": "",
            "User": "",
            "AttachStdin": false,
            "AttachStdout": true,
            "AttachStderr": true,
            "ExposedPorts": {
                "7860/tcp": {}
            },
            "Tty": false,
            "OpenStdin": false,
            "StdinOnce": false,
            "Env": [
                "USE_STREAMLIT=0",
                "HTTP_PROXY=http://host.docker.internal:10811",
                "http_proxy=http://host.docker.internal:10811",
                "HTTPS_PROXY=http://host.docker.internal:10811",
                "https_proxy=http://host.docker.internal:10811",
                "CLI_ARGS=--optimized-turbo",
                "PATH=/opt/conda/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin",
                "LANG=C.UTF-8",
                "LC_ALL=C.UTF-8",
                "DEBIAN_FRONTEND=noninteractive",
                "PYTHONPATH=:",
                "STREAMLIT_SERVER_HEADLESS=true"
            ],
            "Cmd": [
                "/bin/bash",
                "-ceuxo",
                "pipefail",
                "/docker/mount.sh \u0026\u0026 /docker/run.sh"
            ],
            "Image": "webui-docker-hlky",
            "Volumes": {
                "/data": {},
                "/output": {}
            },
            "WorkingDir": "/stable-diffusion",
            "Entrypoint": null,
            "OnBuild": null,
            "Labels": {
                "com.docker.compose.config-hash": "ba50ef8365eba2a27c4d63033cda67783d2f58e0935ed38fab0997f7986e3f16",
                "com.docker.compose.container-number": "1",
                "com.docker.compose.depends_on": "",
                "com.docker.compose.image": "sha256:70ba932d346fc829112237bd3ebf32cf39b5c923708eedff83c1dd2807fee50d",
                "com.docker.compose.oneoff": "False",
                "com.docker.compose.project": "webui-docker",
                "com.docker.compose.project.config_files": "E:\\wsl\\docker-desktop-data\\stable-diffusion-webui-docker\\docker-compose.yml",
                "com.docker.compose.project.working_dir": "E:\\wsl\\docker-desktop-data\\stable-diffusion-webui-docker",
                "com.docker.compose.service": "hlky",
                "com.docker.compose.version": "2.12.0",
                "maintainer": "Anaconda, Inc"
            }
        },
        "NetworkSettings": {
            "Bridge": "",
            "SandboxID": "29b68f33e614417c270958f1624e285fbb386d55e3e4f9bdec54b47558e7ba84",
            "HairpinMode": false,
            "LinkLocalIPv6Address": "",
            "LinkLocalIPv6PrefixLen": 0,
            "Ports": {},
            "SandboxKey": "/var/run/docker/netns/29b68f33e614",
            "SecondaryIPAddresses": null,
            "SecondaryIPv6Addresses": null,
            "EndpointID": "",
            "Gateway": "",
            "GlobalIPv6Address": "",
            "GlobalIPv6PrefixLen": 0,
            "IPAddress": "",
            "IPPrefixLen": 0,
            "IPv6Gateway": "",
            "MacAddress": "",
            "Networks": {
                "webui-docker_default": {
                    "IPAMConfig": null,
                    "Links": null,
                    "Aliases": [
                        "webui-docker-hlky-1",
                        "hlky",
                        "70a74822ae4c"
                    ],
                    "NetworkID": "afa276be98b43664b51377b6b92e9da0cfd8024042b922e8e9c52cac4526154e",
                    "EndpointID": "",
                    "Gateway": "",
                    "IPAddress": "",
                    "IPPrefixLen": 0,
                    "IPv6Gateway": "",
                    "GlobalIPv6Address": "",
                    "GlobalIPv6PrefixLen": 0,
                    "MacAddress": "",
                    "DriverOpts": null
                }
            }
        }
    }
]

pengkaiwei avatar Oct 22 '22 19:10 pengkaiwei

This is a very weird problem. My first guess was that the container is being killed because it uses too much memory, but you have enough RAM.

What is the output of `docker inspect webui-docker-hlky-1`? (Feel free to remove any personal information.)

I tried deleting the proxy settings from the container's Env.

It seems to be working fine now.

            "Env": [
                "HTTP_PROXY=http://host.docker.internal:10811",
                "http_proxy=http://host.docker.internal:10811",
                "HTTPS_PROXY=http://host.docker.internal:10811",
                "https_proxy=http://host.docker.internal:10811",
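If the proxy is still needed inside the container (e.g. for model downloads), an alternative I have not verified here would be to keep it but exempt local addresses, so Gradio's startup self-check is not routed through the proxy. A minimal sketch using a Compose override file (the `hlky` service name is the one from this repo's docker-compose.yml; the address list is an assumption):

```shell
# Sketch: keep the proxy but bypass it for local addresses.
# Written as an override file so docker-compose.yml stays untouched.
cat > docker-compose.override.yml <<'EOF'
services:
  hlky:
    environment:
      - NO_PROXY=localhost,127.0.0.1,0.0.0.0
      - no_proxy=localhost,127.0.0.1,0.0.0.0
EOF
```

Compose picks up `docker-compose.override.yml` automatically on the next `docker compose --profile hlky up --build`.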

Docker log:

Mounted .cache
Mounted LDSR
Mounted RealESRGAN
Mounted StableDiffusion
Mounted GFPGANv1.4.pth
Mounted GFPGANv1.4.pth
USE_STREAMLIT = 0
Found GFPGAN
Found RealESRGAN
Found LDSR
Loading model from /data/StableDiffusion/model.ckpt
Global Step: 470000
UNet: Running in eps-prediction mode
CondStage: Running in eps-prediction mode
FirstStage: Running in eps-prediction mode
making attention of type 'vanilla' with 512 in_channels
Working with z of shape (1, 4, 32, 32) = 4096 dimensions.
making attention of type 'vanilla' with 512 in_channels
Running on local URL:  http://0.0.0.0:7860

To create a public link, set `share=True` in `launch()`.
Iteration: 1/1
Current prompt: A combination of a vacuum cleaner and a scrubber.
[MemMon] Recording max memory usage...

+ /docker/mount.sh
+ /docker/run.sh
100%|██████████| 50/50 [00:14<00:00,  3.53it/s]
[MemMon] Stopped recording.

Iteration: 1/1
Current prompt: A combination of a vacuum cleaner and a scrubber.
[MemMon] Recording max memory usage...

100%|██████████| 50/50 [00:14<00:00,  3.44it/s]
[MemMon] Stopped recording.

Iteration: 1/1
Current prompt: girl holding a white cat
[MemMon] Recording max memory usage...

100%|██████████| 50/50 [00:14<00:00,  3.56it/s]
[MemMon] Stopped recording.
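For context on why removing the proxy variables likely helps: after binding the server, Gradio verifies it is reachable by requesting the local URL, and Python's `urllib` honors `http_proxy`/`HTTP_PROXY` even for localhost unless `no_proxy` exempts it, so the self-check goes out through the proxy and fails, which would make `demo.launch()` raise. A quick standalone check (the proxy URL is just the one from the inspect output above):

```python
import os
import urllib.request

# Simulate the container's environment from the inspect output above.
os.environ["http_proxy"] = "http://host.docker.internal:10811"
os.environ.pop("no_proxy", None)

# urllib picks up the proxy for ALL http requests, including localhost...
print(urllib.request.getproxies().get("http"))

# ...because nothing bypasses it until no_proxy says so.
print(urllib.request.proxy_bypass_environment("localhost"))  # falsy: proxy is used

os.environ["no_proxy"] = "localhost,127.0.0.1"
print(urllib.request.proxy_bypass_environment("localhost"))  # truthy: proxy skipped
```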

pengkaiwei avatar Oct 22 '22 20:10 pengkaiwei

This issue is stale because it has been open 14 days with no activity. Remove stale label or comment or this will be closed in 7 days.

github-actions[bot] avatar Nov 06 '22 03:11 github-actions[bot]

This issue was closed because it has been stalled for 7 days with no activity.

github-actions[bot] avatar Nov 14 '22 00:11 github-actions[bot]