stable-diffusion-webui
[Bug]: Python exception when /sdapi/v1/options called during outstanding POST /sdapi/v1/txt2img
Is there an existing issue for this?
- [X] I have searched the existing issues and checked the recent builds/commits
What happened?
Everything used to work smoothly on 8d12a729b8b036cb765cf2d87576d5ae256135c8, but on the commit listed below I'm hitting issues when making concurrent requests to the API. Specifically, it seems to fail when I call the options endpoint while a POST request to txt2img is already made and waiting.
```
Traceback (most recent call last):
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 407, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\uvicorn\middleware\proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\fastapi\applications.py", line 271, in __call__
    await super().__call__(scope, receive, send)
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\starlette\applications.py", line 125, in __call__
    await self.middleware_stack(scope, receive, send)
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 184, in __call__
    raise exc
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\base.py", line 105, in __call__
    await response(scope, receive, send)
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 266, in __call__
    async with anyio.create_task_group() as task_group:
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\anyio\_backends\_asyncio.py", line 662, in __aexit__
    raise exceptions[0]
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 269, in wrap
    await func()
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\starlette\responses.py", line 251, in stream_response
    await send(
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\starlette\middleware\errors.py", line 159, in _send
    await send(message)
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 494, in send
    output = self.conn.send(event)
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 468, in send
    data_list = self.send_with_data_passthrough(event)
  File "C:\Users\you\.tensorscale\stable-diffusion-webui\venv\lib\site-packages\h11\_connection.py", line 483, in send_with_data_passthrough
    raise LocalProtocolError("Can't send data when our state is ERROR")
h11._util.LocalProtocolError: Can't send data when our state is ERROR
```
Steps to reproduce the problem
- Call /sdapi/v1/options
- (Maybe) Call /sdapi/v1/progress after options call
- Start POST to /sdapi/v1/txt2img
The client then gets an EOF. A trace of the problem looks like the traceback above, with the txt2img response erroring out and never writing back at the end.
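The reproduction steps above can be scripted. This is a minimal sketch, assuming webui is running locally with `--api` on the default port 7860; the txt2img payload fields are just illustrative:

```python
import json
import threading
import urllib.request

BASE = "http://127.0.0.1:7860"  # assumed default webui address


def build_post(path, payload):
    """Build a JSON POST request for an API endpoint."""
    return urllib.request.Request(
        BASE + path,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )


def repro():
    # Kick off a slow txt2img generation in the background...
    req = build_post("/sdapi/v1/txt2img", {"prompt": "a cat", "steps": 50})
    t = threading.Thread(target=urllib.request.urlopen, args=(req,))
    t.start()
    # ...then hit the options endpoint while txt2img is still outstanding.
    urllib.request.urlopen(BASE + "/sdapi/v1/options").read()
    t.join()


if __name__ == "__main__":
    repro()
```

On the affected commit, the background txt2img request is the one that should trip the `LocalProtocolError` on the server side.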
What should have happened?
No exception
Commit where the problem happens
ea9bd9fc7409109adcd61b897abc2c8881161256
What platforms do you use to access the UI ?
Windows
What browsers do you use to access the UI ?
No response
Command Line Arguments
--api
List of extensions
n/a
Console logs
The full logs are a bit of a mess, but the traceback above is a good summary.
Additional information
No response
I'm doing some experiments; it's possible this might be happening even without the concurrent call to the options endpoint in some cases, but I wanted to jot it down.
If you have any overlapping API calls, behavior is undefined; it simply isn't designed to be a production-level API.
But, having said that, you may have better luck if you start webui with the `--gradio-queue` param.
Thanks, I'll look into that. It seemed to queue concurrent txt2img requests up previously (8d12a729b8b036cb765cf2d87576d5ae256135c8); I haven't tested whether it still does. I'll have to see if something changed around that option, or if it's unrelated.
@vladmandic @nathanleclaire Any luck solving this issue?
Quite a few people are having it now: https://github.com/AUTOMATIC1111/stable-diffusion-webui/issues/10625
In my case, I don't usually see it these days because I put a mutex around the goroutine's HTTP call. I deliberately keep very few API calls in flight at once, with the exception of txt2img and progress side by side. I usually see that error when the load is too high.
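In Python terms (the commenter's client is in Go, and the names here are hypothetical), that mutex pattern amounts to wrapping the HTTP call so requests never overlap:

```python
import threading


class SerializedClient:
    """Wraps an API-calling function so at most one call is in flight."""

    def __init__(self, call):
        self._call = call              # the underlying HTTP call
        self._lock = threading.Lock()  # the mutex serializing requests

    def __call__(self, *args, **kwargs):
        # Only one thread may issue a request at a time; others block here.
        with self._lock:
            return self._call(*args, **kwargs)
```

A progress poller and a txt2img sender would then share one `SerializedClient` instance (or separate ones, if you deliberately want those two endpoints side by side as described above).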
@nathanleclaire Thanks for the tip. Do you do any paid consulting? This issue is really giving us trouble.
I just tried on my repo and cannot reproduce: I had generate run for 100 images and sent both progress and options requests from a second window twice per second for the duration of the generation.
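That twice-per-second polling test can be sketched as a small helper; this is an assumed shape, not the actual test code, and `fn` would be an HTTP GET against `/sdapi/v1/progress` or `/sdapi/v1/options`:

```python
import threading


def poll(fn, hz, stop):
    """Call fn() at roughly the given frequency until the stop event is set.

    Returns the number of calls made. For the test above, fn would issue
    the progress/options request and stop would be set once generation ends.
    """
    period = 1.0 / hz
    count = 0
    while not stop.is_set():
        fn()
        count += 1
        stop.wait(period)  # sleep one period, but wake early if stopped
    return count
```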