
Running guidance with FastAPI

Open FunnyPhantom opened this issue 2 years ago • 4 comments

Hey, this is not necessarily an issue with guidance itself, but I had some problems running guidance with FastAPI and thought others might bump into the same problem. FastAPI uses uvicorn as its web server, and by default uvicorn uses its own event loop implementation (uvloop), which is incompatible with the event loop patching that guidance does via nest_asyncio, I believe. Because of this incompatibility, guidance programs could not be executed while FastAPI was running.
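For context, the underlying clash is about nested event loops: the standard library refuses to start a new loop from inside an already-running one, which is why nest_asyncio-style patching exists at all (and uvloop's loop class cannot be patched that way). A minimal stdlib-only sketch of the restriction, with no guidance or uvloop involved:

```python
import asyncio

async def handler():
    # Inside a running loop (e.g. a FastAPI endpoint), trying to start
    # a second loop the way synchronous code would fails outright.
    try:
        asyncio.run(asyncio.sleep(0))
    except RuntimeError as e:
        return str(e)
    return "no error"

# Start one loop, then attempt to nest another inside it.
print(asyncio.run(handler()))
```

Libraries like nest_asyncio work around this by monkey-patching the pure-Python asyncio loop; uvloop's C-implemented loop rejects that patching, which produces the ValueError below.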

This can be fixed by starting uvicorn with the --loop asyncio option, which forces it to use the standard asyncio event loop instead of uvloop.

If you run into an error such as:

ValueError: Can't patch loop of type <class 'uvloop.Loop'>

This might be a way to fix it. 👀
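Concretely (assuming your FastAPI app object lives in main.py; the module path is just an illustration):

```shell
# Force uvicorn onto the standard asyncio loop instead of uvloop
uvicorn main:app --loop asyncio
```

The same can be done programmatically via uvicorn.run("main:app", loop="asyncio") if you launch the server from Python.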

FunnyPhantom avatar Jun 04 '23 08:06 FunnyPhantom

Thanks for sharing! Another alternative might be to pass async_mode=True when creating your guidance program, in which case it will use the currently running event loop.

slundberg avatar Jun 04 '23 14:06 slundberg

Is one better than the other? For some reason, I can't get async_mode=True to work. I get an error about accessing conversation, which works just fine in non-async mode.

This is python 3.11.3 on mac.

Traceback (most recent call last):
  File "path_to_dir/uvicorn/protocols/http/httptools_impl.py", line 435, in run_asgi
    result = await app(  # type: ignore[func-returns-value]
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "path_to_dir/uvicorn/middleware/proxy_headers.py", line 78, in __call__
    return await self.app(scope, receive, send)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "path_to_dir/fastapi/applications.py", line 282, in __call__
    await super().__call__(scope, receive, send)
  File "path_to_dir/starlette/applications.py", line 122, in __call__
    await self.middleware_stack(scope, receive, send)
  File "path_to_dir/starlette/middleware/errors.py", line 184, in __call__
    raise exc
  File "path_to_dir/starlette/middleware/errors.py", line 162, in __call__
    await self.app(scope, receive, _send)
  File "path_to_dir/starlette/middleware/exceptions.py", line 79, in __call__
    raise exc
  File "path_to_dir/starlette/middleware/exceptions.py", line 68, in __call__
    await self.app(scope, receive, sender)
  File "path_to_dir/fastapi/middleware/asyncexitstack.py", line 20, in __call__
    raise e
  File "path_to_dir/fastapi/middleware/asyncexitstack.py", line 17, in __call__
    await self.app(scope, receive, send)
  File "path_to_dir/starlette/routing.py", line 718, in __call__
    await route.handle(scope, receive, send)
  File "path_to_dir/starlette/routing.py", line 276, in handle
    await self.app(scope, receive, send)
  File "path_to_dir/starlette/routing.py", line 66, in app
    response = await func(request)
               ^^^^^^^^^^^^^^^^^^^
  File "path_to_dir/fastapi/routing.py", line 241, in app
    raw_response = await run_endpoint_function(
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "path_to_dir/fastapi/routing.py", line 167, in run_endpoint_function
    return await dependant.call(**values)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "path/to/projectapp/main.py", line 98, in start_demo
    m.respond_to_message("Hello")
  File "path/to/projectapp/messenger.py", line 62, in respond_to_message
    resp: str = self.model.respond(inbound_msg)
                ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "path/to/projectapp/brain.py", line 98, in respond
    clean_response = result['conversation'][-1]['question'].replace(
                     ~~~~~~^^^^^^^^^^^^^^^^
  File "path_to_dir/guidance/_program.py", line 470, in __getitem__
    return self._variables[key]
           ~~~~~~~~~~~~~~~^^^^^
KeyError: 'conversation'

What am I doing wrong?

bllchmbrs avatar Jul 01 '23 00:07 bllchmbrs

Hi, for anyone passing by and having this error :

Another way to solve it: just add the argument async_mode=True to your guidance program and use it as an async method.

model = guidance.llms.Transformers(
    model_name,
    tokenizer=tokenizer,
    torch_dtype=torch.bfloat16,
    device_map="cuda:1",
    max_position_embeddings=4096,
    quantization_config=quantization_config,
)
guidance.llm = model

valid_answers = ["yes", "no"]
classifier = guidance("""Say yes
{{select "answer" options=valid_answers}}</s>""",
    async_mode=True,
)
output = classifier(valid_answers=valid_answers)
print(output["answer"] == "yes")

jgcb00 avatar Nov 02 '23 17:11 jgcb00

Is this solved? Can it be closed?

ryanpeach avatar Apr 02 '24 18:04 ryanpeach