openai-python

APIConnectionError raised when using AsyncOpenAI with FastAPI and uvicorn (uvloop)

Open BadrElfarri opened this issue 11 months ago • 3 comments

Confirm this is an issue with the Python library and not an underlying OpenAI API

  • [X] This is an issue with the Python library

Describe the bug

An exception is raised when using AsyncOpenAI with FastAPI and uvicorn. The issue seems to be a compatibility problem with uvloop, which is pulled in through the extras = ["standard"].

fastapi = "0.115.6"
uvicorn = {extras = ["standard"], version = "0.32.1"}
openai = "^1.54.4"

Installing the dependencies manually and leaving out uvloop works:

fastapi = "0.115.6"
uvicorn = "0.32.1"
# from extras = ["standard"]
websockets = "14.1"
watchfiles = "1.0.0"
pyyaml = "6.0.2"
python-dotenv = "1.0.1"
httptools = "0.6.4"
# uvloop = "0.21.0" leads to the OpenAI async exception
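
An alternative workaround (not verified in this thread) might be to keep the "standard" extras, including uvloop, installed but force uvicorn onto the stock asyncio event loop via its loop setting; the CLI equivalent is `uvicorn main:app --loop asyncio`.

# Sketch of a possible workaround, assuming the problem is specific to uvloop:
# keep the "standard" extras installed but run uvicorn on the plain asyncio loop.
import uvicorn

if __name__ == "__main__":
    # loop="asyncio" skips uvloop even when it is installed
    uvicorn.run("main:app", host="0.0.0.0", port=50051, loop="asyncio")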

To Reproduce

  1. Create a pyproject.toml with the Poetry dependencies:

     [tool.poetry.dependencies]
     python = ">=3.11,<3.12"
     fastapi = "0.115.6"
     uvicorn = {extras = ["standard"], version = "0.32.1"}
     openai = "^1.54.4"

  2. Run the app, main.py.

  3. Call the endpoint http://localhost:50051/test.

Exception:

    File "/Users/badrelfarri/Documents/Code/RevVue/simple-async-openai-assistant/.venv/lib/python3.11/site-packages/openai/_base_client.py", line 1610, in _request
      raise APIConnectionError(request=request) from err
    openai.APIConnectionError: Connection error.

Code snippets

main.py


from fastapi import FastAPI
import uvicorn
from openai import AsyncOpenAI

apiKey = ""
app = FastAPI()

@app.get("/test")
async def test():
    systemMessage = "Get a title for the conversation, the title shall not have more than 4 words"
    messages = [
        {"role": "system", "content": systemMessage},
        {
            "role": "user",
            "content": "Is the earth flat",
        },
    ]
    result = await AsyncOpenAI(api_key=apiKey, timeout=30).chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
    )
    title = result.choices[0].message.content
    return title

if __name__ == '__main__':
    uvicorn.run(app, host="0.0.0.0", port=50051)

OS

macOS

Python version

Python 3.11.4

Library version

openai v1.54.4

BadrElfarri avatar Dec 07 '24 12:12 BadrElfarri

Thanks for the report, does it work if you move the client instantiation outside of the request handler?

openai_client = AsyncOpenAI(api_key=apiKey, timeout=30)

@app.get("/test")
async def test():
  ...

RobertCraigie avatar Dec 07 '24 15:12 RobertCraigie

Hey, thanks for your quick reply. Unfortunately it does not work either when moving the client outside.

app = FastAPI()
client = AsyncOpenAI(api_key=apiKey, timeout=30)

@app.get("/test")
async def test():
    systemMessage = "Get a title for the conversation, the title shall not have more than 4 words"
    messages = [
        {"role": "system", "content": systemMessage},
        {
            "role": "user",
            "content": "Is the earth flat",
        },
    ]
    result = await client.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
    )
    title = result.choices[0].message.content
    return title

I also tried creating the client in the lifespan instead and using it directly from the app object. Same issue.

from contextlib import asynccontextmanager
from typing import AsyncGenerator, Any

from fastapi import FastAPI
import uvicorn
from openai import AsyncOpenAI

apiKey = ""

@asynccontextmanager
async def lifespan(app: FastAPI) -> AsyncGenerator[None, Any]:
    app.client_openAI = AsyncOpenAI(api_key=apiKey, timeout=30)
    try:
        yield
    finally:
        # Clean up resources if necessary
        pass

app = FastAPI(lifespan=lifespan)

@app.get("/test")
async def test():
    systemMessage = "Get a title for the conversation, the title shall not have more than 4 words"
    messages = [
        {"role": "system", "content": systemMessage},
        {
            "role": "user",
            "content": "Is the earth flat",
        },
    ]
    result = await app.client_openAI.chat.completions.create(
        model="gpt-4o-mini",
        messages=messages,
    )
    title = result.choices[0].message.content
    return title

if __name__ == '__main__':
    uvicorn.run(app, host="0.0.0.0", port=50051)

When removing uvloop from the dependencies, everything works. I'm not sure if uvloop leads to some compatibility issue with httpx.
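
A minimal script like the following (a hypothetical diagnostic sketch, not from this thread) could help isolate whether the failure comes from uvloop + httpx rather than from this library. It makes a plain httpx request against the OpenAI host under uvloop, with no FastAPI or openai involved:

# Hypothetical diagnostic sketch: plain httpx request under uvloop,
# independent of FastAPI and the openai package.
import asyncio

import httpx
import uvloop

async def main() -> None:
    # No API key needed just to confirm the TCP/TLS connection succeeds;
    # a 401 response here would still mean the connection itself worked.
    async with httpx.AsyncClient(timeout=30) as client:
        resp = await client.get("https://api.openai.com/v1/models")
        print(resp.status_code)

if __name__ == "__main__":
    uvloop.install()  # use uvloop's event loop policy, as uvicorn's "standard" extras do
    asyncio.run(main())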

BadrElfarri avatar Dec 07 '24 16:12 BadrElfarri

Hi there! I noticed this issue and thought I’d share a few links that discuss similar challenges. They might offer some helpful insights or potential solutions:

Programmer-RD-AI avatar Jan 18 '25 04:01 Programmer-RD-AI